[On 5G] This is the great thing about the decentralized, permissionless innovation of the internet—telcos don’t need to decide in advance what the use cases are, any more than Intel had to decide what the use cases for faster CPUs would be.
Without deep understanding of the basic tools needed to build and train new algorithms, he says, researchers creating AIs resort to hearsay, like medieval alchemists. “People gravitate around cargo-cult practices,” relying on “folklore and magic spells,” adds François Chollet, a computer scientist at Google in Mountain View, California.
If you wrap your main content – that is, the stuff that isn’t navigation, logo and main header etc – in a <main> tag, a screen reader user can jump immediately to it using a keyboard shortcut. Imagine how useful that is – they don’t have to listen to all the content before it, or tab through it to get to the main meat of your page.
for those open source companies that still harbor magical beliefs, let me put this to you as directly as possible: cloud services providers are emphatically not going to license your proprietary software. I mean, you knew that, right? The whole premise with your proprietary license is that you are finding that there is no way to compete with the operational dominance of the cloud services providers; did you really believe that those same dominant cloud services providers can’t simply reimplement your LDAP integration or whatever? The cloud services providers are currently reproprietarizing all of computing — they are making their own CPUs for crying out loud! — reimplementing the bits of your software that they need in the name of the service that their customers want (and will pay for!) won’t even move the needle in terms of their effort.
npm users have downloaded more than 489 billion packages in the 9 year life of the project, with the strange effect of exponential growth being that 286 billion, or 58% of those, were just in the last year.
The nature of NPM is such that I’d expect most large corporate Node software to depend on at least a couple of single individuals’ hobby projects. The problem is that those projects don’t tend to fulfill the same expectations of security, quality and maintenance.
Whether you like it or not, whether you approve it or not, people outside of your design team are making significant design choices that affect your customers in important ways. They are designing your product. They are designers.
The premise of “The Good Place” is absurdly high concept. It sounds less like the basis of a prime-time sitcom than an experimental puppet show conducted, without a permit, on the woodsy edge of a large public park.
If you stop thinking in terms of MVC you might notice that at its core, React is a runtime for effectful functions that don’t execute “once”, but run continuously while being anchored to a call tree.
Every bitcoin proof-of-work mined is an incremental addition to a vast distributed summoning ritual powering the demon-soul at the heart of the maze, the computational equivalent of a Buddhist prayer wheel spinning in a Himalayan breeze.
Among other things at Netflix, the Mantis Query Language (MQL, an SQL for streaming data), which ferries around approximately 2 trillion events every day for operational analysis (SPS alerting, quality-of-experience metrics, debugging production, etc.), is written entirely in Clojure.
The ASGI specification provides an opportunity for Python to hit a productivity/performance sweet-spot for a wide range of use-cases, from writing high-volume proxy servers through to bringing large-scale web applications to market at speed.
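As a point of reference (not from the quote itself), the interface ASGI defines is tiny: an application is just an async callable that takes a connection scope plus receive and send channels. A minimal sketch, which should run under an ASGI server such as uvicorn (`uvicorn app:app`):

```python
# app.py - a minimal ASGI application: an async callable taking the
# connection scope plus receive/send channels. Only plain HTTP scopes
# are handled in this sketch.
async def app(scope, receive, send):
    assert scope["type"] == "http"

    # Start the HTTP response, then send the body as a single event.
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({
        "type": "http.response.body",
        "body": b"Hello from ASGI",
    })
```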
Relational databases are a commodity now, but they power a much larger fraction of the world’s economy than AI ever will. And no company has a “relational database strategy”.
When you’re pumping messages around the Internet between heterogeneous codebases built by people who don’t know each other, shit is gonna happen. That’s the whole basis of the Web: You can safely ignore an HTTP header or HTML tag you don’t understand, and nothing breaks. It’s great because it allows people to just try stuff out, and the useful stuff catches on while the bad ideas don’t break anything.
In too many organizations, deploy code is a technical backwater, an accumulation of crufty scripts and glue code, forked gems and interns’ earnest attempts to hack up Capistrano. It usually gives off a strong whiff of “sloppily evolved from many 2 am patches with no code review”. This is insane. Deploy software is the most important software you have. Treat it that way: recruit an owner, allocate real time for development and testing, bake in metrics and track them over time.
Most administrators will force users to change their password at regular intervals, typically every 30, 60 or 90 days. This imposes burdens on the user (who is likely to choose new passwords that are only minor variations of the old) and carries no real benefits as stolen passwords are generally exploited immediately. [...] Regular password changing harms rather than improves security, so avoid placing this burden on users. However, users must change their passwords on indication or suspicion of compromise.
In case you missed it: @GoogleColab can open any @ProjectJupyter notebook directly from @github! To run the notebook, just replace “github.com” with “colab.research.google.com/github/” in the notebook URL, and it will be loaded into Colab.
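Purely for illustration, the URL rewrite described above as a tiny Python helper; the owner/repo/notebook names are made up:

```python
def colab_url(github_notebook_url: str) -> str:
    """Rewrite a GitHub .ipynb URL so it opens directly in Google Colab."""
    return github_notebook_url.replace(
        "github.com/", "colab.research.google.com/github/", 1
    )

print(colab_url("https://github.com/owner/repo/blob/main/notebook.ipynb"))
# -> https://colab.research.google.com/github/owner/repo/blob/main/notebook.ipynb
```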
How about if, instead of ditching Twitter for Mastodon, we all start blogging and subscribing to each other’s Atom feeds again instead? The original distributed social network could still work pretty well if we actually start using it.
Every day more than 1 trillion events are written into a streaming ingestion pipeline, which is processed and written to a 100PB cloud-native data warehouse. And every day, our users run more than 150,000 jobs against this data, spanning everything from reporting and analysis to machine learning and recommendation algorithms.
With a sufficient number of users of an API, it does not matter what you promise in the contract: all observable behaviors of your system will be depended on by somebody.
Easy explainer: a “blockchain” is a linked list with an append-only restriction, and appending is made incredibly expensive but super parallelizable, so when things work well a big group of people can work together and it’s too expensive for a small evil group to compete. [...] Does your problem benefit from storing information in an append-only list, and relying on a central authority to manage it is so bad that it’s worth paying the enormous append costs to have a bunch of Chinese servers manage it for you? Then *maybe* look at a blockchain.
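A toy sketch of that description, purely illustrative and nothing more: an append-only list in which each block links to the hash of the previous block, and appending requires grinding a nonce (the deliberately expensive part):

```python
import hashlib
import json
import time

def hash_block(block: dict) -> str:
    """SHA-256 of the block's canonical JSON serialisation."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str, difficulty: int = 4) -> dict:
    """Append a block that links to the previous block's hash, grinding a
    nonce until the hash has `difficulty` leading zeros (toy proof-of-work)."""
    prev_hash = hash_block(chain[-1]) if chain else "0" * 64
    block = {"prev": prev_hash, "data": data, "time": time.time(), "nonce": 0}
    while not hash_block(block).startswith("0" * difficulty):
        block["nonce"] += 1  # the "incredibly expensive" appending
    chain.append(block)
    return block

chain: list = []
append_block(chain, "genesis")
append_block(chain, "hello")
# Tampering with an earlier block changes its hash,
# which breaks every later block's "prev" link.
```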
Interviewing a developer whose first language wasn’t English and he kept calling legacy code “legendary code” and now that’s all I want to write.
Over the last twenty years, publishing systems for content on [BBC] News pages have come and gone, having been replaced or made obsolete. Although newer content is published through dynamic web applications that can be readily modified, what lies beneath this sometimes resembles layers of sedimentary rock.
Our provisioning tools for developer environments broke and no one knew how to fix them, so we reassigned new hires the zombie VMs of recently departed coworkers.
Raccoons don’t think ahead very much, so raccoons don’t have very good impulse control. I don’t think the raccoon realized when it started climbing what it was in for.
One of the ways the internet has changed around us over the years is that the blog-o-sphere of MetaFilter’s early years has all but disappeared, and so has the kind of link-sharing culture that went with it.
Open Source gives engineers the power to collaborate across legal entities (companies) without involving bizdev. The benefits of this workaround are extraordinary and underappreciated.
At Harvard we’ve built out an infrastructure to allow us to deploy JupyterHub to courses with authentication managed by Canvas. It has allowed us to easily deploy complex set-ups to students so they can do really cool stuff without having to spend hours walking them through setup. Instructors are writing their lectures as IPython notebooks, and distributing them to students, who then work through them in their JupyterHub environment. Our most ambitious so far has been setting up each student in the course with a p2.xlarge machine with cuda and TensorFlow so they could do deep learning work for their final projects. We supported 15 courses last year, and got deployment time for an implementation down to only 2-3 hours.
Half of the time when companies say they need “AI” what they really need is a SELECT clause with GROUP BY.
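To make that concrete, a hypothetical example of such “AI” using nothing but SQLite and a GROUP BY; the table and data are invented for illustration:

```python
import sqlite3

# Hypothetical schema and data, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 20.0), ("bob", 5.0), ("alice", 7.5), ("carol", 12.0)],
)

# The "AI": total spend per customer, highest first.
for customer, total in conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY 2 DESC"
):
    print(customer, total)
```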