Physicists speak of dark energy, the label applied to the expansive oomph permeating the universe. The Internet has its own dark energy: the legions of nerds who code for fun, challenge, and uncertain profit. They do not make a business plan or solicit lawyers and VCs before jumping in, and they have no particular political or economic power. Yet they are the ones who developed the Internet in a backwater and declined to patent its protocols. They are the ones who took the hobbyist platforms of the first PCs and turned them into powerhouses that, together with the Internet, gave us one pleasant surprise after another: the electronic spreadsheet, instant messaging, Internet telephony, Wikipedia. But two problems threaten the Web’s dark energy.
First, the trust in reasonable behavior embedded within our open, generative networks and utterly reprogrammable PCs–for example, consider that neither network participants nor software authors are accredited or, for the most part, identified–is too readily abused. People find their connections disrupted and their PCs turned into zombies, and they seek security. Millions of PCs, especially in corporate and school environments, are then locked down.
To deal with this problem, technologists need to develop better code to help us deal with bad apples while preserving an open environment. If even a small, broadly distributed fraction of Internet users agreed to pass along their PCs' anonymized vital signs and lists of running processes, we could learn how new code affects those PCs' performance. We'd also get a sense of how trustworthy new code is, partly on the basis of how long it's been around and who's actually using it. This could help identify annoying applications that fall short of being outright viruses, such as screen savers that generate pop-up ads. Such strategies could also help detect Internet filtering around the world.
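To picture how such a pool of vital signs might work, here is a minimal sketch. The data, field names, and scoring weights are all invented for illustration; a real system would face hard questions about anonymization and sampling that this toy deliberately ignores. The idea is simply that a program running mostly on healthy machines, for a long time, earns more trust than a newcomer that correlates with trouble.

```python
from collections import defaultdict
from datetime import date

# Hypothetical anonymized reports: each participating PC shares the
# names of its running processes and whether the machine is healthy.
reports = [
    {"processes": {"editor.exe", "popup_saver.scr"}, "healthy": False},
    {"processes": {"editor.exe"}, "healthy": True},
    {"processes": {"editor.exe", "popup_saver.scr"}, "healthy": False},
    {"processes": {"editor.exe", "browser.exe"}, "healthy": True},
]

# First-seen dates for each program, as the pool would accumulate them.
first_seen = {
    "editor.exe": date(2005, 1, 1),
    "browser.exe": date(2006, 3, 1),
    "popup_saver.scr": date(2008, 5, 1),
}

def trust_scores(reports, first_seen, today=date(2008, 6, 1)):
    """Score each program by how often it appears on healthy machines
    and by how long it has been observed in the wild."""
    seen = defaultdict(int)
    healthy = defaultdict(int)
    for r in reports:
        for p in r["processes"]:
            seen[p] += 1
            if r["healthy"]:
                healthy[p] += 1
    scores = {}
    for p, n in seen.items():
        health_rate = healthy[p] / n
        age_days = (today - first_seen[p]).days
        # Blend health correlation with longevity (capped at one year);
        # the 0.7/0.3 weights are arbitrary placeholders.
        scores[p] = 0.7 * health_rate + 0.3 * min(age_days / 365, 1.0)
    return scores

scores = trust_scores(reports, first_seen)
# popup_saver.scr appears only on unhealthy machines and is brand-new,
# so it scores far below the long-established editor.exe.
```

Nothing here identifies a user; the pool needs only aggregate counts. That is the appeal of the scheme: reputational signals emerge from breadth of participation rather than from any gatekeeper accrediting the code.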
The second threat is that consumers and developers are being charmed by new, managed technologies whose vendors assert control and promise new levels of reliability. We see the rise of the iPhone, with its walled-garden App Store, and a new generation of Web platforms like Facebook Platform and Google Apps–each of which naturally reserves the right to kill outside code. But once outside code can be effortlessly controlled, regulators can push vendors to do just that. Old-fashioned PC architecture meant that Bill Gates could not reasonably have been asked to reach out and kill, say, peer-to-peer software running on Windows PCs. And Net architecture famously makes censorship difficult (though by no means impossible). But the new platforms are not so naturally insulated. Thus Facebook and others can potentially be pressured to forbid a new round of disruptive but potentially useful applications.
Nerds writing what could be amazing code for new platforms need to push those platforms’ makers to yield some control. Apple’s, Facebook’s, and Google’s current business plans don’t (yet) depend on monopolizing all the outside apps that run on top of them. The right market forces can persuade them to help ensure that the emerging cool infrastructure will remain hospitable to dark energy for years to come.
Jonathan Zittrain is a professor of law at Harvard Law School and author of The Future of the Internet–And How to Stop It.