Seven Must-Read Stories (Week Ending September 27, 2013)
A second chance to catch the most interesting and important articles of the past week on MIT Technology Review.
- Startup Shows Off Its Cheaper Grid Battery
  Sun Catalytix is making a new type of flow battery that could store hours’ worth of energy on the grid.
- Bruce Schneier: NSA Spying Is Making Us Less Safe
  The security researcher Bruce Schneier, who is now helping the Guardian newspaper review Snowden documents, suggests that more revelations are on the way.
- How Google Converted Language Translation Into a Problem of Vector Space Mathematics
  To translate one language into another, find the linear transformation that maps one to the other. Simple, say a team of Google engineers.
- World’s Largest Solar Thermal Power Plant Delivers Power for the First Time
  The $2.2 billion Ivanpah solar power plant can now generate electricity. But was it worth the money?
- The First Carbon Nanotube Computer
  A carbon nanotube computer processor is comparable to a chip from the early 1970s and may be the first step beyond silicon electronics.
- Facebook Launches Advanced AI Effort to Find Meaning in Your Posts
  A technique called deep learning could help Facebook understand its users and their data better.
- In Search of the Next Boom, Developers Cram Their Apps into Smart Watches
  Clever apps might persuade people that they need a wrist-worn computer.