In 1965, when Fairchild Semiconductor’s Gordon Moore predicted that the number of transistors on a computer chip would double every year, the most advanced chips had around 60 components. In 1975, Moore, who cofounded Intel in 1968, reconsidered his prediction and revised the rate of doubling to roughly every two years. So far, history has proved him more or less right. But growth may soon slow as engineers find it harder to contend with the heat produced and power consumed by transistor-crammed chips (see “Parallel Universe”).
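The two doubling rates above diverge quickly. As a rough illustration (the function name, baseline, and rounding here are our own, not from the article), projecting from the 1965 baseline of about 60 components:

```python
# Illustrative sketch of exponential growth under Moore's prediction.
# Baseline: ~60 components in 1965, per the article; all else is assumption.

def projected_transistors(year, base_year=1965, base_count=60, doubling_years=2):
    """Project transistor count assuming a doubling every `doubling_years`."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Original 1965 rate (doubling every year) by 1975:
print(round(projected_transistors(1975, doubling_years=1)))  # 61440

# Revised 1975 rate (doubling every two years) by 1975:
print(round(projected_transistors(1975)))  # 1920
```

A decade at the original yearly rate yields roughly 32 times more transistors than the revised two-year rate, which is why Moore's 1975 correction mattered.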
Geoffrey Hinton tells us why he’s now scared of the tech he helped build
“I have suddenly switched my views on whether these things are going to be more intelligent than us.”
ChatGPT is going to change education, not destroy it
The narrative around cheating students doesn’t tell the whole story. Meet the teachers who think generative AI could actually make learning better.
Meet the people who use Notion to plan their whole lives
The workplace tool’s appeal extends far beyond organizing work projects. Many users find it’s just as useful for managing their free time.
Learning to code isn’t enough
Historically, learn-to-code efforts have provided opportunities for the few, but new efforts are aiming to be inclusive.