Over the decades, as computers have become less scarce, they have been put to less and less valuable use. (I confess I have caught myself adding three numbers in Excel.) Computers were once used only for the most complex and important tasks, and the hurdle for getting access to one was high. Some Technology Review readers can remember the days of punch cards: the careful preparation, the waiting, and the cost of making a mistake. I’m of a later generation, but I grew up under Communism, which wasn’t known for its abundance of computing power. I have vivid memories of participating in secondary-school programming competitions in the mid-1980s in Bulgaria, where there weren’t enough computers in any given school district to pair machines with students. On the morning of a competition, students would study one or more problems, develop algorithms to solve them, write code on paper, and then painstakingly trace the code by hand, recording variable values for sample data. During a lunch break, judges would pore over the mostly incomprehensible algorithms and try to figure out which kids had a chance of getting a program to work. In the afternoon, a select few would then be chosen to use the few available computers.
As this example demonstrates, highly motivated people in computing-starved communities make great use of the first few machines they get. But as more computers arrive, the value per computer goes down significantly, because the know-how needed to put them to good use is scarce. In Bulgarian schools, for example, I saw classrooms where only half the machines were functioning; where teachers didn’t know, and didn’t want to learn, a thing about computers; where basic educational materials for computer use were lacking. At that time, some Bulgarian educators argued that computers were never going to become useful tools for students. Of course, they were wrong. When the resources became available for more than a couple of hours per person per day, there was a big (and, at the time, surprising) jump in productivity driven not only by the increased presence of computers but also by the changed nature of the interaction between humans and computers. When people could access computers frequently and predictably, they were willing to invest in learning what to do with them, whether it was touch-typing or using software. Today, with the development of the Internet, that kind of jump would be even greater.
I expect these lessons to be “generalizable” to other underserved communities, as well as to philanthropic initiatives such as Nicholas Negroponte’s One Laptop per Child (see “Philanthropy’s New Prototype”). The initial focus should be on introducing just a few machines and keeping them in working order. Then, as more computers are brought in, equal effort should be made to train educators and students, as well as to manage the naysayers who’d rather see the money spent elsewhere. When there are enough resources that people can reliably depend on computers in schools, Internet cafés, and homes, the true value of access to computing will become apparent.
Simeon Simeonov is a technology partner at Polaris Venture Partners, a venture capital firm based in Waltham, MA.