Robots Make Computer Science Fun Again

Students who program robots are more likely to stick with their computer science curriculum.
The long-term trend for interest in computer science at the university level is relatively bleak. As the graph above makes apparent, interest has declined precipitously since the 2001 bursting of the dot-com bubble, leading to something of an existential crisis in the field of computer science instruction.

The latest survey on the subject, which charts 2007-08 data, showed a widely reported uptick in enrollment of 8 percent, which is impressive as a year-on-year change but obscures the long-term trend.

One has to wonder whether it's the very ubiquity of computers that has made them uninteresting to students. Note the spike of interest in the early '80s, when the advent of personal computers slaked a pent-up demand for access to the instruments everyone believed would define the future.

Robots, in contrast, are still rare in our everyday lives, and they're the furthest thing from remote and abstract. So goes the reasoning behind a new effort to get them into classrooms, described earlier this month in a paper by Tom Lauwers and Illah Nourbakhsh, in which they unveiled the Finch.

The Finch is cheap and simple, and it sidesteps the major complication most previous efforts to bring robots into computer science have run into: robots break, and debugging physical objects is a headache students don't need.

The results were profound: retention rates for the 2009 computer science classes in which the Finch was used (shown below, in red) increased by 25 percent.

And why not? The Finch sounds like exactly the kind of Maker project everyone’s inner geek cries out for:

The Finch can express motion through a differential drive system, light through a color-programmable LED, and sound through a beeper and using computer speakers. Similarly, it can sense light levels through two photoresistors, temperature through a thermistor, distance traveled through two wheel encoders, obstacles placed in front of it, and its orientation in three dimensional space through an accelerometer […] In addition to these hardware-based capabilities, the accompanying software allows students to easily have the Finch speak or play songs over computer speakers, read real-time data from internet RSS feeds, and react to video from computer webcams.
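
To give a flavor of what a first assignment along these lines might look like, here is a minimal Python sketch of an obstacle-avoidance loop: drive forward until something blocks the path, then stop and flash the LED. The finch module and the method names used below (obstacle_detected, set_led, set_wheel_speeds, close) are illustrative assumptions for the sake of the example, not the robot's actual API.

    import time
    from finch import Finch   # hypothetical package/class names, for illustration only

    robot = Finch()
    try:
        # Drive forward while the path is clear, showing a green LED.
        while not robot.obstacle_detected():   # assumed obstacle-sensor call
            robot.set_led(0, 255, 0)
            robot.set_wheel_speeds(0.5, 0.5)
            time.sleep(0.1)

        # Obstacle found: stop, then flash the LED red three times.
        robot.set_wheel_speeds(0.0, 0.0)
        for _ in range(3):
            robot.set_led(255, 0, 0)
            time.sleep(0.3)
            robot.set_led(0, 0, 0)
            time.sleep(0.3)
    finally:
        robot.close()                          # assumed disconnect/cleanup call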

Follow Christopher Mims on Twitter, or contact him via email.
