Even a good teacher may not always be able to tell, at a glance, which students are quietly struggling and which need more of a challenge. Fortunately, laptops may soon come with enough emotional intelligence built in to do the job for them.
A recent study from North Carolina State University shows how this might work. Researchers there used video cameras to monitor the faces of college students participating in computer tutoring sessions. Using software that had been trained to match facial expressions with different levels of engagement or frustration, the researchers were able to recognize when students were experiencing difficulty and when they were finding the work too easy.
The project suggests a way for technology to help teachers keep track of students’ performance in real time. Perhaps it could even help massive open online courses (MOOCs), which can involve many thousands of students working remotely, become more attuned to students’ needs (see “The Crisis in Higher Education”).
It also hints at what could prove to be a broader revolution in the application of emotion-sensing technology. Computers and other devices that identify and respond to emotion—a field of research known as “affective computing”—are starting to emerge from academia. They sense emotion in various ways; some measure skin conductance, while others assess voice tone or facial expressions (see “Wearable Sensor Knows What Overwhelms You” and “Technology that Knows When to Hand You a Hankie”).
The NC State researchers’ experiment involved students who were using software called JavaTutor to learn to write Java code. The researchers analyzed 60 hours of video footage with the Computer Expression Recognition Toolbox, a program that recognizes facial expressions. They compared the software’s conclusions with the students’ own reported state of mind and found that they matched closely. The work will be presented Saturday at the Sixth International Conference on Educational Data Mining.
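The study itself doesn’t publish code, but the validation step described above — checking the software’s labels against students’ self-reports — amounts to a simple agreement measure. A minimal sketch, with made-up labels and data for illustration only:

```python
# Hypothetical sketch of the validation step: compare the expression
# classifier's per-session label with the student's own self-report.
# The label names and session data below are illustrative, not from the study.

sessions = [
    # (software's label, student's self-reported state)
    ("frustrated", "frustrated"),
    ("engaged", "engaged"),
    ("bored", "engaged"),
    ("frustrated", "frustrated"),
    ("engaged", "engaged"),
]

def agreement_rate(pairs):
    """Fraction of sessions where the software's label matches the self-report."""
    matches = sum(1 for predicted, reported in pairs if predicted == reported)
    return matches / len(pairs)

rate = agreement_rate(sessions)
print(f"Agreement: {rate:.0%}")  # 4 of 5 toy sessions match -> 80%
```

In practice the researchers worked frame by frame over 60 hours of video, but the core question is the same: how often does the machine’s reading of a face line up with what the student says they felt?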
The ultimate goal is a tutoring system that can help students who are having difficulty, “bolster their confidence and keep them motivated,” says Joseph Grafsgaard, a PhD student at NC State who coauthored a paper on the work.
A number of different researchers are exploring applications for affective computing in education. Jacob Whitehill, a software engineer and research scientist at Emotient, a startup exploring commercial uses of affective computing, recently coauthored an as-yet-unpublished paper showing that facial expressions identified by a computer could predict test performance.
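Whitehill’s unpublished result — that computer-identified facial expressions can predict test performance — is, at its simplest, a regression problem. A toy sketch of that idea, using a single invented feature (fraction of a session showing a “confusion” expression) and made-up scores; the paper’s actual features and model are not described here:

```python
# Hedged illustration: predict a test score from one hypothetical
# facial-expression feature via ordinary least squares. All numbers
# below are invented for the sake of the example.

def fit_line(xs, ys):
    """Least-squares fit for a single predictor: returns (slope, intercept)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Hypothetical data: fraction of session showing "confusion" vs. test score.
confusion = [0.1, 0.2, 0.4, 0.5, 0.7]
scores = [90, 85, 70, 65, 50]

slope, intercept = fit_line(confusion, scores)
# More detected confusion predicts a lower score (negative slope).
predicted = slope * 0.3 + intercept  # predicted score for a new student
print(round(predicted, 1))
```

The hard part, as Whitehill notes, isn’t fitting such a model — it’s knowing what a teacher or platform should actually do with the prediction.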
“There is an emerging agreement that facial-expression recognition can play a constructive role in teaching,” Whitehill says. But he notes that feedback about students’ emotional state can impair teachers’ performance if they don’t know how or when to respond. “It’s a hard problem to know how to use these [emotion] sensors effectively,” he says.
Whitehill does, however, believe the technology could prove useful for online learning platforms. “Udacity and Coursera have on the order of a million students, and I imagine some fraction of them could be persuaded to turn their webcams on,” he says. “I think you would learn a lot about what parts of a lecture are working and what parts are not, and where students are getting confused.”