In March 2012, MIT launched 6.002x, a free online version of its introductory course in circuits and electronics. It was the first massive open online course (MOOC) offered by MITx, and the inaugural offering from edX, the online-learning partnership founded in May 2012 by MIT and Harvard University. The course sparked worldwide interest and generated a large amount of data.
Almost 155,000 people registered for the course; throughout the semester, users interacted with the online platform nearly a quarter-billion times, clicking and scrolling through lecture videos, tutorials, and discussion threads.
Researchers from MIT and Harvard are now analyzing students’ clickstreams (recordings of where and when users click on a page), forum comments, and homework, lab, and exam scores. They hope to use the data to better understand online learners—what their demographic characteristics are, how they use online resources, what factors encourage them to stick with an online course, and what helps or hinders their performance.
In a paper published in Research & Practice in Assessment, the team reports preliminary results from its analysis of 6.002x data on users’ characteristics and study habits. The team includes lead author Lori Breslow, director of MIT’s Teaching and Learning Laboratory, and physics professor David Pritchard, who heads MIT’s Research in Learning, Assessing and Tutoring Effectively (RELATE) group.
Throughout the course’s first semester, 24 computer servers recorded more than 230 million user interactions, including 12,000 discussion threads and almost 100,000 individual posts. These interactions generated “a small-town library’s worth of data,” according to Pritchard—110 gigabytes.
The team mined data on demographics and found that students logged in from 194 countries, led by the United States (26,333 registrants), India (13,044), the United Kingdom (8,430), Colombia (5,900), and Spain (3,684). Only 622 individuals logged in from China, a far lower number than the researchers had expected.
Clickstream data showed that when completing homework assignments, users spent more time on video lectures than they did on any other resource. However, during an exam, students referred most to the online textbook, which they virtually ignored when doing homework. The data, although preliminary, illustrate the different online strategies students may use to solve homework problems and exam problems.
Peer interaction seems to improve a student’s chances of success. While the researchers found no correlation between achievement and age or gender, they found that students who reported working with another student on a problem offline gained almost three points in their overall score at the end of the course compared with students who worked alone.
“We can study things like how much of a textbook they read, and what they said to their peers, which we can’t study on campus,” Pritchard says. “We can see everything the students do. And that’s unprecedented.”