Educational Technology Faces a Pivotal Privacy Moment

Student performance data can hold valuable insights for educators developing personalized learning programs, but some highly restrictive laws threaten to deprive students of the potential benefits.
July 27, 2015

Mark Pickard’s in-box is full of pitches from software companies. “I bet I get two or three e-mails a day from somebody trying to get us to use their new platform or whatever it is they have,” says Pickard, an eighth-grade science teacher who works in Malden, Missouri.

Though Pickard’s been getting such solicitations for years, there’s something different about the recent batch. They still promise personalized experiences for students based on information a company collects, for example by recognizing specific problem areas or particular learning styles and then creating learning plans tailored to individual profiles. What’s changed is how the companies treat that data once they have it: whereas a few years ago it was obvious that many of them planned to sell the data, now most make absolutely clear that they will not.

The reason can be found in a recent explosion in state legislation regulating the use of student data and safeguarding its privacy and security. Responding to growing parental fears that hackers will steal their children’s personally identifiable information, or that companies will sell such data or use it to target advertising to kids, legislators in 30 states have passed laws dealing with the issue since the beginning of 2014. They either spell out procedures for collecting, storing, and using student data or prohibit the gathering of certain types of sensitive data, like information related to health, religion, or political affiliations.

Still, parents remain worried, and their concerns are not unfounded. Last year Google admitted to scanning the e-mail of students using its Apps for Education software, gathering data that could have been used to target ads at those students. (The company said in a subsequent blog post that it had discontinued the practice.)

Online service providers like Google are not explicitly regulated under the 40-year-old Family Educational Rights and Privacy Act, which safeguards the privacy of student records. Many argue that the law should be updated to reflect the new class of educational software makers vying for a piece of the estimated $8 billion market for such products. But although President Obama has called student data privacy a priority, federal progress has been slow, and states are filling the void.

Without some clear guidelines, the risk is that more education technology providers could go the way of InBloom. A nonprofit data management and storage company launched in 2013, InBloom closed its doors last year under pressure from fearful parents after activist groups cast the company, which had been backed by the Gates Foundation, as a shady group looking to profit from student data.

Defenders say InBloom was not doing that at all. But the backlash against the company is seen as one of the major factors setting off the flurry of legislative activity by the states. California led the way: last fall it enacted a law that clearly restricts companies from selling student data or using it for targeted advertising.

The danger is that parents’ fears, which often stem from a lack of information about how and why student data is collected and used, could lead to restrictive policies that stifle innovation, says Rob Curtin, a Microsoft veteran who is now the chief privacy officer at the Boston-based startup Pip Learning Technologies. Curtin has a reason to care: his company is building a service that would securely and privately connect educational institutions and those who would like access to student data, including parents and technology companies.

“There are a ton of positive outcomes that can come from sharing data,” says Curtin. Insights drawn from data like multi-year sets of student assessments and tests, for example, could be used to help educators tailor instruction to individual students, he says, while overly restrictive policies could prevent that by closing off opportunities to share such data. Curtin also sees value in enabling parents, who spend billions every year on supplementary education, to securely share their child’s school data with outside specialists to create tailored instruction. Most of all, he says, parents need to know what is happening to the data and how it is being used.

“There is a right way and a wrong way to do this,” says Curtin. “And if we follow the rules, we can move data around, and there are really good reasons for doing that.”
