
Teachers in Denmark are using apps to audit their students’ moods

Companies say the software can help improve well-being, but some experts worry it could have the opposite effect.


In a Copenhagen suburb, a fifth-grade classroom is having its weekly cake-eating session, a common tradition in Danish public schools. While the children are eating chocolate cake, the teacher pulls up an infographic on a whiteboard: a bar chart generated by a digital platform that collects data on how they’ve been feeling. Organized to display the classroom’s weekly “mood landscape,” the data shows that the class averaged a mood of 4.4 out of 5, and the children rated their family life highly. “That’s great!” the teacher exclaims, raising two thumbs up in the air. 

She then moves to an infographic on sleep hygiene. Here the data shows the students struggling, and the teacher invites them to think of ways to improve their sleeping habits. After briefly talking among themselves, the children suggest “less screen time at night,” “meditation before sleep,” and “having a hot bath.” They collectively make a commitment to implement these strategies. At next week’s cake time, they will be asked whether or not they followed through. 

These sorts of data-driven well-being audits are becoming more and more common in Denmark’s classrooms. The country has long been a leader in online services and infrastructure, ranking as the most digitally developed nation in the UN’s e-government survey. In recent years its schools, too, have received big investments in this type of technology: it is estimated that the Danish government allocated $4 to $8 million, a fourth of the high school budget for teaching aids, to procuring digital platforms in 2018. In 2021, it invested some $7 million more.

These investments are rooted in a Nordic tradition of education that centers the child’s experience and encourages interactive learning; some Scandinavian education researchers think technology can help draw children in as playful, active participants. “Technology is an extended pencil and drawing pad. It’s a tool that is bound to the child’s opportunity to express themselves,” Mari-Ann Letnes, an education scientist in Norway, said in a 2018 interview. In a 2019 status report on the use of technology in schools, the Danish Ministry of Education stated that “creativity and self-expression with digital technologies are a part of building students’ motivation and versatile development.” Now, some teachers and administrators are hoping technology can be used to tackle mental health as well.

Danish schoolchildren are in the midst of a mental-health crisis that one of the country’s biggest political parties has called a challenge “equal to inflation, the environmental crisis, and national security.” No one knows why, but in just a few decades, the number of Danish children and youth with depression has more than sextupled. One-quarter of ninth graders report that they have attempted self-harm. (The problem isn’t exclusive to Denmark: depressive episodes among US teens increased by some 60% between 2007 and 2017, and teen suicide rates have also leaped by around 60% over the same period.) A recent open letter signed by more than 1,000 Danish school psychologists expressed “serious concerns” over the mental state of the children they see in their work and warned that if action isn’t taken immediately, they “see no hope for turning the negative trend around.” 

To help address the problem, some Danish schools are turning to platforms like Woof, the one used in the fifth-grade classroom. Built by a Denmark-based startup, it frequently surveys schoolchildren on a variety of well-being indicators and uses an algorithm to suggest particular issues for the class to focus on.

These platforms are quickly gaining ground. Woof, for example, has been implemented in classrooms in more than 600 schools across Denmark, with more on the way. Its founders believe Woof fills an important niche: they say teachers have expressed widespread dissatisfaction with existing tools, in particular a government-run well-being survey. That survey audits schools once a year and delivers results on a delay; it might provide a snapshot for policymakers but is hardly useful for teachers, who need regular feedback to adjust their work.

“There is simply a need for tools to check in [with the children] where you don’t need to be active,” says Mathias Probst, a cofounder of Woof. “Where you don’t need to talk to all 24 children before starting a class, because before you know it, 15 minutes of class time has already passed.” And teachers could benefit, he suggests, from “something that can bring a data structure into all of this.” 

Woof is not alone in its attempt to quantify children’s moods. A handful of other platforms have been adopted by Danish schools, and schools in Finland and the UK are using mood-monitoring software as well. In the US, the tech can extend beyond collecting self-reports to hunting for hints of concerning behavior by surveilling students’ emails, chat messages, and searches on school-issued devices. 

A number of people say mood-monitoring tech has great potential. “We can use digital tools to evaluate well-being on a 24-hour basis. How is the sleep? How is the physical activity, how is the interaction with others? ... How does [the child’s] screen time compare to physical time? That’s central to understanding what well-being actually is,” the late Carsten Obel, who was a professor of public health at Aarhus University and a leader in the development of another student-surveying tool called Moods, said in a 2019 video.

But some experts are heavily skeptical of the approach. They say there is little evidence that quantification of this sort can be used to solve social problems, and that fostering a habit of self-surveillance from an early age could fundamentally alter children’s relationship to themselves and each other in a way that makes them feel worse rather than better. “We can hardly go to a restaurant or to the theater without being asked how we feel about it afterwards and ticking boxes here and there,” says Karen Vallgårda, an associate professor at the University of Copenhagen who studies family and childhood history. “There is a quantification of emotions and experiences that is growing, and it’s important that we ask ourselves whether that’s the ideal approach when it comes to children’s well-being.”

Others are asking how much children and their parents actually know about what data is being collected—and how it is being used. While some platforms say they are collecting minimal or no personally identifiable data, others mine deep into individual children’s mental states, physical activity, and even friend groups. 

“Their practice is very Silicon Valley–like. They preach data transparency but have none themselves,” says Jesper Balslev, a research consultant at the Copenhagen School of Design and Technology, of some of these platforms. Balslev says he is concerned that Woof and other platforms are being swiftly and naively rolled out without adequate regulation, testing, or efforts to make sure that the school culture allows children to abstain from participating in them. “Our regulatory technologies to deal with this are terrible,” he says. It’s possible that will change, he adds, “but right now, all the hobs are turned on at the same time.” 


Woof is run from a basement office on the outskirts of Copenhagen, with a small team of three full-time staffers. The founders, Mathias Probst and Amalie Danckert, got the idea for the company after working as public school teachers through Teach First Denmark, an organization similar to Teach for America in the United States. 

When Probst and Danckert entered the public school system, they say, they quickly realized that schools in low-income neighborhoods face a vicious cycle. Difficult circumstances at home can make students in these schools more challenging to teach. Staff turnover rates are high because of stress and burnout, with some teachers keen to switch to “easier” schools. Parents with resources often take their children elsewhere, so kids with more problems make up an even greater proportion of those who remain, exacerbating the stress teachers face and the likelihood that they’ll leave. All this compounds the well-being crisis that children are experiencing elsewhere. 

“I saw so many children ending up in difficult situations, which could have been prevented if action had been taken earlier,” says Danckert, who before her stint as a teacher worked as an analyst in the children and youth section of Copenhagen’s Social Services Administration.

Danckert and Probst, who has a background in consulting, set out to build a way to help schools manage such situations before they spiral into serious mental-health problems—problems that schools’ thinly stretched counseling systems may not catch until it’s too late.

Woof, the solution they devised, is a web app that children can access on computers or phones (a 2019 study found that 98% of Danish children between 10 and 15 have access to a smartphone). Its user interface primarily features a cartoon dog, which asks the children various questions about their life. The tool is designed to be used on a weekly basis, generating a “mood landscape” for the class by prompting kids to rate their mood and other aspects of their lives on a 1–5 scale. The result is supposed to add up to a comprehensive image of child welfare in that classroom over time. 

Teachers and administrative staff can read weekly reports on a class’s overall self-reported mood and how factors like their sleep hygiene, social activity, academic performance, and physical activity affect that mood. Classrooms are profiled, and interventions are recommended to improve the scores in categories where they are doing less well. Finally, the teacher and the children look at the data together and help each other with tools and strategies to improve these sticking points.

“It’s worrying that there is so much personally attributable data on platforms working with children.”

Mathias Probst, a cofounder of Woof

Woof’s data is anonymized; the app reports on classroom averages instead of individual children. Danckert says that’s because the company was unwilling to walk right up to the edge of what was legally and ethically feasible under data privacy laws. Probst also describes feeling uneasy that collecting data on individual children might create a narrative and lock them into it, rather than helping them break negative patterns. “It’s worrying that there is so much personally attributable data on platforms working with children,” he says.

The startup fully launched Woof less than a year ago, in the fall of 2022. According to beta test data collected at 30 schools before its full launch, 80% of classes that use Woof see mood improve by, on average, 0.35 points on the 1–5 scale within one month. Woof maintains that the platform isn’t meant to replace teacher-student contact; rather, it should be understood as a support tool for teachers that provides structured action plans and feedback.


But some experts have doubts about whether Woof’s methods are effective. They are particularly skeptical about the self-reported nature of the platform’s data. 

According to Balslev, education apps have not been shown to perform any better than analog interventions, such as a teacher advising children to turn off their computers or simply asking them how they slept last night. He points to historical lessons, such as a 2015 OECD study that found digitalization in schools across a variety of countries had exacerbated a range of the problems it was supposed to solve, with a net negative effect on learning outcomes.

“We intuitively trust data or the quantitative regime more than we trust humans,” he says. “I have found no, or very few, studies that examine the use of ed tech in controlled environments.” 

And there is good reason to treat self-reported well-being data with caution: children may not be providing honest information. Balslev claims that when technology is introduced into a social context, it can’t be assumed that students will demonstrate ideal behavior and cooperate with its intentions. For example, in interviews he has done with high school students, he says they have reported gaming digital systems to do things like get more time for an assignment or make a writing exercise look longer than it actually is.

Though dishonest answers are of course possible, Probst and Danckert argue that Woof’s anonymous approach makes authentic responses more likely than they might be otherwise. “Many students from low-­income areas are very aware of whether they are anonymous or not. And they are very aware of what is disclosed about their family life,” says Danckert. “The students don’t want to talk about what is happening at home, because they are worried that it will start a case [with a social services agency],” Probst adds. He and Danckert believe that the anonymous approach builds trust and promotes honest disclosure, as students can be sure that it won’t trigger the teacher’s legal obligation to report red flags further up in the system. 


Woof isn’t the only well-being platform making inroads in Danish schools. Platforms like Bloomsights, Moods, and Klassetrivsel (Danish for “classroom well-being”) are also getting traction. Each takes a more data-intensive and less anonymous approach than Woof, tracking and identifying schoolchildren individually. Bloomsights and Klassetrivsel even go as far as generating “sociograms”—network diagrams that display the children’s relationships with each other in detail.

Bloomsights turns self-reported data from the same individuals over time into indicators including “signs of loneliness,” “academic mindset,” and “signs of bullying.” Bloomsights is also used in the US, where some school districts are including it as part of an “early warning system” for identifying potential school shooters. 

The company’s US operations are based in Colorado. Cofounder Adam Rockenbach says the hope in bringing Bloomsights to the US was to spread the Scandinavian values of well-being and community. He asserts that the app is not meant to be a dystopian “Big Brother” but an extension of what teachers already do. 

“You notice the student is coming into class, and maybe they’re coming to class late more frequently than before, and they look a little disheveled,” he says. “A good teacher is going to go find two or three minutes to connect with that student: ‘Hey, it seems like there’s something off here. Is there any way I can help you?’”

Citing his experiences as a teacher in inner-city schools in Los Angeles for six years, Rockenbach says it can be a challenge to know what is really going on with children who struggle in an environment that might be marked by gang violence and poverty. He says Bloomsights can help in situations where the signals are not so clear.

Rockenbach believes that anonymous data only makes early intervention more difficult, since it creates more work for teachers and educators in trying to identify who has problems and needs help. For this reason, he thinks collecting individual data is a necessity.

The program, which operates through a web app, takes self-reporting measurements similar to Woof’s: monthly surveys of students, measuring various indicators of mental and physical well-being and students’ evaluation of their learning environment. 

But Bloomsights stands out in its use of sociograms, which are constructed from the students’ reports of who their friends are and who they connect and spend time with.

Rockenbach says these sociograms are crucial tools to detect social isolation and might even help identify children who are vulnerable to bullying. He points to testimonial reports from schools as an indicator that the platform helps improve well-being. But, he adds, “we haven’t conducted a full-on research project that might compare, for example, a school that uses Bloomsights versus a school that doesn’t. That’s something that we’re looking to do.”

Indeed, some teachers wonder how useful—or even ethical—the app is. “It’s some very intimate things that are asked, and they [the children] don’t necessarily know who is going to see it,” says Naya Marie Nord, a teacher at a suburban Copenhagen school that uses Bloomsights. “Of course, I as a teacher should have insight into how my students are feeling. But that’s something that I prefer to have conveyed in the confidentiality between me and the student, rather than it being told to a computer.” Nord is concerned about how many teachers who don’t work directly with the children still have access to their data. She believes the app straddles ethical boundaries given how much it impinges on students’ private lives. 

“They have no chance of understanding what is going on. It’s not like we give them a long presentation explaining how it’s used and who has access [to the data],” Nord says. “And if we did, we would get no honest answers. If they actually understood the amount of data I can see about them and how many others can see it as well, I believe they would answer differently.”

According to the data policies of Klassetrivsel, one of the platforms that collect non-anonymized data, consent is not required from either parents or children before the app is used in the classroom. The company claims that since the app is an integrated tool used for “well-being purposes” at a public institution, it falls under a Danish legal clause that exempts public authorities from requirements about obtaining consent for data collection. And since the platforms aren’t classified as “information society services” like Facebook or Google, there is no parental consent required under the General Data Protection Regulation, the European Union’s sweeping data privacy law. 

Legal precedents seem to back up Klassetrivsel’s claims about how the data law applies to its work. In 2019, a parent submitted a complaint to the Danish Data Protection Agency, claiming that a data-driven well-being platform at her child’s school was engaging in forced monitoring of the child. The parent further argued that “measuring and monitoring well-being is not the same as improving well-being.” The agency ruled in favor of the school’s municipality: the app was deemed a tool for maintaining tasks of “crucial social interest” that fall under the responsibility of schools.

“Usually, the legal authority that these third-party apps operate under is that they are offering a service on behalf of the public authorities,” says Allan Frank, an IT lawyer at the agency. But they must still store data correctly and not collect more than is necessary. They must also operate under the aegis of governmental authorization, he says: “If there is a random teacher or a school that has been convinced to suddenly set it up without the supervision of the municipality or the Ministry of Education, then that would be a problem.”

In Denmark, parents can opt out if they don’t want data collected on their children through these apps. According to Bloomsights, this is also the case in the US: although practices vary, Rockenbach says that parents typically sign a paper once a year that lists all the different services the school uses. 

But because the apps are used in an educational context and are framed as altruistic, both parents and policymakers tend to have their guard down. “There are a lot of other apps where I limit my son’s use, but I’m not concerned about apps used in the school the same way I am about TikTok and YouTube, for example,” says Janni Hindborg Christiansen, mother of one of the children in the fifth-grade classroom that uses Woof. “At least Woof is used in a controlled environment and has a good purpose. I trust it more than so many other apps that I’d be more critical toward.”

And for parents who don’t want their children using such platforms, opting out is not always straightforward. 

Henriette Viskum, the teacher of the fifth-grade class, describes Woof lessons as a part of her class’s core programming, just like math, and says parents need to talk with the teacher to pull their child out of the program. “If it’s a huge problem, we’ll find a solution and then the child doesn’t have to participate,” Viskum says. “But then I would, as a teacher, put a big question mark around why the parents are so strongly opposed to working with well-being. I would be a bit concerned and curious about that.” 

The closeness between teachers and students can also make the degree of anonymity blurry. Viskum told me that if almost an entire class reports high scores on family life, for example, but one child does not, she can usually intuit who that person is and might casually try to take steps to help.


For Balslev, the embrace of slick data-driven solutions is due partly to their political appeal. In Denmark, technology tends to be presented as the solution to everything connected to teaching and education. The simple infographics that ed-tech companies offer, he says, have an allure for government officials faced with thorny social and pedagogical issues.

“What is fantastic about the digital [initiatives] is that they are good at making politicians look actionable—as if they have made some decisions,” Balslev says. 

But efficacy is not as much of a priority, he says: “It’s quick and easy to produce some metrics that appear rhetorically convincing. The infographic might provide a very thin sliver of truth about reality, but it doesn’t touch the core of the situation.”

“The infographic might provide a very thin sliver of truth about reality, but it doesn’t touch the core of the situation.”

Jesper Balslev, research consultant at the Copenhagen School of Design and Technology

In fact, the technology risks actually making the situation worse, says Karen Vallgårda, the University of Copenhagen researcher. She is concerned that the “surveillance paradigm” could have unintended consequences for children’s self-understanding. 

“If we are asked to monitor ourselves according to a quantitative logic, emotions such as indignation and sorrow can appear as problematic emotional reactions, despite the fact that they are completely natural in certain scenarios of life. The children can feel that what they are feeling is wrong or undesirable, which is likely to propel greater well-being issues rather than ameliorating them,” Vallgårda says.

“When we instill a measure of self-surveillance with children based on a clearly communicated ideal of how to structure one’s everyday life, one’s eating habits, and how to feel in certain contexts, there is a risk that children develop ‘double unhappiness’ due to not just being unhappy but also failing to live up to these ideals.”

Vallgårda’s concerns are echoed by other researchers, who argue that an excessive focus on whether children are happy can cause them to pathologize normal fluctuations in life. New studies also indicate that declining well-being is largely attributable to environmental and social pressures rather than individual factors.

Vallgårda believes that rather than pouring resources into tools that further a quantitative agenda, schools should instead be prioritizing efforts to hire and train professionals like teachers and school psychologists.

But digital platforms are significantly cheaper than hiring or training more people. Viskum, the fifth-grade teacher, points out that budgets are tight and waiting lists for appointments with the school psychologist are miles long. Given the material reality, the appeal of ed tech is understandable, even when there are few results to back it up. 

While the quantification of children’s lives might make academics balk, the children I met told me that they enjoyed using Woof and especially liked how the app helped them talk more nicely to each other. At a school I visited in a low-income neighborhood (the class scored 3.4 on the mood scale), a teacher said she was just happy to have a tool that might give her a general idea of what was going on with the children.

When I asked Woof’s Probst about Vallgårda’s criticisms, he said that unlike researchers studying children academically, those who work with children every day in the classroom can’t afford to think in abstract terms. 

“It’s all well and good to be a theorist and have the opinion that you shouldn’t be doing certain things, but there is also a reality out there in the classrooms,” he says. “There is a practical situation where teachers face children who are struggling so much that they break down in tears during class. You have to do something there.” 

Arian Khameneh is a freelance journalist based in Copenhagen.
