In the last few months, millions of people around the world stopped going into offices and started doing their jobs from home. These workers may be out of sight of managers, but they are not out of mind. The upheaval has been accompanied by a reported spike in the use of surveillance software that lets employers track what their employees are doing and how long they spend doing it.
Companies have asked remote workers to install a whole range of such tools. Hubstaff is software that records users’ keyboard strokes, mouse movements, and the websites that they visit. Time Doctor goes further, taking videos of users’ screens. It can also take a picture via webcam every 10 minutes to check that employees are at their computer. And Isaak, a tool made by UK firm Status Today, monitors interactions between employees to identify who collaborates more, combining this data with information from personnel files to identify individuals who are “change-makers.”
Now, one firm wants to take things even further. It is developing machine-learning software to measure how quickly employees complete different tasks and suggest ways to speed them up. The tool also gives each person a productivity score, which managers can use to identify those employees who are most worth retaining—and those who are not.
How you feel about this will depend on how you view the covenant between employer and employee. Is it okay to be spied on by people because they pay you? Do you owe it to your employer to be as productive as possible, above all else?
Critics argue that workplace surveillance undermines trust and damages morale. Workers’ rights groups say that such systems should only be installed after consulting employees. “It can create a massive power imbalance between workers and the management,” says Cori Crider, a UK-based lawyer and cofounder of Foxglove, a nonprofit legal firm that works to stop governments and big companies from misusing technology. “And the workers have less ability to hold management to account.”
Whatever your views, this kind of software is here to stay—in part because remote work is normalizing it. “I think workplace monitoring is going to become mainstream,” says Tommy Weir, CEO of Enaible, the Boston-based startup developing the new monitoring software. “In the next six to 12 months it will become so pervasive it disappears.”
Weir thinks most tools on the market don’t go far enough. “Imagine you’re managing somebody and you could stand and watch them all day long, and give them recommendations on how to do their job better,” says Weir. “That’s what we’re trying to do. That’s what we’ve built.”
Weir founded Enaible in 2018 after coaching CEOs for 20 years. The firm already provides its software to several large organizations around the world, including the Dubai customs agency and Omnicom Media Group, a multinational marketing and corporate communications company. But Weir claims also to be in late-stage talks with Delta Air Lines and CVS Health, a US health-care and pharmacy chain ranked #5 on the Fortune 500 list. Neither company would comment on whether or when it was preparing to deploy the system.
Weir says he has been getting four times as many inquiries since the pandemic closed down offices. “I’ve never seen anything like it,” he says.
Why the sudden uptick in interest? “Bosses have been seeking to wring every last drop of productivity and labor out of their workers since before computers,” says Crider. “But the granularity of the surveillance now available is like nothing we’ve ever seen.”
It’s no surprise that this level of detail is attractive to employers, especially those looking to keep tabs on a newly remote workforce. But Enaible’s software, which it calls the AI Productivity Platform, goes beyond tracking things like email, Slack, Zoom, or web searches. None of that shows a full picture of what a worker is doing, says Weir—it’s just checking if you are working or not.
Once set up, the software runs in the background all the time, monitoring whatever data trail a company can provide for each of its employees. Using an algorithm called Trigger-Task-Time, the system learns the typical workflow for different workers: what triggers, such as an email or a phone call, lead to what tasks and how long those tasks take to complete.
Once it has learned a typical pattern of behavior for an employee, the software gives that person a “productivity score” between 0 and 100. The AI is agnostic to tasks, says Weir. In theory, workers across a company can still be compared by their scores even if they do different jobs. A productivity score also reflects how your work increases or decreases the productivity of other people on your team. There are obvious limitations to this approach. The system works best with employees who do a lot of repetitive tasks in places like call centers or customer service departments rather than those in more complex or creative roles.
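Enaible has not published the details of Trigger-Task-Time, but the idea described above—learn the typical trigger-to-task completion time, then score a worker against that baseline—can be sketched roughly as follows. All of the data, names, and the scoring formula here are invented for illustration; this is a minimal sketch of the concept, not the company’s actual method:

```python
from statistics import mean

# Hypothetical event log: (trigger, task, seconds to complete).
# Every entry is invented for illustration.
events = [
    ("email", "answer_ticket", 300),
    ("email", "answer_ticket", 420),
    ("phone", "log_call", 120),
    ("phone", "log_call", 180),
]

def learn_baselines(log):
    """Learn the average completion time for each (trigger, task) pair."""
    grouped = {}
    for trigger, task, seconds in log:
        grouped.setdefault((trigger, task), []).append(seconds)
    return {key: mean(times) for key, times in grouped.items()}

def productivity_score(baseline, observed):
    """Map an observed completion time to a 0-100 score relative to the
    learned baseline: 50 at baseline speed, 100 at twice the baseline
    speed, clamped at the ends. (Formula is an assumption.)"""
    ratio = baseline / observed  # >1 means faster than typical
    return max(0, min(100, round(50 * ratio)))

baselines = learn_baselines(events)
# A worker who answers a ticket in 180 s against a 360 s baseline
# scores the maximum of 100.
score = productivity_score(baselines[("email", "answer_ticket")], 180)
```

Scoring against a per-task learned baseline, rather than raw output, is what would make such a score nominally “agnostic to tasks”: each worker is compared with the typical pace for their own kind of work, which is how scores across different jobs could be placed on the same 0–100 scale.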
But the idea is that managers can use these scores to see how their employees are getting on, rewarding them if they get quicker at doing their job or checking in with them if performance slips. To help them, Enaible’s software also includes an algorithm called Leadership Recommender, which identifies specific points in an employee’s workflow that could be made more efficient.
For some tasks, that might mean cutting the human out of the loop entirely and automating the work. In one example, the tool suggested that automating a 40-second quality-checking task that customer service workers performed 186,000 times a year would save them 5,200 hours. That would free employees to focus on more valuable work and improve customer-service response times, Weir suggests.
Business as usual
But talk of cost cutting and time saving has long been doublespeak for laying off staff. As the economy slumps, Enaible is promoting its software as a way for companies to identify the employees who must be retained—“those that are making a big difference in fulfilling company objectives and driving profits”—and keep them motivated and focused as they work from home.
The flipside, of course, is that the software can also be used by managers to choose whom to fire. “Companies will lay people off—they always have,” says Weir. “But you can be objective in how you do that, or subjective.”
Crider sees it differently. “The thing that’s so insidious about these systems is that there’s a veneer of objectivity about them,” she says. “It’s a number, it’s on a computer—how could there be anything suspect? But you don’t have to scratch the surface very hard to see that behind the vast majority of these systems are values about what is to be prioritized.”
Machine-learning algorithms also encode the hidden biases in the data they are trained on, and such bias is even harder to expose when it’s buried inside an automated system. If these algorithms are used to assess an employee’s performance, an unfair review or dismissal can be hard to appeal.
In a pitch deck, Enaible claims that the Dubai customs agency is now rolling out its software across the whole organization, with the goal of $75 million in “payroll savings” over the coming two years. “We’ve essentially decoupled our growth rate from our payroll,” the agency’s director general is quoted as saying. Omnicom Media Group is also happy with how Enaible helps it get more out of its employees. “Our global team needs tools that can move the needle when it comes to building our internal capacity without adding to our head count,” says CEO Nadim Samara. In other words, squeezing more out of existing employees.
Crider insists there are better ways to encourage people to work. “What you’re seeing is an effort to turn a human into a machine before the machine replaces them,” she says. “You’ve got to create an environment in which people feel trusted to do their job. You don’t get that by surveilling them.”