‘Dystopian’ worker surveillance techniques likely to disproportionately affect young people, says IPPR

27 Mar 2023 10:24 AM

Worker surveillance practices are increasingly the new normal, despite their negative and discriminatory impact on employees, according to a new report from IPPR. 

Technologies such as webcam, movement and email monitoring exploded in popularity during the pandemic and now appear to be here to stay, yet regulation has not kept pace with this new reality, the report finds.

Workers in non-unionised, ‘low autonomy’ and low-skilled jobs are more likely to be surveilled at work, IPPR argues, and people aged 16 to 29 are the most likely to be in such jobs. 

In the private sector, this means that women are at higher risk of worker surveillance, while among ethnic groups, black people are at the greatest risk.  

Being surveilled at work has significant negative consequences for employees. One worker told IPPR that even going to the toilet made them “feel like someone’s watching you”, and said that, as a result, “I don’t stand up and I just stay in my seat the whole time, and you’re just really paranoid.” 

A union representative said that surveillance was used selectively “as a form of retribution” to intimidate and discipline staff. This was confirmed by a worker who said their manager would often threaten: “We’ll get the cameras on you.” 

However, there are also negative consequences for employers, including increased staff turnover, a greater likelihood of worker sabotage, and an over-emphasis on ‘measurable activity’ rather than genuine outcomes.

The pandemic saw a huge increase in worker surveillance, and this has shown little sign of abating. Previous research shows that the number of online searches for ‘how to monitor employees working at home’ is 383 per cent higher than before the pandemic, while searches for ‘best employee monitoring software’ are up 201 per cent.

Common types of surveillance include: 

Companies often use AI to automatically analyse data from worker surveillance, but this can lead to unfair decisions due to algorithmic bias. The complex algorithms used to make these decisions are not transparent and may benefit certain groups over others, the report points out. These algorithms may be used to support assigning tasks, creating work schedules, or determining how pay and promotions are awarded.

IPPR’s recommendations for policymakers, intended to prevent a permanent power shift from workers to employers and to reduce the risks of algorithmic bias, include:

Henry Parkes, senior economist at IPPR and the report’s author, said: 

“Dystopian worker surveillance techniques have exploded in popularity since the pandemic, becoming normalised and seeping into an increasing number of industries. However, regulation to safeguard employees has not kept up with the pace of this. 

“Young people, women and black workers are likely to be disproportionately affected negatively by worker surveillance and as it stands, the law is not keeping up with reality. This could have disastrous consequences for the mental and physical wellbeing of the workforce. The government must urgently review what is acceptable.”