This report explores Algorithmic Affect Management, where biometric data is used to measure workers’ emotions, behaviour and activity and to inform management decisions.
What’s this?
Digital profiling using algorithms and datasets is becoming more and more pervasive in our lives. Increasingly, technology is being used to monitor people’s behaviour and activity in the workplace.
This report explores Algorithmic Affect Management (AAM), which is when workplaces use biometric data to measure workers’ emotions and inform management decisions. It’s a new frontier in workplace surveillance, with significant implications for workers’ rights and wellbeing.
Key findings
AAM has a wide range of applications
- It can be used to monitor worker communication, productivity, location, tiredness levels and emotion.
- Current examples include wristbands to track muscle fatigue, earbuds to measure stress levels and wearables that prompt workers to record how they’re feeling throughout the day.
- AAM can have a number of benefits. For example, measuring fatigue can help reduce workplace accidents.
- In another example, AAM helped one company understand the links between employees’ social behaviours at work and their productivity, leading the company to invest in workplace furniture that encouraged these behaviours.
But there are problems with measuring emotions in this way
- AAM raises ethical and social questions about measuring workers’ wellbeing.
- Emotional AI algorithms can be trained on potentially biased datasets, reinforcing cultural, gendered and racial stereotypes.
- Accurately measuring emotions is also difficult – not least because everyone’s physical responses to emotions are different, and people often experience emotions without displaying them physically.
The technology can have a negative impact on workers
- Tracking technologies are often rolled out to solve issues like stress and anxiety – but can end up making them worse.
- Surveys indicate workers perceive AAM as negatively impacting their health, safety and wellbeing.
Workers often lack agency
- AAM and new technologies are often rolled out without consulting employees, and with workers given little information about how the technology will be used.
- Often there isn’t a clear way for workers to give feedback on the technology.
- There is also a risk that companies won’t be transparent about what data they collect, and why.
Recommendations
- There is a need for sharper definitions, thresholds and regulatory guidance.
- Data protection at work should be clarified and extended to clearly cover AAM.
- Employment protections against surveillance and fire-and-rehire practices should be extended.
- Programmes to promote AAM literacy for workers, unions and managers should be developed and incorporated into AI literacy work.
- More resources should be provided to regulators to develop guidance.