The UK authority claims that such technologies could discriminate against individuals.

The Information Commissioner’s Office (ICO) has issued a warning urging organisations to assess the public risks of emotion analysis technologies. The office says organisations need to carry out a thorough risk assessment before implementing these systems.

Organisations that do not act responsibly could pose risks to vulnerable people, the ICO says. Any UK organisation that fails to meet its expectations may be investigated.

Acknowledging inherent bias in algorithms

Emotion analysis technologies process data such as gaze tracking, sentiment analysis, facial movements and expressions, gait analysis, heartbeats and even skin moisture, the warning explains.

Such technologies could, for example, be used to monitor the physical health of workers by offering wearable screening tools. They could also be used to assess visual and behavioural methods including body positioning, speech, eyes and head movements to register students for exams.

Biometric data is highly personal

Emotion analysis relies on collecting, storing and processing a range of personal data, which can even include subconscious behavioural or emotional responses. This kind of data use is riskier than that of traditional biometric technologies used to verify or identify a person, the ICO says.

Because underdeveloped algorithms cannot reliably detect emotional cues, there is a risk of systemic bias, inaccuracy and even discrimination. Stephen Bonner, Deputy Commissioner of the ICO, said: “Developments in the biometrics and emotion AI market are immature.”

“While there are opportunities present, the risks are currently greater”, Bonner explained. “At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination.”

The ICO is due to issue biometric guidance in spring 2023. That guidance will aim to “further empower and help businesses, as well as highlight the importance of data security”.

The warning also includes a serious admonition. “Biometric data is unique to an individual and is difficult or impossible to change should it ever be lost, stolen or inappropriately used”, the ICO stated.

Tip: EU data protection authority calls for stronger AI regulations