
A study by MIT shows that Amazon's facial recognition software appears to be biased against women with dark skin tones. The software also had more difficulty distinguishing between men and women than comparable techniques from Microsoft and IBM.

This is the conclusion of a study published by MIT last Thursday. Amazon's Rekognition software misclassified women as men in 19 percent of cases. Moreover, women with dark skin were wrongly classified as men in 31 percent of cases. By way of comparison, Microsoft's software was wrong in the latter situation only 1.5 percent of the time.

Different analyses

Matt Wood, General Manager of Artificial Intelligence at Amazon Web Services, has stated that the results of the study are based on facial analysis, not facial recognition. The distinction matters: according to Wood, facial analysis finds faces in videos or images and, on that basis, flags features such as whether someone is wearing glasses. Facial recognition, by contrast, tries to match an individual's face in videos and photos. The Rekognition technique contains both functionalities.

It is not possible to draw conclusions about the accuracy of facial recognition for a particular use case based on results obtained with facial analysis. According to Wood, the study also did not use the latest version of Rekognition. Amazon says it carried out a similar test with Rekognition and comparable data, and found no such errors.

Skepticism

The authors of the study say they do understand the difference between facial analysis and facial recognition. "In our paper we also make it clear that we have chosen to evaluate the extent to which the model understands what it sees," they state.

In any case, the authors argue that people should maintain a certain skepticism towards companies claiming to have developed completely accurate systems.
