From Apple to Microsoft, many of the world’s biggest technology companies are incorporating facial recognition systems into their devices.
But a new study by researchers from MIT has found that several of these systems show bias along lines of race and gender.
In their study, called the Gender Shades project, the researchers looked at the accuracy of facial recognition systems from IBM, Microsoft and Face++.
Over 1,200 images of people from three African countries and three European countries were chosen, and the subjects were grouped by gender and skin type.
A spokesperson for Gender Shades said: “While the companies appear to have relatively high accuracy overall, there are notable differences in the error rates between different groups.”
Results showed that all three companies’ facial recognition systems performed better on males than females, and were more accurate when analysing lighter-skinned rather than darker-skinned faces.
The Gender Shades spokesperson added: “When we analyse the results by intersectional subgroups – darker males, darker females, lighter males, lighter females – we see that all companies perform worst on darker females.”
In particular, IBM’s system was found to have the largest gap in accuracy, with a difference of 34.4 percentage points in error rate between lighter males and darker females.
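The comparison the researchers describe boils down to measuring each subgroup’s error rate separately and taking the difference. The sketch below is purely illustrative, not the study’s actual code or data: the subgroup labels and the sample numbers are invented to mirror the kind of gap reported.

```python
# Illustrative only: per-subgroup error rates for a gender classifier,
# in the spirit of the Gender Shades methodology. All data here is made up.
from collections import defaultdict

def subgroup_error_rates(records):
    """records: iterable of (subgroup, predicted_gender, true_gender) tuples."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for subgroup, predicted, actual in records:
        totals[subgroup] += 1
        if predicted != actual:
            errors[subgroup] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical predictions for two of the four intersectional subgroups.
sample = (
    [("lighter_male", "male", "male")] * 99
    + [("lighter_male", "female", "male")] * 1
    + [("darker_female", "female", "female")] * 65
    + [("darker_female", "male", "female")] * 35
)
rates = subgroup_error_rates(sample)
gap = rates["darker_female"] - rates["lighter_male"]  # 0.35 - 0.01 = 0.34
```

With these invented numbers, the gap works out to 34 percentage points, comparable in size to the 34.4-point difference the study reported for IBM’s system.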
IBM has since responded to the study and says it is making changes to its system.