Eight years after a controversy over Black people being mislabeled as gorillas by image-analysis software – and despite great advances in computer vision – the technology giants still fear repeating the mistake.
When Google released its standalone Photos app in May 2015, people were wowed by what it could do: analyze images to label the people, places and things in them, an astounding consumer offering at the time. But a few months after the release, a software developer, Jacky Alciné, discovered that Google had labeled photos of him and a friend, who are both Black, as "gorillas," a label that is particularly offensive because it echoes centuries of racist tropes.
In the ensuing controversy, Google prevented its software from categorizing anything in photos as gorillas, and it promised to fix the problem. Eight years later, with significant advances in artificial intelligence, we tested whether Google had resolved the issue, and we looked at comparable tools from its competitors: Apple, Amazon and Microsoft.
There was one member of the primate family that Google and Apple were able to recognize – lemurs, long-tailed animals that share opposable thumbs with humans but are more distantly related than apes are.
Google's and Apple's tools were clearly the most sophisticated when it came to image analysis.
Yet Google, whose Android software underpins most of the world's smartphones, has made the decision to turn off the ability to visually search for primates for fear of making an offensive mistake and labeling a person as an animal. And Apple, whose technology performed similarly to Google's in our test, appeared to disable the ability to search for monkeys and apes as well.
Consumers may not need to perform such a search often – though in 2019, an iPhone user complained on Apple's customer support forum that the software "can't find monkeys in photos on my device." But the issue raises larger questions about other unfixed, or unfixable, flaws lurking in services that rely on computer vision – a technology that interprets visual images – as well as in other products powered by AI.