The Apple Card, launched in August, ran into major problems last week when users noticed that it seemed to offer smaller lines of credit to women than to men. The scandal spread on Twitter, with influential techies branding the Apple Card "fucking sexist," "beyond f'ed up," and so on. Even Apple's affable cofounder, Steve Wozniak, wondered, more politely, whether the card might harbor misogynistic tendencies. Before long, a Wall Street regulator joined the timeline of outrage, announcing that it would investigate how the card works to determine whether it breaks any financial rules.
Apple's response only added to the confusion and suspicion. Nobody at the company seemed able to describe how the algorithm even worked, let alone justify its output. While Goldman Sachs, the issuing bank for the Apple Card, insisted right away that there is no gender bias in the algorithm, it offered no proof. Then, finally, Goldman landed on what sounded like an ironclad defense: the algorithm, it said, had been vetted for potential bias by a third party; moreover, it doesn't even use gender as an input. How could the bank discriminate if no one ever tells it which customers are women and which are men?
This explanation is doubly misleading. For one thing, it is entirely possible for algorithms to discriminate by gender even when they are programmed to be "blind" to that variable. For another, imposing willful blindness to something as critical as gender only makes it harder for a company to detect, prevent, and reverse bias on exactly that variable.
The first point is the more obvious one. A gender-blind algorithm could still end up biased against women as long as it draws on any input or inputs that happen to correlate with gender. There is ample research showing how such "proxies" can lead to unwanted biases in different algorithms. Studies have shown, for example, that creditworthiness can be predicted by something as simple as whether you use a Mac or a PC. Other variables, such as home address, can serve as a proxy for race. Similarly, where a person shops might well overlap with information about their gender. Weapons of Math Destruction, a book by Cathy O'Neil, a former Wall Street quant, describes many situations where proxies have helped create horribly biased and unjust automated systems, not just in finance but also in education, criminal justice, and health care.
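To make the proxy problem concrete, here is a minimal, invented sketch in Python. It has nothing to do with the actual Apple Card or Goldman Sachs model; every feature name and number is made up. A credit-limit model trained only on income and one feature that happens to correlate with gender ends up reproducing a gender gap it was never explicitly shown.

```python
# Toy illustration (not the Apple Card model): a "gender-blind" credit-limit
# model can still treat women and men differently when one of its inputs
# correlates with gender. All variables and numbers here are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 10_000

# Ground truth the model never sees: 0 = man, 1 = woman.
gender = rng.integers(0, 2, size=n)

# A hypothetical proxy feature (say, a retail-category shopping score)
# that happens to correlate with gender.
proxy = 0.8 * gender + rng.normal(0, 0.3, size=n)
income = rng.normal(60_000, 15_000, size=n)

# Historical credit limits that, perhaps through past human decisions,
# were lower for the group with higher proxy values.
past_limit = 0.15 * income - 4_000 * proxy + rng.normal(0, 1_000, size=n)

# Train a model that never sees gender: only income and the proxy.
X = np.column_stack([income, proxy])
model = LinearRegression().fit(X, past_limit)
pred = model.predict(X)

print("mean predicted limit, men:  ", round(float(pred[gender == 0].mean())))
print("mean predicted limit, women:", round(float(pred[gender == 1].mean())))
# The gap persists even though gender was never an input: the model
# effectively reconstructs it from the correlated proxy.
```

The point is not that any one feature is illegitimate on its own, but that correlations in the training data can smuggle the protected attribute back in.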
The notion that removing an input eliminates bias is "a very common and dangerous misconception," says Rachel Thomas, a professor at the University of San Francisco and cofounder of fast.ai, a project that teaches people about machine learning.
The problem will only become a bigger headache for consumer companies as they grow more reliant on algorithms to make critical decisions about customers, and as the public grows more wary of the practice. We've seen Amazon pull an algorithm used in hiring due to gender bias, Google criticized for a racist autocomplete, and both IBM and Microsoft embarrassed by facial-recognition algorithms that turned out to be better at recognizing men than women, and white people than people of other races.
This means algorithms need to be carefully audited to make sure bias hasn't somehow crept in. Yes, Goldman said it did exactly that in last week's statement. But the very fact that customers' gender is not collected would make such an audit less effective. According to Thomas, companies actually need to "actively measure protected attributes like gender and race" to be sure their algorithms are not biased on them.
The Brookings Institution published a useful report in May on detecting and mitigating algorithmic bias. It recommends examining the data fed to an algorithm as well as its output to check whether it treats, say, women differently from men on average, or whether there are different error rates for men and women.
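In the spirit of that recommendation, the mechanics of such an output check are simple enough to sketch. The snippet below uses entirely placeholder data to compare average approval rates and error rates for men and women; a real audit would be far more involved, and, as Thomas notes, it requires the protected attributes to have been collected in the first place.

```python
# Minimal sketch of the kind of output audit the Brookings report describes:
# compare a model's decisions and error rates across groups. The arrays below
# are placeholders; in practice they would come from the model's predictions
# and from separately collected protected attributes.
import numpy as np

def audit_by_group(predicted, actual, group, labels=("men", "women")):
    """Report the average outcome and error rate for each group."""
    for g, name in enumerate(labels):
        mask = group == g
        approval_rate = predicted[mask].mean()
        error_rate = (predicted[mask] != actual[mask]).mean()
        print(f"{name:6s} approval rate: {approval_rate:.2f}  "
              f"error rate: {error_rate:.2f}")

# Made-up example data: 1 = approved, 0 = declined.
rng = np.random.default_rng(1)
group = rng.integers(0, 2, size=5_000)      # 0 = man, 1 = woman
actual = rng.integers(0, 2, size=5_000)     # "true" creditworthiness label
predicted = np.where(group == 1,            # a deliberately skewed model
                     rng.random(5_000) < 0.4,
                     rng.random(5_000) < 0.6).astype(int)

audit_by_group(predicted, actual, group)
```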