UZH News

The Prejudices

The Secret Life of Algorithms

Algorithms sort the internet for us. The problem is, no one knows exactly how they decide. Information scientist Anikó Hannák is trying to change this. What she’s finding out in the process offers food for thought.
Thomas Gull
“In the analog world, discrimination is forbidden. The same rules have to apply online,” says information scientist Anikó Hannák.

The World Wide Web promises great freedom, with virtually unlimited access to information around the globe. But as we surf our way through the net, apparently free and unimpeded, in search of products, job offers or vacation destinations, we’re steered by an invisible hand – or rather by many different invisible hands. These hands are algorithms, the computer programs on which online platforms such as Google, Facebook and Amazon are built. They probably know us better than we know ourselves. They know where we live, and what friends and preferences we have. On the basis of this knowledge they’re constantly making suggestions about what we should buy, where we should travel, and what friends we could link up with online.

As secret as the recipe for Appenzell cheese

This means that algorithms have an enormous influence on us as individuals and on society as a whole. The crazy thing, as information scientist Anikó Hannák says, is that “these programs are a black box.” She is investigating the secret lives of algorithms. “We don’t know how they function, and the companies don’t disclose this information.” The algorithms are part of the companies’ business model, and they guard their code as jealously as Appenzell cheesemakers guard the secret of the herbal brew that gives their product its unique flavor and aroma. For this reason, Hannák’s research involves spying on the systems to understand what the algorithms do and how they affect users. Isn’t that illegal? “That’s a complex question,” replies Hannák mischievously. “It’s against the terms of use, for sure.” But she’s adamant that researchers need insight into these data, because the decisions algorithms make often have far-reaching consequences that users aren’t even aware of. In the United States, her former research group has a lawsuit pending that seeks to achieve precisely that: giving researchers access to the code behind the algorithms. As long as the programs aren’t disclosed voluntarily, they have to be hacked. “We now have a very good idea of how to do that,” says Hannák.
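How do you spy on a black box from the outside? The article doesn’t spell out Hannák’s methods, but the basic idea of such an algorithm audit can be sketched in a few lines: send controlled, otherwise-identical queries to a platform and compare the rankings that come back. Everything in the sketch below (the endpoint, the response format, the simulated profiles) is a hypothetical placeholder, not any real platform’s API:

    # Minimal sketch of an external "algorithm audit": issue controlled,
    # otherwise-identical queries and compare the rankings returned for
    # different simulated user profiles. Endpoint, parameters and response
    # shape are hypothetical placeholders, not a real platform's API.
    import requests

    PROFILES = {
        "profile_a": {"Cookie": "session=aaa"},  # e.g. a fresh account
        "profile_b": {"Cookie": "session=bbb"},  # e.g. an account with history
    }

    def fetch_ranking(query, headers):
        """Request a results page and return the ordered result IDs."""
        resp = requests.get(
            "https://platform.example/search",  # hypothetical endpoint
            params={"q": query},
            headers=headers,
            timeout=10,
        )
        resp.raise_for_status()
        return [item["id"] for item in resp.json()["results"]]

    def compare(query):
        # Systematic differences between the orderings are evidence that
        # the ranking reacts to whatever trait was varied between profiles.
        for name, headers in PROFILES.items():
            print(name, fetch_ranking(query, headers)[:10])

    compare("piano movers")

Real audits are messier (noisy results, rate limits, blocked bots), and, as Hannák notes above, they run against the platforms’ terms of use.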

They know not what they do

Even crazier than the lack of transparency is the fact that most companies don’t themselves know what the algorithms they have written do or how they work. This is a consequence of the way algorithms are programmed to learn autonomously and draw their own conclusions from the behavior of users. A simple example: When I buy a book online, the algorithm will suggest other, similar books, for example by the same author or on the same topic. That’s fairly harmless. But algorithms don’t just suggest books. They might also suggest people we could hire to do work for us, or people matching the job profile of a recruiter looking for hires online. If in such cases the algorithm is in any way biased or discriminatory, this can have serious consequences for those affected, who might not get a contract or a job because the algorithm has suggested someone else.

Anikó Hannák is looking into whether algorithms really do have prejudices and biases, and what effects this has. Originally from Hungary, she has been an assistant professor of social computing at UZH since the beginning of the year. She already explored this question in her PhD thesis at David Lazer’s lab at Northeastern University in Boston. Hannák’s research involves investigating various US platforms such as TaskRabbit, Fiverr and Stack Overflow that place jobs, gigs and tasks. TaskRabbit, backed by furniture retailer IKEA, places people offering to do physical work, from moving and assembling furniture to shopping. Fiverr brokers intellectual work, such as writing copy, making videos or devising marketing strategies. Stack Overflow is the online platform for tech nerds, providing the answer to any programming problem; it has since also evolved into a job exchange popular with recruiters seeking talented programmers.
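To see how such self-learning suggestions work in miniature, consider the book example above. A toy item-to-item recommender might simply count which items are bought together and suggest the most frequent companions; the data and code below are purely illustrative, not any platform’s actual system:

    # Toy item-to-item recommender: suggest books that are often bought
    # together with the one just purchased. Purely illustrative data.
    from collections import Counter
    from itertools import combinations

    # Each inner list is one customer's purchase history (invented).
    baskets = [
        ["dune", "foundation", "hyperion"],
        ["dune", "foundation"],
        ["dune", "hyperion"],
        ["cookbook", "foundation"],
    ]

    # Count how often each pair of items appears in the same basket.
    co_counts = Counter()
    for basket in baskets:
        for a, b in combinations(sorted(set(basket)), 2):
            co_counts[(a, b)] += 1

    def recommend(item, k=3):
        """Return the k items most often co-purchased with `item`."""
        scores = Counter()
        for (a, b), n in co_counts.items():
            if a == item:
                scores[b] += n
            elif b == item:
                scores[a] += n
        return [other for other, _ in scores.most_common(k)]

    print(recommend("dune"))  # -> ['foundation', 'hyperion']

Real recommenders fold in far more signals (profile, location, click history), which is exactly why their behavior becomes hard even for their own makers to predict.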

Undiscriminating discrimination

Hannák has analyzed how these platforms function: Who do they come up with if I’m looking for someone to make a video, move my piano, or write a computer program for me? What she found out in the process gives pause: All three websites she investigated discriminate blithely, and always against the same people: women and black people. There are two explanations for this: The algorithms reflect the real-life situation; they’re the mirror in which we stare our prejudices in the face. And, even more problematic, they amplify these prejudices.

The first of the three websites Hannák examined is Stack Overflow. Programmers use the online platform to discuss problems that crop up in their work. “If I have a tricky problem, I can usually solve it within a few minutes thanks to Stack Overflow,” explains Hannák, herself a computer scientist. Women are rare in the programming world, and an online platform like Stack Overflow could be a good opportunity for a female programmer to showcase her skills. But certain mechanisms ensure that here too, men are more visible than their female colleagues. For example, the platform awards points for asking questions and for answering them. The thing is, says Hannák, questions are just as important as answers, because “practically all programmers’ questions have been answered, making it difficult to ask good questions.” Despite this, for a long time a question earned only five points while an answer earned ten. These points matter, because the more points you have, the higher up the rankings you appear, the more visible you are, and the more likely you are to be proposed when a recruiter is looking for suitable programmers.

The problem is that women ask more questions and men give more answers, so the difference in rewards put women at a disadvantage. “That wasn’t intentional,” says Hannák. Before she scrutinized the website, the operators weren’t even aware that there was a gender-specific difference in this respect. The system has since been changed, and questions now also earn ten points. It’s a minor success for Hannák, and an exception: In most cases, website operators, when informed that there are distortions disadvantaging certain groups, respond negatively or not at all.
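The arithmetic behind this imbalance is easy to reproduce. In the toy calculation below, the 5-point and 10-point weights are the old and new values the article describes; the activity figures for the two users are invented for illustration:

    # Toy illustration of how asymmetric point weights skew a reputation
    # ranking. The 5/10 and 10/10 weights are the old and new Stack
    # Overflow values mentioned in the article; activity counts invented.
    def reputation(questions, answers, q_points, a_points):
        return questions * q_points + answers * a_points

    # Two equally active users: one mostly asks, one mostly answers.
    asker    = {"questions": 8, "answers": 2}
    answerer = {"questions": 2, "answers": 8}

    for q_pts, a_pts, label in [(5, 10, "old scheme"), (10, 10, "new scheme")]:
        r_ask = reputation(**asker, q_points=q_pts, a_points=a_pts)
        r_ans = reputation(**answerer, q_points=q_pts, a_points=a_pts)
        print(f"{label}: asker={r_ask}, answerer={r_ans}")

    # old scheme: asker=60, answerer=90   -> the answerer ranks higher
    # new scheme: asker=100, answerer=100 -> equal visibility

Under the old weights, two equally active users end up with very different visibility purely because of the kind of contribution they make; equalizing the weights removes that gap.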

Women and black people disadvantaged

This also applies to the TaskRabbit and Fiverr platforms, where freelancers offer their services. In both cases, Hannák was able to show that women and black people are at a disadvantage, and that the platforms’ algorithms make this tendency even more pronounced. The main reason for the discrimination is the users themselves, whose biases and predilections are reflected in their choices: Offers from white men are more likely to be clicked than those from women and black people. Hannák also examined the written feedback. It emerges that on TaskRabbit, women get significantly fewer ratings than men, and black people receive worse ratings than white people. It’s similar on Fiverr, where black people received 32 percent fewer ratings, and these ratings were considerably less positive; Asian sellers, by contrast, got substantially better ratings than black people. “We found that feedback shows a clear negative tendency against black people,” comments Hannák.

The algorithms make these tendencies even stronger, because they “learn” from the feedback. In other words, they take account of how buyers rate sellers and which sellers they choose, and on the basis of this information they suggest sellers to new customers. Since white men do best, the algorithm is more likely to recommend them. Women and black people are thus at a disadvantage from the outset, regardless of the quality of their work. This creates a real vicious circle that is difficult to break out of.

Possible responses could be to require that the same number of men and women, and of black and white people, be proposed for each selection, or to give women and black people a bonus. “However, a system like that would definitely be controversial, because people might see it as reverse discrimination,” says Hannák. There are no easy solutions. “But to address the problem in the first place, you have to be aware of it,” she says. Unfortunately, experience shows that companies have little interest in doing so. “Many trim their platforms for profit, and fairness falls by the wayside,” she criticizes. Ensuring fairness is too expensive: For a start, online companies would have to keep track of what their algorithms were up to, and if it turned out they were discriminating against certain groups, that would have to be rectified. This kind of monitoring costs money. Some companies haven’t done it because they’re still too small; others lack the willingness or the awareness to change things.
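The vicious circle can be caricatured in a few lines of code: two sellers of identical quality start with slightly different numbers of ratings, and a ranking that always recommends the current leader makes the head start permanent. All numbers below are invented:

    # Deliberately simplified simulation of the rating feedback loop:
    # the platform always recommends whoever currently has the most
    # feedback, and the recommended seller then collects another rating.
    feedback = {"seller_a": 12, "seller_b": 10}  # invented starting values

    for _ in range(100):
        # The "algorithm": recommend the current leader to the next customer.
        leader = max(feedback, key=feedback.get)
        feedback[leader] += 1  # the recommended seller gets hired and rated

    print(feedback)  # {'seller_a': 112, 'seller_b': 10}: the gap only grows

Real ranking algorithms are softer than this winner-takes-all caricature, but the direction of the effect, the rich getting richer, is the same; the quota and bonus ideas mentioned above are attempts to break exactly this loop.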

Disclosing algorithms

Hannák is convinced this has to change. Appealing to people’s sense of responsibility usually doesn’t help much, so laws are needed that force companies to act responsibly. They should be obligated to disclose their algorithms so that what they do can be examined by independent parties. “In the analog world, discrimination is forbidden,” she says. “The same rules have to apply online.” At least, as Hannák points out, the EU is trying to regulate the online jungle and crack down on violations. Google, for example, was found guilty of distorting competition because the platform’s recommendations favored its own products. The EU is also providing plenty of research funding for efforts to develop systems that can monitor platforms. “That’s a ray of hope,” says Hannák. For her, it’s clear that transparency is needed. It’s the only way to make sure that the users of online platforms aren’t discriminated against and cheated.