
Diversity and Algorithms

Lorena Jaume-Palasí: "Only the hands of white people were recognized"

"It took a while to understand that automatic soap dispensers were a problem because they excluded darker-skinned people. For a long time, only the hands of white people were recognized by these soap dispensers." The underlying infrared technology was developed by white people, in all-male teams of Caucasian men who never considered that these are not mathematical but socio-technical systems, says the political scientist.

For the same reason, people of color would live dangerously if self-driving cars were developed according to today's standards, says Lorena Jaume-Palasí: "The facial recognition programs are extremely flawed. A prominent example was raised by Joy Buolamwini from MIT two years ago. She proved that major companies such as Amazon, IBM, and Microsoft had not developed good technology, because they trained that technology with very one-sided data." The image recognition technologies were initially unable to distinguish women from men or to recognize darker skin types. "There were darker-skinned people who stood in front of a camera, and the camera couldn't see that there was a person standing there!"

Lorena Jaume-Palasí, founder of the Ethical Tech Society in Berlin (dpa / Steffen Leidel)

Artificial intelligence will change societies significantly. But it can also reproduce prejudices. Put simply, AI is advancing automation, based on statistics and standards. These standards, however, are often built on premises that reflect unconscious bias. Language assistants, for example, have standardized language in such a way that dialects are not understood, because they are not represented in the training data. "Siri doesn't understand a Welsh or Scottish accent. You have to speak Oxford English in order to be understood by the language assistant," says Lorena Jaume-Palasí.

Defining standards means doing violence to minorities

Too often, the social context in which technologies are deployed is forgotten. It took a long time for social scientists to put standardization at the center of their research. Today we know that standards carry a form of violence with them, because they are always an expression of majorities. Even with a wide variety of standards, violence done to minorities can hardly be avoided. "There are always people who fall in between, between the categories. And it is the task of democracies, and especially of those developing such technologies, to constantly think about how to include what lies in between."

Understand algorithms and AI as socio-technical systems

These are simply not purely mathematical or IT systems that should be developed exclusively by data scientists. "It takes sociologists, it takes ethnologists who also understand human-machine interaction." In this context, Jaume-Palasí criticizes "iBorderCtrl", an AI-supported system that is supposed to analyze the facial expressions of people applying for asylum at the border. This polygraph test for migrants is meant to assess whether people are making true statements, making it a kind of lie detector. "There are many good reasons why someone might sweat a bit more or be nervous at a border, and it doesn't have to mean they are lying. The situation alone can cause that, for example when you are old, or when you are a woman from a different culture facing a group of men."

This is not about demonizing the technology. "But it is the first step in a chain of decisions in which a technology is programmed that is predestined for discrimination by its very design."

Statements by our interlocutors reflect their own views. Deutschlandfunk does not adopt the statements made by its interlocutors in interviews and discussions as its own.