Vicente Ordóñez, a professor of computer science at the University of Virginia, has noticed a pattern in image-recognition software that he says indicates gender bias. This has caused him to question whether he and other researchers subconsciously inject biases into software.
According to the findings of Ordóñez’s research team, two of the most prominent research-image collections, one backed by Microsoft and another by Facebook, show a highly predictable pattern of gender bias in their depiction of activities. For example, images of cooking and shopping are linked to women, whereas images of sports and hunting are linked to men.
Ordóñez tested his theory by examining two large collections of photos used to “train” image-recognition software. Machines trained on such gender-biased datasets didn’t just duplicate the biases, they amplified them. For example, if a photoset linked women with cooking, software trained on those photos and their labels made the association even stronger, extending it to related concepts such as food and other household chores.
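The amplification effect described above can be sketched in a few lines of Python. The counts, labels, and the share of ambiguous images below are invented for illustration, not taken from the study; the point is simply that a model falling back on a dataset’s majority label pushes the predicted gender ratio past the ratio in its training data.

```python
from collections import Counter

# Hypothetical training labels: (activity, gender) pairs with a
# roughly 2:1 skew toward "woman" for cooking images.
train = [("cooking", "woman")] * 66 + [("cooking", "man")] * 34

def skew(genders):
    """Fraction of 'woman' labels in a list of gender labels."""
    counts = Counter(genders)
    return counts["woman"] / sum(counts.values())

train_skew = skew([g for _, g in train])  # 0.66

# A toy classifier that, for an ambiguous image, falls back on the
# dataset prior: the majority gender for that activity.
def predict(activity):
    return "woman"  # majority class for "cooking" in `train`

# Suppose 30 of 100 test images are ambiguous. The model labels all
# of them "woman", so its predicted skew exceeds the true 66/34 split:
# the bias in the data is amplified, not merely reproduced.
test_truth = ["woman"] * 66 + ["man"] * 34
preds = test_truth[:70] + [predict("cooking") for _ in range(30)]
print(round(skew(test_truth), 2), round(skew(preds), 2))  # 0.66 0.96
```

This is only a caricature of how amplification arises; the real systems learn the correlation statistically rather than hard-coding a fallback, but the mechanism, leaning on the prior when the image is unclear, is the same.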
Mark Yatskar, a researcher at the Allen Institute for Artificial Intelligence, expressed concern that this phenomenon could be replicated with other biases in other datasets, such as racial bias. Yatskar worked on the research project with Ordóñez and a team of researchers at the University of Washington.
“As sophisticated machine-learning programs proliferate, such distortions matter. In the researchers’ tests, people pictured in kitchens, for example, became even more likely to be labeled ‘woman’ than reflected the training data. The researchers’ paper includes a photo of a man at a stove labeled ‘woman,’” reports Wired.
If this pattern is duplicated in the practices of tech companies, it could influence photo-storage services, camera-equipped in-home assistants like Amazon’s Echo Look, and social media tools that use images to discern customer preferences.
Yatskar believes that as these systems become more sophisticated, gender bias will manifest in more tangible ways.
“A system that takes action that can be clearly attributed to gender bias cannot effectively function with people,” says Yatskar.
In other words, a robot in the future may be unsure of an individual’s kitchen activities, but will default to offering a man a beer and a woman a spatula.