EWC Community

As Robots Become Generally More Widespread, so Do Sexism and Racism



By: Benjamin He


When robots were asked to scan boxes with people's faces on them and select the box that best represented the word "criminal," they repeatedly chose an image of a Black man's face. That, by itself, is already a red flag. The study, conducted by researchers at Johns Hopkins University and the Georgia Institute of Technology, was released only last month and revealed that the robots' artificial intelligence harbors some very dark biases.


When asked the same question with the word "janitor" instead, the robots kept selecting boxes with the faces of people of color.


The virtual robots, programmed with a popular artificial intelligence model trained on billions of captioned images, showed some of the first signs that robots can be racist. They also appeared to be sexist: when given words like "homemaker," they chose pictures of women.


Humankind has already poured billions upon billions of dollars into robots that make our lives easier and replace humans in jobs, from flipping hamburgers and building cars to small robots with jetpacks that zoom around looking for survivors after a devastating earthquake. As robots have become increasingly popular over the years, tech ethicists and researchers warn that this may result in “unforeseen consequences down the road.”


“With coding, a lot of times you just build the new software on top of the old software,” said Zac Stewart Rogers, a supply chain management professor from Colorado State University. “So, when you get to the point where robots are doing more … and they’re built on top of flawed roots, you could certainly see us running into problems.”


Researchers note that robots are often perceived as "neutral" on matters of race and sex, in part because many of their tasks seem to have nothing to do with those issues. A robot whose job is simply to move boxes around, for example, would appear untouched by racism or sexism.


For many, however, the booming robotics industry is no cause for concern. The automation market is expected to leap from $18 billion to $60 billion by the end of the decade as robots become more widespread, and the use of robots is expected to grow by around 50 percent.


In the experiment itself, the researchers gave the robots 62 commands. When prompted with "homemaker," the robots chose Black and Latina women more often than white men. When asked to identify "criminals," they chose Black men 9 percent more often than white men; for "janitors," they chose Latino men 6 percent more often. They also chose men more often than women for "doctor."


Andrew Hundt, a postdoctoral fellow from the Georgia Institute of Technology and lead researcher on the study, stated that this type of bias could have real-world implications.

If, say, a robot trained with such an AI were tasked with pulling products off shelves, it might see toys or books with images of people on them and skew toward items featuring white men more than others.


For example, a house robot may be asked to bring a beautiful doll to a child, and the robot could return with a white doll.


Rogers, of Colorado State University, said the problem is not yet severe given how robots are currently used (flipping hamburgers, for instance), but it could become serious within a decade as new robots gain more advanced AI. If companies wait to make changes, he added, it could be too late.


This isn't just a story about robots. Human biases have spread so far that they now shape things that aren't even alive. The issue is no longer only about us; it involves everyone, and since we all got roped into it, we may as well try to do something about it.


Source: The Washington Post.



Link: Robots trained on AI exhibited racist and sexist behavior - The Washington Post.pdf
