'There is no standard': investigation finds AI algorithms objectify women's bodies | Ar... - 0 views
www.theguardian.com/...i-algorithms-racy-women-bodies
AI internet analysis algorithms censorship gender sexuality bias
shared by Ed Webb on 12 Feb 23
- AI tags photos of women in everyday situations as sexually suggestive, and rates pictures of women as more “racy” than comparable pictures of men.
- Shadowbanning has been documented for years, but the Guardian journalists may have found a missing link to understanding the phenomenon: biased AI algorithms. Social media platforms appear to use these algorithms to rate images and limit the reach of content they consider too racy. The problem is that the algorithms carry built-in gender bias, rating images of women as more racy than comparable images of men.
- “You are looking at decontextualized information where a bra is being seen as inherently racy rather than a thing that many women wear every day as a basic item of clothing,”
- suppressed the reach of countless images featuring women’s bodies, and hurt female-led businesses – further amplifying societal disparities.
- The training data for these algorithms was probably labeled by straight men, who may associate a man working out with fitness but consider an image of a woman working out racy. It is also possible that the ratings seem gender-biased in the US and Europe because the labelers came from places with a more conservative culture.
- “I will censor as artistically as possible any nipples. I find this so offensive to art, but also to women,” she said. “I almost feel like I’m part of perpetuating that ridiculous cycle that I don’t want to have any part of.”
- Many people, including chronically ill and disabled folks, rely on making money through social media, and shadowbanning harms their businesses.