
ALGORITHMIC BIAS & FEMALE OBJECTIFICATION (first published on Medium)

  • Writer: Pam Saxby
  • Oct 31
  • 3 min read

Updated: Nov 14

AI text-to-image community ‘art’ platforms: a booby trap?

I’m no algorithmic fundi, but this I know: AI text-to-image generator models can be trained to reflect a bias. And when it comes to the way women are portrayed, that’s worrying.


Of course, fascination with the female body has been a feature of human society for centuries, regardless of cultural and regional differences. And as societally subservient creatures, we women couldn’t do much about it.


But women haven’t always been subordinate to men — at least according to the 19th-century German social theorist and philosopher Friedrich Engels. Apparently, it was only with the demise of a “nomadic, hunter-gatherer” lifestyle and the emergence of “settled agricultural communities with surplus production” that men began accumulating and controlling private property — including their women, who then conveniently became “economically dependent” (CSR Education).


Engels argued that women’s sexuality and reproductive capacity were “commodified” in that context — mainly to ensure the paternity of heirs. It was taken for granted that they would raise the children and attend to household chores in exchange for the security of a permanent place in the homestead of a strong male provider and protector … Or so the theory goes …

Over hundreds of years, commodified sexuality and reproductive capacity — combined with economic dependence — embedded a sense of inferiority and submissiveness in the collective female psyche. But eventually, women began to recognise the power of their sexuality as objects of desire. The rest is history, most recently manifest in the pre- and post-world-war popularity of pin-up girls (Wikipedia).


AI text-to-image ‘art’ has simply taken that to a new level. But here’s the thing: sexual objectification of the female form may well be responsible for the prevalence of people-pleasing behaviour among adolescent girls and young women. Conditioned from childhood to believe that worth is linked to appearance, they internalise this message — eventually seeing themselves as physical objects rather than human beings. How they look becomes more important than who they are and what they do — a phenomenon known as self-objectification.


Writing for The Conversation, Peter Koval, Elise Holland and Michelle Stratemeyer explain how this happens, drawing on empirical research showing that when women are sexually objectified by others, “they momentarily view their own bodies from the perspective of the person objectifying them”. While this may trigger “both positive and negative emotions”, over time “the self-objectification that arises as a result of being objectified by someone else appears to have an exclusively negative impact” psychologically. The targeted woman becomes “preoccupied with … (her) physical appearance and sexual value to others” — often leading to shame, anxiety and long-term emotional damage.


Which is where algorithmic bias comes in. Apparently, the ‘most effective’ AI text-to-image models “have generally been trained on massive amounts of image and text data scraped from the web” using an automated bot or web crawler (Wikipedia). It’s when the data used to train an AI algorithm is neither diverse nor representative that a biased output can occur (Chapman University AI hub) — a likely outcome if most images of women on the web are oversexualised, feature exaggerated body parts and depict suggestive poses.
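
To make that mechanism concrete, here is a toy Python sketch (entirely my own illustration; the labels and the 80/20 split are invented, not taken from any real dataset). A model trained to mirror an unbalanced dataset simply reproduces the imbalance:

```python
# Toy illustration of representation bias: a model trained to mirror an
# unbalanced dataset reproduces that imbalance in what it generates.
from collections import Counter
import random

random.seed(0)  # reproducible example

# Hypothetical labels attached to scraped images of women.
# The 80/20 split is invented purely to illustrate the point.
scraped_labels = ["sexualised"] * 800 + ["neutral"] * 200
counts = Counter(scraped_labels)

# Sampling in proportion to the training data: roughly four
# 'sexualised' outputs for every 'neutral' one.
outputs = random.choices(
    population=list(counts),
    weights=list(counts.values()),
    k=10,
)
print(outputs)
```

Scale that toy example up to billions of scraped images, and the skew becomes the model’s default idea of what a ‘woman’ looks like.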

And it gets worse. Because human annotators control the process of labelling data for analysis, cultural and/or personal biases may further influence data interpretation and the algorithmic outcome (Chapman University AI hub).
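
The same goes for labelling. Here is an equally contrived sketch (the image name, annotators and labels are all hypothetical) of how a majority vote turns the dominant annotators’ judgement into the ‘ground truth’ a model learns from:

```python
# Toy illustration of annotator bias: a majority vote turns the dominant
# annotators' judgement into the training label; the dissent is discarded.
from collections import Counter

image = "portrait_001.jpg"  # hypothetical image

# Three hypothetical annotators applying their own cultural/personal lens.
annotations = ["suggestive", "neutral", "suggestive"]

# The 2-1 majority becomes the 'ground truth' the model trains on.
label, votes = Counter(annotations).most_common(1)[0]
print(f"{image} -> {label} ({votes}/{len(annotations)} votes)")
```

If two of every three annotators share the same lens, their reading becomes the dataset’s reality.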


Add social media to the mix — with its pressure to attract ‘followers’ and ‘likes’ — and self-objectification can spin out of control. Some platforms even include image-enhancing software for pre-posting photo tweaking. As for social media account avatars: AI text-to-image models make it possible to take self-objectification to ever more worrying extremes. Not only can a young woman create a fake face, body and lifestyle (sometimes anonymously); online, she can communicate and behave in ways that might well be out of character or even frowned upon in real life.


Social media addiction, a preference for online virtual relationships, AI-generated personas — all facilitated by those superficially innocuous community ‘art’ platforms I wrote about yesterday. Scary times …

 
 
 
