
AI TEXT-TO-IMAGE ‘ART’ & FEMALE OBJECTIFICATION: WHAT NEXT? (first published in ‘Medium’)

  • Writer: Pam Saxby
  • Nov 1
  • 2 min read

Updated: Nov 14

female objectification and AI text-to-image eye candy

Empirical research suggests that “all women are subject to sexualized evaluation by men”, regardless of whether they fit “traditional gender norms about beauty” (Frontiers). As an earlier article on this topic illustrated, the sexual objectification of women has been a feature of countless cultures across the world since time immemorial.


Sadly, in the context of ‘western’ culture and its ingrained right to freedom of expression, the sexual objectification of women has become the norm: often unconscious, it is nonetheless deliberately reinforced by the media (Integrative Life Centre). AI text-to-image ‘art’ apps and online community platforms have simply taken this to another level by making the tools for creating oversexualised images of women widely accessible. Some platforms even make them available free of charge, so anyone can generate an AI woman with exaggerated body parts posing suggestively.


According to the landing page of one popular online community platform, images of women attract the largest number of views, ‘likes’ and comments. Many are so scantily clad and posing so seductively that it’s difficult to understand how they escape being labelled not-safe-for-work (NSFW). But escape they do — turning the platform into a paradise for pin-up girl enthusiasts who know where to look.


The nub of the NSFW screening problem seems to be cost: employing human moderators to catch images that slip through cracks in a platform’s automoderation system would make providing the service unprofitable. This is why popular community platforms tend to rely on users to report unlabelled NSFW images appearing in their feeds. Bear in mind that publishing images of provocatively dressed, suggestively posing women is allowed, as long as they are labelled NSFW.


Also, while AI text-to-image models typically include input and output filters, users have become adept at circumventing them. One common approach is to use words in a text prompt that describe what’s required without triggering the filter. Including ‘well endowed’ in a prompt instead of ‘huge breasts’, for example, could trick a model into generating an NSFW image it might otherwise block.
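To see why this kind of evasion works, here is a minimal sketch of a naive keyword-based prompt filter, of the sort a platform might layer in front of a model. The blocklist, phrases and function name are all illustrative assumptions, not any real platform’s implementation; real filters are usually more sophisticated (classifiers rather than string matching), but the weakness is the same in spirit: a synonym the filter has never seen sails straight through.

```python
# Illustrative sketch only: a naive substring blocklist filter.
# Real moderation systems are more sophisticated, but the synonym
# problem described in the article applies to any fixed word list.

BLOCKLIST = {"huge breasts", "nude", "explicit"}  # hypothetical phrases


def is_blocked(prompt: str) -> bool:
    """Return True if the prompt contains any blocklisted phrase."""
    lowered = prompt.lower()
    return any(phrase in lowered for phrase in BLOCKLIST)


# The exact phrase is caught...
print(is_blocked("portrait of a woman with huge breasts"))  # True
# ...but a euphemism the list has never seen is not.
print(is_blocked("portrait of a well endowed woman"))       # False
```

The filter rejects the first prompt because it matches a listed phrase exactly, while the second prompt, which requests much the same image in different words, passes untouched. Keeping such a list exhaustive is effectively impossible, which is why platforms fall back on output-side scanning and user reports.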


It’s a minefield even the most comprehensive legislation may struggle to navigate effectively. At the time of writing, the European Union and Britain were the only jurisdictions with anything remotely resembling potentially effective laws in place for regulating AI text-to-image apps and community art platforms. And the EU AI Act is being rolled out over time. Meanwhile, AI text-to-image app users concerned about the abuse of online community art platforms should file complaints with consumer protection authorities in their home countries.

