
AI EROTICA & FEMALE OBJECTIFICATION ON COMMUNITY ‘ART’ PLATFORMS

  • Writer: Pam Saxby
  • Nov 10

Updated: Nov 13

AI ‘art’, the objectification of women and erotica

Anyone browsing casually through the pages of this website might be forgiven for wondering what’s sinister or even mildly provocative about some of the images I’ve used. If comments on the site’s blog posts were permitted, someone might have asked why such mildly erotic and occasionally even superficially innocent images of women were included at all. “How can a fully clothed woman enjoying a cup of coffee in the company of her cat possibly perpetuate the objectification of women?” they might ask – having missed the cleavage peeping modestly from the neckline of the subject’s sweater.


Not wanting to encourage the objectification of women or promote the darker side of AI text-to-image community ‘art’ platforms, I deliberately included innocent-looking AI-generated pictures of women to illustrate an algorithmic bias in the image-generating models available on those platforms. Because, more often than not, a text prompt including the word ‘woman’ will produce an image of one with cleavage – or in clothing that reveals exaggerated curves. Which – depending on a viewer’s ‘taste’ in women – may be perceived as erotic.


And believe me, the images of more provocatively posed, scantily clad women portrayed on the pages of this website are mild by comparison with many of those available on the platform of which I’m still reluctantly a member (having paid for a subscription that only ends in February 2026, notwithstanding my recent temporary suspension for not toeing the line).


In his Psychology Today article, ‘What distinguishes erotica from pornography?’, Leon Seltzer suggests that “if the subjects are portrayed in a manner that focuses on their inner and outer radiance, their fleshy vitality, and the work itself seems to manifest a passionate and powerful affirmation of life and the pleasures of this world, then I think we’re talking erotic”. In the same piece – written long before AI and the ever-increasing popularity of text-to-image ‘art’ – Seltzer describes pornography as reducing its subjects to “so many body parts”, the beauty of which (if any at all) “appears subordinate to the overriding purpose of arousal”.


In that context, I’ve provided links to the online ‘galleries’ of two AI text-to-image ‘artists’, publicly available on the websites of the community ‘art’ platforms to which they belong: ‘Shockwave’ (who creates only ‘pin-up girl’ images) and ‘SirlamaAi’ (who focuses on well-endowed anime ‘girls’).


Separate links to the results of a search for ‘girls’ on one platform and ‘girl’ on another illustrate the disturbing mix of ‘erotic’ AI-generated images of women alongside images of early-adolescent girl children and innocent pre-adolescent girls (although the number of harmless images varies from day to day, they are generally outnumbered by erotica). There are also a number of images with faces similar to those of the ‘childlike sex dolls’ causing such public outrage in France (Time).


The worrying possibility is that – in the context of the objectification of women – this could lead to a blurring of the distinction between a thoughtful facial expression with soulful eyes (or a child’s innocent glance) and one deliberately intended to be provocative. Because, if you look closely at the array of female images accessible on both these platforms, you may notice how few of them are smiling (especially in a sweet, innocent way).


In fact, on each of the community platforms used here to illustrate the extent to which the objectification of women is encouraged, most AI-generated images of the female form give their subjects vacant facial expressions – although that ‘come hither’ look is also popular.


Why (a rhetorical question, naturally)? Because they’re commercial ventures, silly – and erotica sells. Which is why we need the regulators to step in and clamp down. Now … before it’s too late.
