
TO SUM UP: you can lead a horse to water, but if it’s enmeshed in a collective anti-water mindset … (first published in ‘Medium’)

  • Writer: Pam Saxby
  • Nov 8
  • 2 min read
enmeshed, entangled, caught, trapped (or not?)

Having mulled over the pros and cons of naming and shaming the AI ‘art’-generating community platform that triggered this mini campaign, I’ve decided not to, at least for the time being. Exposing it would simply reinforce what may be a growing and increasingly widespread perception among its enthusiasts that this is a personal vendetta.


There are probably hundreds — if not thousands — of similar platforms offering people with limited artistic skills the opportunity to combine human imagination with machine intelligence to generate images from text prompts. For starters, a Bing search for a list of the most popular AI community ‘art’ platforms in the US found 23 on The Hive Index and 12 more on Olitt. Some are linked to Reddit, others to Discord (including the one I joined so naively).


My sense is that the community of which I’m a member (albeit reluctantly, and my membership is currently suspended anyway) appeals most to people with limited artistic ability, like me. According to a flurry of articles about it published across an array of online media towards the end of 2024, the platform has 25 million users. How many of those are active and regularly publish their images is a moot point. And how visually captivating those images may or may not be is, of course, subjective.


But it’s probably fair to say that some community members join for all the wrong reasons: manipulating the system and circumventing the auto-moderator filters to generate a portfolio of images that are never published but downloaded for use elsewhere. Starter/input images are fundamental to the nefarious but increasingly popular ‘nudifying’ and ‘deepfake’ process, so imagine what can be done with provocatively posed AI ‘pin-up girls’ (my own pet hate, as you’ve probably gathered).


Some community platforms have public ‘galleries’ — one of which can be found here in all its explicitness. If you can stomach it, a ‘girls’ search will illustrate how far the objectification of women has gone with the help of AI. And please bear in mind that I’ve provided the link as a resource for anyone as concerned as I am about this disturbing social issue. Of course, not everyone visiting the ‘gallery’ via this blog post will do so for altruistic reasons. I’m painfully aware of that. But sadly, there’s nothing I can do about it.


Against that backdrop, by raising awareness of the dangers of AI text-to-image community ‘art’ platforms, I’m hoping (against formidable odds) that my mini campaign will nevertheless:

  • encourage visitors to this site (and the readers of its blog posts) to motivate for AI ‘art’-specific legislation that protects women and children in particular from being exposed to the industry’s dark side

  • spark a review of AI text-to-image community ‘art’ platform standards and terms of service, with the aim of prohibiting the objectification of women and severely penalising non-compliance industry-wide, and

  • prompt some serious introspection among the members of AI text-to-image community ‘art’ platforms about their complicity in normalising the objectification of women, whether directly or by default.


Case in point: a platform offering its members a ‘safe browsing’ account setting is probably used by many of its supporters to create adult, not-safe-for-work images. Why else would the option be available? So, if your community platform falls into that category, take heed! The chance of becoming similarly enmeshed is not to be underestimated.
