X is being flooded with graphic AI-generated images of Taylor Swift


Sexually explicit AI-generated images of Taylor Swift have been circulating on X (formerly Twitter) for the past day in the latest example of the proliferation of fake AI-generated pornography and the challenge of stopping its spread.

One of the most prominent examples remained on the platform for approximately 17 hours before it was removed.

But as users began discussing the viral post, the images spread further and were reposted by other accounts. Many remain up, and an avalanche of new graphic fakes has appeared since. In some regions, the term “Taylor Swift AI” became a trending topic, pushing the images to an even wider audience.

Swift fans spam the platform to cover up explicit deepfakes.

Swift’s fanbase has criticized X for allowing many of the posts to remain active for so long. In response, fans have flooded the hashtags used to circulate the images with messages promoting real clips of Swift performing, burying the explicit fakes.

The incident speaks to the very real challenge of stopping deepfake pornography and AI-generated images of real people. Some AI image generators have restrictions that prevent nude, pornographic, and photorealistic images of celebrities from being produced, but many others do not. The responsibility for preventing fake images from spreading often falls on social platforms, something that can be difficult under the best of circumstances and even harder for a company like X that has gutted its moderation capabilities.

The company is currently being investigated by the EU over allegations that it is being used to “spread illegal content and disinformation,” and it is reportedly being questioned over its crisis protocols after misinformation about the war between Israel and Hamas was found to be spreading across the platform.
