X blocks searches for “Taylor Swift” after explicit deepfakes go viral


X has disabled searches related to Taylor Swift on its platform in an attempt to stop the spread of fake pornographic images depicting the singer that began circulating on social media last week.

Since last Sunday, searches for “Taylor Swift” on X have returned the error message “Oops, something went wrong.” X blocked the search term after pledging to remove fake AI-generated images from the platform and take “appropriate action” against accounts that shared them.

“Posting non-consensual nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content,” X said Friday in a post from its official Safety account.


Still, some fake images of the pop star continue to circulate on the social network, and some bad actors have gotten around the search block by manipulating search terms, such as adding words between the artist’s first and last name, CBS MoneyWatch found in a test of X’s search engine.

When contacted by CBS MoneyWatch for comment, X responded: “Busy now, check back later.”

The deepfake images racked up 27 million views and approximately 260,000 likes in 19 hours last week, NBC News reported. They also landed on other social networks, including Reddit and Facebook.

The massive reach of the images exposes an increasingly pressing problem facing technology companies: how to remove deepfakes, or “synthetic media” images, from their platforms. In 2023, more than 95,000 deepfake videos were spread online, a 550% increase over the number of fake videos that circulated on the internet in 2019, according to the latest report from cybersecurity company Home Security Heroes.
