X blocks Taylor Swift’s searches after viral AI nudes

Juliet Anine

X has blocked searches related to Taylor Swift in an effort to stop fake explicit images of the pop star from spreading on its platform.

The move comes in response to the circulation of deepfake AI-generated images that began appearing on social media last week.

Since last Sunday, searching for “Taylor Swift” on X has returned an error message: “Oops, something went wrong.” X implemented the block after committing to remove the deepfake images from its platform and promising to take “appropriate actions” against accounts that shared them.

In an official post from its Safety account, X emphasized its strict policy against “Posting Non-Consensual Nudity (NCN) images,” stating, “We have a zero-tolerance policy towards such content.”

Despite these efforts, some fake images of Taylor Swift persist on the platform, as bad actors bypass the search block by tweaking their search terms.

The deepfake images of Taylor Swift, which garnered 27 million views and approximately 260,000 likes within 19 hours last week, also spread to other social networks, including Reddit and Facebook.

A recent report by cybersecurity firm Home Security Heroes found more than 95,000 deepfake videos circulating online in 2023, a 550% increase over 2019.
