Taylor Swift: ‘Disgusting’ AI pornographic images of the singer cause stir on X

Disturbing AI-generated images of popular singer Taylor Swift, which have been circulating on the internet for days, have once again put the singer at the center of a huge controversy.

Taylor Swift is one of the most successful figures of the moment, but sadly, that success is a double-edged sword: just as it has brought her fame and fortune, it also sometimes causes her trouble.

The AI-generated images portrayed the singer in hypersexualized situations, most of them focused on her recent relationship with Kansas City Chiefs tight end Travis Kelce.

According to reports, the images were first posted on the notorious porn site Celeb Jihad, but from there they soon made the jump to social media.

The explicit images spread like wildfire, and for a couple of days they have appeared repeatedly on the timelines of X and Instagram, provoking the fury of the singer’s fans.

Social media users are angry

Many social media users have expressed their outrage at the images. One of the most pointed posts asks, “How is this not considered sexual assault? I cannot be the only one who is finding this weird and uncomfortable?”

The post continues: “We are talking about the body/face of a woman being used for something she probably would never allow/feel comfortable. How are there no regulations or laws preventing this?”

Another made a forceful call: “I’m gonna need the entirety of the adult Swiftie community to log into Twitter, search the term ‘Taylor Swift AI,’ click the media tab, and report every single AI-generated pornographic photo of Taylor that they can see because I’m f*cking done with this BS. Get it together Elon!”

It did not end there. Another user wrote, “Whoever is making this garbage needs to be arrested. What I saw is just absolutely repulsive, and this kind of s*ht should be illegal … we NEED to protect women from stuff like this.”

The law

Although regulation is lacking at the federal level, states such as Georgia, Hawaii, Virginia, New York, Minnesota, and Texas have laws against so-called deepfake pornography.

In California and Illinois, victims can sue creators for defamation.

The site where the images were posted has come under constant attack, as NGOs and women’s rights activists have refused to sheathe their swords.

However, it continues to operate, while the social media platforms where the images are spreading, such as Instagram, Reddit, 4chan, and X, have been overwhelmed by the avalanche of images.

Swift has yet to issue a statement on the recent intrusion.
The social media platforms where the images have appeared are also silent on the matter, so we will have to wait and see how the drama unfolds.
