AI Nudify Apps Explode, Your Clothes Are No Longer Safe


Artificial intelligence is changing the way that we work. As we rush towards this uncertain future, legal, moral, and ethical questions regarding where and how AI is used are multiplying at an astounding rate.

In the past two years, there has been a proliferation of "undressing apps." The AI behind them has advanced so far that it easily surpasses the deepfake tech that was available just a few short years ago.

If it happens to you, the experience is enraging.

In most places, it is not illegal to create pornographic images that depict an adult, nor to distribute them online. The source images are often stolen from social media, and the results are then shared without the subject's consent or knowledge.

Graphika, a company that analyzes social networks, found that 24 million people visited these undressing websites in September alone. Most of the apps charge a monthly fee for access to the technology.

According to Time Magazine, Graphika found that the number of links promoting undressing apps on social media, including X and Reddit, has increased by more than 2,400% since the start of the year.

Santiago Lakatos, an analyst with Graphika, noted that the deepfake images of previous years were blurry and unconvincing compared with what these tools can produce now.


AI deepfakes have been used to target young women and girls. The Messenger reported in January that teenage girls at a high school in Westfield, New Jersey, were advocating for legal protections after classmates circulated AI-generated nude images of them around the school.

Graphika’s analysts warn that, in the end, “the increased prominence and accessibility of such services will likely lead to more instances of online harm.” These include the creation and distribution of non-consensual images of nude women, targeted harassment campaigns, and sextortion.

Eva Galperin, director of cybersecurity for the Electronic Frontier Foundation, said, "We see more and more people doing this with ordinary targets. You see it among high school and college students."

Just this month, a North Carolina child psychiatrist was sentenced to 40 years in prison for using undressing applications on photos of his patients. It was the first prosecution brought under a law banning the AI generation of sexually explicit images of minors.

The law struggles to keep up with the technology, and few rules protect the public from this brutal invasion of privacy. For now, social media platforms are largely left to police this content themselves.

I am not confident.

TikTok has blocked the term "undress," a popular search phrase associated with these services. The app warns anyone searching the term that it "may be associated with content or behavior that violates our guidelines." A TikTok spokesperson declined to provide further details. Meta Platforms Inc. has also begun blocking keywords associated with searches for undressing apps. A spokesperson declined to comment.

Deepfake technology used to undress women is a scandal, and deepfake technology that puts disinformation in the mouths of world leaders is not far behind. The risk is that, in response, AI becomes so tightly controlled that it loses its usefulness.

It would be a sad and gloomy day for all of us.