Automating Exploitation: A New Generation of Sex Trafficking

In April, the Minnesota Senate achieved a national first by passing a bipartisan ban on artificial intelligence “nudification.” Other states should move quickly to follow suit.

Rapidly evolving artificial intelligence technology has created a situation in which someone can be subjected to elements of sex trafficking without ever realizing it.

Nudification tools enable such exploitation by using generative artificial intelligence to produce explicit deepfake photographs or videos of real people. Users can generate a range of content through nudification applications, from images of partial undress to hardcore sexual content. Such media then circulates publicly through social media posts or is sold privately for profit.

As Rep. Jessica Hanson (DFL-Minnesota 55A), sponsor of the Minnesota bill, explained last month, “[Nudification technology has] empowered and enabled pedophiles and sexual predators around the globe. It has harmed children who are made victims by their cruel peers, women who are made victims by men they have trusted for decades.”

The experience of nudification victims shares elements of human trafficking as defined by the United Nations (UN): the “recruitment, transportation, transfer, harboring, or receipt of people through force, fraud, or deception, to exploit them for profit.” In this new frontier, the bodies of victims are virtually held, manipulated, and transferred for exploitative profit, inflicting deep psychological and sexual violence on victims, who have no autonomy in the process.

Online sexual exploitation through deepfake technology poses a particular danger to children. A recent UN study found that over 1.2 million children, or one in 25, disclosed that someone had manipulated their images into sexually explicit deepfakes in the past year. Another report found that one in eight teens knows another minor who has been victimized by deepfake nudes.

Though nefarious artificial intelligence nudification technology is a relatively new development, online sexual exploitation has long plagued Western culture. Trends in the pornography industry show a clear preference toward the sexualization of minors. Data from 2021 suggests that “youth” ranks among the most popular search terms in explicit internet use, and that “teen” is among the most popular search terms on the explicit website PornHub.

In addition to promoting the sexualization of minors, the pornography industry promotes unabashed violence and exploitation against women. A 2020 survey found that nearly 50% of pornographic videos contained physical aggression. In one interview, Carlo Scalisi, owner of the pornsite 21 Sexury Video, discussed why the industry likes to recruit “amateur” women, remarking that they “come across better on screen [because] they feel strong pain.”

Throughout the history of the pornography industry, pain and abuse have not been unfortunate byproducts; they have been a feature much sought after by exploiters. Violence is a major selling point of the product, whether the explicit material is artificially generated or physically recorded.

In Isaiah 1:17, God instructs His people to “learn to do good; seek justice, correct oppression; bring justice to the fatherless, plead the widow’s cause.” We have a moral obligation to protect those at greatest risk of exploitation, especially innocent children who deserve protection and care. 

Although the roots of sexual abuse and exploitation extend far beyond technological change, the rise of AI-generated deepfakes has intensified the threat. States should urgently follow Minnesota’s lead in confronting modern forms of sex trafficking.
