
Putting a Stop to Online Exploitation: AI Deepfakes, Intimate Depictions, and Invasions of Rights

Since 2020, the number of artificial intelligence (AI) deepfake videos used for deceptive purposes has doubled from 50,000 videos to 100,000. AI deepfake technology gives online predators a means to create and distribute explicit imagery to exploit individuals for blackmail. Sen. Ted Cruz (R-Texas) has introduced the Take It Down Act (S. 4569), which would criminalize the intentional online publishing of intimate visual depictions of individuals, including images generated with AI deepfake algorithms.

The advent of artificial intelligence, better known as AI, has brought technological change to nearly every aspect of modern life. Some of these changes have proven positive, such as online chatbots that streamline customer service for greater workplace efficiency. Others have proven detrimental, such as the ability to create fake images of real people, known as AI deepfakes.

Deepfakes, as defined by Northwestern University, are created when machine-learning algorithms are combined with facial-mapping software to insert a person’s likeness into digital content without permission. AI deepfake technology may be used not only to publish an individual’s likeness in photos, but to depict that individual engaged in an intimate sexual act that never actually occurred.

Because deepfakes are not real images, there are legal questions about the extent to which they actually exploit the individuals depicted. Two primary patterns of exploitation have emerged. First, AI may be used in “sextortion schemes,” or “revenge porn,” in which criminals steal photos and videos from social media platforms to fabricate deepfakes, then use them to blackmail victims for money or to force victims to produce real sexual content. Second, teens are often exploited through AI-generated nude images. These fake nude depictions can then be used to blackmail teens into actions that violate their conscience.

Sen. Cruz’s Take It Down Act seeks to criminalize the creation of such deepfakes and end the online publishing of intimate sexual images without consent. Given the underhanded nature of AI deepfakes and the nonconsensual publication of explicit content, it is worth considering how such exploitation violates individuals on a level deeper than the merely physical.

First, individuals who are exploited via AI deepfakes or nonconsensual online publication experience an invasion of privacy. Although an AI deepfake does not depict a real scene, the imagery thrusts into public view an individual whose life was not previously known to the public. Their privacy is violated, and they are exposed to the stares of an online audience, regardless of whether the action depicted ever took place.

Second, individuals whose likeness is used in online exploitation experience a violation of their property. The most obvious example is when an individual is blackmailed for money or for explicit online content. The loss of resources, or the compromise of one’s decision-making faculties, amounts to a loss of personal property and, with it, a measure of dignity.

Moreover, a person’s likeness can be considered property in the legal sense, as multiple court cases attest. In Zacchini v. Scripps-Howard Broadcasting Co. (1977), the Supreme Court held that the First Amendment did not immunize a television station from liability for broadcasting Hugo Zacchini’s performance without his consent. Likewise, cases such as Lugosi v. Universal Pictures (1979) addressed whether a person’s likeness constitutes a protectable property interest deserving of respect like any other right. Sen. Cruz’s bill seeks to protect individuals from invasions of both privacy and property.

Invasions of privacy and property, however, do not compare to the deeper invasion of conscience that occurs when an individual is forced to capitulate to blackmail demands. The right of conscience is not a new concept in legal precedent; rather, it is approached as the nexus between religious liberty and freedom of speech.

Cases such as Masterpiece Cakeshop v. Colorado Civil Rights Commission (2018) and Arlene’s Flowers v. State of Washington (2019) both addressed an individual’s right to decline to serve clients in a manner that would violate his or her conscience. When a victim of an online sextortion scheme is blackmailed, the demand is sometimes for money but more often for explicit content that is the real-life recreation of an AI deepfake. In other words, victims are often strong-armed into selling their bodies online in order to prevent something perceived as worse from happening.

Concerned Women for America Legislative Action Committee therefore endorses Sen. Cruz’s Take It Down Act and urges you to email your Senators asking them to vote in support of S. 4569. It is imperative that legislation such as the Take It Down Act be used to spearhead the criminalization of online exploitation and end the invasion of individuals’ privacy, property, and conscience.