
Children Deserve Online Protection


The last meaningful legislation safeguarding children online passed in 1998: the Children’s Online Privacy Protection Act (COPPA). How effective was COPPA? One might argue, hardly. How could it be, when Google was founded that very year, only a month before its passage, and Facebook was founded six years later? In 1998, the public and legislators had no idea what protections would be necessary for the behemoth that the internet and social media would become in such a short period of time. The behemoth has indeed manifested, however, ravaging our children in its wake.

The National Center for Missing & Exploited Children’s (NCMEC) CyberTipline, founded in 1998, has received more than 144 million reports of Child Sexual Abuse Material (CSAM). In 2022 alone, there were 31.9 million reports. It is hard to imagine the depths of depravity that produce, proliferate, and view CSAM, but unfortunately, those depths exist and flourish online.

Historically, and still in legal statute, CSAM has been known as child pornography, but that has proven to be a misleading term, because children cannot consent to sexual activity. As you can imagine, these productions are the worst kind, subjecting children to bondage and even acts involving animals; these videos and pictures are always abusive. This is why CSAM is the more precise and more honest terminology for addressing this menace.

Thanks to recent investigations by The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst, we now know that Instagram and other social media platforms are doing the bare minimum to protect children from exploitation on their platforms. The researchers created test accounts that viewed a single CSAM account and were immediately flooded with similar “suggested for you” content. Using hashtags associated with underage sex, they found 405 sellers of CSAM. The group even found that blatantly disturbing hashtags, which should never be searchable, produced exactly the content they were looking for. If a group of external users can find this horrific content so easily and quickly, Meta (Instagram’s owner) surely could combat and shut down these dark corners of the internet. In 2023, Meta Platforms generated revenue of over 134 billion dollars; it could find room in its budget to address the issue.

COVID no doubt exacerbated the problem. NCMEC documented several disturbing online conversations on this very topic, including one quote: “Great, finally some new stuff out here. I hope that means those who are stuck at home during COVID-19 are creating some new material with their kids?!?” As sick as the quote is, it is real and should not be avoided because of the disgust it rouses. If it makes us uncomfortable, think of the sweet, precious child who has to endure it. An estimated 41% of child trafficking is facilitated by family members or caregivers, meaning these children can never escape their abusers.

An easy ask of Congress in this area is to amend all current statutes that use the term “child pornography” to use the more precise term “Child Sexual Abuse Material.” Though it may seem like a small ask, clear terminology ensures that we put the emphasis on the children being abused and helps us identify the perpetrators as the exploiters they are. We have much further to go when it comes to protecting children online, and we are committed to taking every step available to defeat this evil.