At what age does a child reach “Digital Adulthood”?
That is, at what age can a minor independently consent to the collection, use, and sharing of their personal data without parental involvement?
According to the latest version of the Children’s Online Privacy Protection Act (COPPA 2.0), which recently passed in the Senate by unanimous consent, that age is 13. Yet no major medical or psychological body would consider 13 a developmentally appropriate age to let children make high-stakes decisions about their online life.
Early adolescence is a particularly volatile developmental stage. The prefrontal cortex, the region of the brain responsible for impulse control, planning, judgment, and problem-solving, is still maturing. At the same time, heightened dopamine-driven activity in the limbic system amplifies emotional reactivity and reward-seeking behavior. These neurological changes make teens more sensitive to stress, more driven by immediate rewards, and less able to anticipate long-term consequences, all conditions that directly heighten online vulnerability.
These biological shifts occur alongside a major social transition: teens begin looking to peers rather than parents for cues, affirmation, and identity formation. The perceived need for peer support, community, and an identity separate from their parents is one of the primary forces pushing children onto social media platforms in the first place. This combination of immature executive function, heightened emotional responsiveness, and intense peer-driven identity seeking creates a “perfect storm” of susceptibility to online risks.
The data backs this up:
- A 2025 nationally representative survey of 3,466 students aged 13–17 found that more than half (58%) had experienced cyberbullying at least once; 33% had been cyberbullied in the previous 30 days.
- According to data from Bark’s monitoring of billions of online activities, 10% of teens encountered predatory behavior online, including persistent grooming, sextortion, and online enticement.
- The National Center for Missing & Exploited Children (NCMEC) reported major increases in online crimes in the first half of 2025: online enticement reports rose from 292,951 to 518,720 year-over-year, and financial sextortion cases rose from 13,842 to 23,593. While these figures apply to all minors, teens 13–17 constitute the largest at-risk demographic for enticement and sextortion, according to NCMEC's historical age-distribution data.
- NCMEC highlighted a disturbing rise in sadistic online enticement, involving violent coercion, forced self‑harm, and grooming via chat platforms such as Discord and Roblox. Examples include minors manipulated into live‑streamed self‑harm. Teens (13–16) are among the most targeted age groups.
- One in six individuals report being a victim of online sexual abuse before age 18, according to published research.
Taken together, the neurological immaturity, social vulnerability, and documented high rates of cyberbullying, predation, and coercive harm reveal a profound mismatch between what this version of COPPA assumes 13‑year‑olds can handle and what the evidence shows they are actually equipped to manage.
There is little doubt that setting the age of digital adulthood at 13 was a capitulation to tech lobbyists. Tech and social media companies are champing at the bit to hook users at a young age.
Melissa McKay of the Digital Safety Institute recently highlighted the message Google sends to children, encouraging them to remove parental controls as soon as they turn 13. “A trillion-dollar corporation is directly contacting every child to tell them they are old enough to ‘graduate’ from parental supervision. The email explains how a child can remove those controls themselves, without parental consent or involvement … Google is asserting authority over a boundary that does not belong to them. It reframes parents as a temporary inconvenience to be outgrown and positions corporate platforms as the default replacement. Call it what it is. Grooming for engagement. Grooming for data. Grooming minors for profit.”
Compounding the problem, the Federal Trade Commission issued a policy statement just two weeks ago to “incentivize the use of age verification technologies to protect children online.” Sounds like a positive development, doesn’t it? Until you read further and realize that the policy statement explicitly authorizes these companies and platforms to collect and store personal information on children “without first obtaining verifiable parental consent,” a change diametrically opposed to the original intent of COPPA.
Keeping children off platforms designed for adults is a priority, but the government appears to trust Google and Meta more than it trusts parents, even as parental controls are systematically stripped away.
If Congress’s goal is truly to strengthen children’s online privacy, then the solution cannot be to saddle 13‑year‑olds with adult‑level responsibilities while simultaneously giving Big Tech unfettered access to their personal data.
We are thankful that this problem is getting more attention and that some progress has been made, but much more is needed. Updates to the Children’s Online Privacy Protection Act must reflect an understanding of developmental reality and encourage age-verification methods that protect children without creating new risks. In a digital ecosystem that already fails to protect adults’ most sensitive data, treating 13-year-olds as fully responsible for their own long-term privacy is reckless.