It has been over a year since Mark Zuckerberg stood in front of a room full of grieving parents and apologized for the many children who have been irreversibly harmed by his online platforms. On that day, he promised to make things right and to protect future users from exploitation. But a Senate hearing last week revealed that Meta has not only failed to make its products safer for children but may have been actively working against that goal.
Is Meta, Facebook’s parent company, prioritizing profits over its users? Its products – from Facebook and its messaging apps to Virtual Reality (VR) headsets – have all drawn pushback from mothers because they are dangerous for kids.
An internal Meta document from 2021, unsealed in a lawsuit brought by the attorney general of New Mexico, revealed that as many as 100,000 children are harassed on Facebook and Instagram every single day. According to the National Center for Missing and Exploited Children, in 2022, Meta platforms accounted for nearly 85% of reports of child sexual abuse materials (CSAM). In 2025, that number actually dropped, not because Meta implemented new safeguards for kids but because the rollout of end-to-end encryption made many instances of CSAM invisible to authorities.
The dangers of these platforms are, by now, well known. What is new is the lengths to which the company will go to suppress the data proving it.
Earlier this month, two former product safety researchers for Meta testified before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law that Meta actively suppresses research showing just how dangerous its products, particularly its VR products, are. Jason Sattizahn and Cayce Savage are part of a larger group of whistleblowers whose claims were reported by the Washington Post a few days before the hearing.
Zuckerberg has fully embraced VR as integral to the company’s future. The 2021 rebrand from Facebook to Meta was part of this vision, with “meta” being a nod to the virtual reality metaverse. The company has invested nearly $100 billion in developing headsets, glasses, and software for VR. Meta thus has an enormous financial interest in ensuring that any evidence the product is unsafe does not affect its bottom line.
This is perhaps why Sattizahn and Savage were both repeatedly told not to investigate, and even to cover up, evidence of the harms that VR technology poses to children.
Savage told the committee that “it is not uncommon for children in VR to experience bullying, sexual assault, to be solicited for nude photographs and sexual acts by pedophiles, and to be regularly exposed to mature content like gambling and violence, and to participate in adult experiences like strip clubs and watching pornography with strangers.” However, there is no telling just how many children are exposed to this kind of content because “Meta would not allow [Savage] to conduct that research.”
Savage also flagged for Meta the dangers of Roblox, an online game platform popular with kids. It has become widely known that Roblox is used by coordinated pedophile gangs to exploit vulnerable children. Savage told Meta that Roblox should not be hosted on its VR platform, but as of today, it remains available there.
Jason Sattizahn, who worked for the company for over six years before raising concerns about Meta’s practices and subsequently being fired, told senators about multiple instances in which Meta’s legal department blocked his efforts to reveal safety flaws in its VR products. In one instance, while conducting research on VR use in Germany, Sattizahn found that “underage children using Meta VR in Germany were subject to demands for sex acts, nude photos, and other acts that no child should ever be exposed to.” In response to that discovery, “Meta demanded that we erase any evidence of such dangers that we saw.”
Toward the end of the hearing, Sen. Josh Hawley (R-Missouri) asked whether Meta could be a force for good in the world, whether it was capable of changing course and ensuring its products are safe to use and valuable to society. Sattizahn answered that after working there for six years in an effort to reform the company from the inside, he had concluded that “[Meta] is aggressively ambivalent to people.”
For its part, Meta has claimed that the whistleblowers’ testimonies are misleading and based on “selectively leaked internal documents.” But one must ask: why do any documents exist showing that Meta deleted evidence of child exploitation or suppressed research into product safety? Meta should correct course instead of trying to justify its conduct.
But perhaps an even more important question is why Washington has still not acted to address this problem beyond holding hearings like this one. No meaningful child online safety legislation has passed Congress since 1998, long before the advent of social media, despite numerous bills being introduced. Bills such as the Kids Online Safety Act (KOSA), the STOP CSAM Act, the App Store Accountability Act, and the SCREEN Act, all of which CWALAC has endorsed, would take important steps toward protecting kids online.
Meta has supported some of these efforts to protect children online, but that is clearly not enough. The shocking details revealed at this hearing also raise the question of whether the company is undermining those very efforts at the same time. It is imperative that lawmakers continue to develop more tools to protect the innocent from online predators who seek only to use and abuse them.