Senate Whistleblower Testimony Spurs Hawley Push to Let Families Sue Meta Over VR Child-Safety Claims
Former Meta researchers told senators the company suppressed evidence of sexual exploitation and prioritized engagement; Meta calls the allegations selective and inaccurate as lawmakers press for new legal and regulatory tools

Two former Meta researchers told a Senate Judiciary subcommittee that the company concealed evidence showing children were exposed to sexual solicitations and other harm on its virtual reality platforms, prompting Sen. Josh Hawley, R-Mo., to urge Congress to "open the courtroom doors" and allow families to sue the company.
Cayce Savage and Dr. Jason Sattizahn testified on Sept. 9, 2025, that Meta at times suppressed, edited or ordered deletions of research documenting child sexual abuse, halted age-verification studies and permitted interactions between AI chatbots and minors that could become romantic in tone. Meta disputed the witnesses' account, saying the claims were based on selectively leaked internal documents and noting it had approved hundreds of Reality Labs studies since 2022.
Sattizahn, who said he was sent to Germany after the country briefly barred Meta's VR sales over data concerns, testified that his research there found underage users in VR were subjected to demands for sex acts, requests for nude photos and other exploitative conduct. He said that, in response, Meta instructed him to erase evidence and to avoid collecting data on emotional and psychological harm in future studies.
Savage, who said she led youth-safety research, testified that social VR spaces including an Oculus app store version of Roblox had been exploited by coordinated predators. She told senators she observed virtual "strip clubs" where children were paid to expose themselves and said in her view nearly every child who entered a social VR space would encounter serious inappropriate material.
Hawley cited that testimony in arguing that Meta CEO Mark Zuckerberg misled Congress when he testified in January 2024 that the company does not want users under 13 and that anyone under that age would be removed. "I don't see how you can square what he told us under oath last year with what these whistleblowers said today," Hawley said following the hearing. He called for Zuckerberg to return to testify under oath and for legislation allowing civil suits against social platforms that fail to protect minors.
Meta's response, provided to Fox News Digital, called the hearing's claims "nonsense," saying they relied on a curated selection of documents and that there was never a blanket prohibition on research involving young people. The company said that since early 2022 it had approved nearly 180 Reality Labs-related studies addressing youth safety and well-being. Meta also said it is training its AI systems not to engage teenagers on topics including self-harm, suicide, eating disorders and potentially inappropriate romantic conversations, and that it is limiting teen access to a subset of AI characters "for now."
The testimony drew bipartisan concern from senators. Republican Sen. Marsha Blackburn, who closed the hearing, invited Meta to respond to the witnesses and said the session reflected broad frustration with social platforms and virtual-reality environments that may be harming children. Several lawmakers said they would pursue measures to strengthen protections for minors online and to create avenues for accountability.
Savage and Sattizahn described a pattern they said prioritized engagement and monetization over safety. Savage told the panel she faced internal resistance when, before broader deployment, she flagged certain apps and features as unsafe for children. Sattizahn said some company decisions curtailed age-verification efforts and that researchers who sought to document harm were constrained by legal or corporate directives.
Roblox, a popular online gaming and social platform, has been at the center of broader concerns about child safety in virtual environments. Savage said Robux — Roblox's virtual currency — could be converted into real money, creating incentives for exploitative behavior. State attorneys general and other prosecutors have in recent years pursued cases and investigations tied to online child exploitation on game and social platforms.
Germany's 2021 move to ban sales of Meta's VR headset over data-privacy concerns, and the subsequent resumption of sales in 2022, were cited by Sattizahn as motivation for the company's decision to study perceived risks in that market. The witnesses' allegations follow public scrutiny over how large technology companies build and deploy AI features and manage minors' access to digital services.
Hawley, who has previously advanced legislation in the Judiciary Committee to allow victims of online child sexual abuse to sue platforms, said civil liability is necessary to force change. "I don't think we're going to see real change at these companies until this becomes law and parents and victims can get into court and hold these people accountable," he said after the hearing.
Meta's public statements emphasize internal research approvals and efforts to refine AI responses to teenagers, but lawmakers at the hearing said those steps may not address the core concerns raised by the whistleblowers. The senators did not set an immediate timeline for follow-up actions, but several said they would seek additional testimony and legislative remedies.
The hearing adds to ongoing congressional scrutiny of major technology firms' handling of child safety and AI, a policy area that has produced state and federal inquiries, litigation and proposed reforms. Lawmakers from both parties have increasingly signaled interest in tightening oversight of AI-driven features and in creating clearer legal pathways for victims of online harms to seek accountability.
As the matter progresses, potential next steps identified by senators include summoning Meta executives for further testimony, advancing statutory changes to allow more civil suits against platforms, and exploring regulatory measures to strengthen age verification and content moderation in immersive and AI-enabled environments.

Meta did not immediately provide additional documentation to counter the specific claims recounted by Savage and Sattizahn during the hearing. The company has previously argued that platform safety requires a combination of technology, policy and partnerships with law enforcement and child-safety organizations. Senators and witnesses at the Sept. 9 session left open the prospect of further hearings and legislative action as they seek more details and accountability on child safety in virtual reality and AI-driven interactions.