The Express Gazette
Sunday, March 1, 2026

Meta whistleblowers urge Congress to curb social media harm as lawmakers consider Kids Online Safety Act

Testimony from former Meta researchers highlights alleged design choices that prioritize engagement over child safety, as the push for stronger protections faces political headwinds.


Two former Meta researchers told a Senate Judiciary subcommittee this month that Facebook and Instagram knowingly expose children to harm in pursuit of profits, describing a system in which research is constrained, records are manipulated, and child safety is subordinated to engagement metrics. The whistleblowers, Jason Sattizahn and Cayce Savage, testified that Meta’s internal practices included erasing evidence of sexual abuse and using third-party contractors to house harm reports so the company could plausibly claim ignorance of findings. They said product teams were discouraged from taking steps that might reduce engagement, and that lawyers controlled what topics could be researched and how findings were framed.

The hearing occurred as Congress weighs stronger rules to protect young users, including the Kids Online Safety Act, or KOSA, which would impose a duty of care on online platforms to design products that prevent and mitigate dangers to minors. Senators from both parties expressed urgency, framing the testimony as a call to action rather than a warning about hypothetical risks. The witnesses argued that current accountability mechanisms do not adequately deter practices that encourage heightened screen time, targeted advertising to youths, or the spread of exploitative content.

Becca’s story, recounted by her mother, Deb Schmill, sits at the heart of the debate. Becca, who grew up in Massachusetts, planned to attend the University of Richmond, but her life unraveled after she joined social platforms at a young age. Deb Schmill described a sequence that began with a party arranged online, where Becca was drugged and raped by older men she had met on the internet. A compromising photo shared on Snapchat then subjected her to relentless cyberbullying, eroding her self-esteem and driving her toward drugs. The family relocated to Maine to remove Becca from local drug markets, but online access to illicit substances remained all too easy. A fentanyl-laced supply purchased via social apps contributed to a fatal overdose when Becca was 18. Deb Schmill said the family’s efforts to shield Becca from danger were undermined by the platforms’ reach and persistence, emphasizing that there was no easy escape from the online world.

The personal tragedy frames a broader policy argument: if KOSA becomes law, platforms would be compelled to report on how often minors experience harms and what steps they take to mitigate them, and they would be required to design products that reduce addiction, sexual exploitation and cyberbullying. The bill would also give parents tools to opt out of personalized algorithms and strengthen privacy protections for young users, while limiting features that encourage addictive use. Advocates say fewer than 10% of parents currently use the available parental controls, and researchers have found those controls often fail to stop determined teens from circumventing safeguards.

Proponents say KOSA would create a formal duty of care for social networks similar to other industries that market products to children, closing gaps that allow platforms to claim ignorance of harm. Savage, who led research on youth safety at Meta for four years, told lawmakers that the company’s research was restricted and that researchers seldom had access to full data, with the implication that the company could avoid accountability by default. “It was not uncommon in virtual reality for children to experience bullying, sexual assault, to be solicited for nude photographs and sexual acts by pedophiles, to be regularly exposed to mature content like gambling and violence and to participate in adult experiences like strip clubs and watching pornography with strangers,” she said. She added that Meta would not allow researchers to determine the age distribution of users in VR labs, and that acknowledging underage users would force the company to take action it preferred not to take because it would reduce reported user numbers. “It is more profitable,” she said, “to pretend to have no way of better identifying the real ages of their users.”

If KOSA were law, the companies would face a duty to identify and mitigate harms, to disclose the frequency of incidents involving minors, and to demonstrate the steps they are taking to protect young users from sexual predation, bullying and other online threats. The bill would also require platforms to report on their progress in reducing harms, making it harder to bury unfavorable findings in internal files.

The current political landscape complicates passage. Senate lawmakers approved KOSA last year, but House leaders did not bring it to a vote, leaving supporters hoping for a bipartisan push this session. Critics, led by some in the technology industry, argue that the bill would impose new compliance costs and could curb innovation. The industry has spent millions on lobbying to defeat or weaken the measure, while Meta and other companies have highlighted investments in safety initiatives and user protections as evidence of responsible conduct.

The policy debate occurs as Meta has signaled large-scale, long-term investments in artificial intelligence, including a $10 billion data center planned for Louisiana, the home state of Speaker Mike Johnson and House Majority Leader Steve Scalise. Supporters of KOSA say the bill would not stifle innovation but would ensure that growth does not come at the expense of children’s safety. Opponents contend that current safety programs and user controls are sufficient and that further restrictions could hamper legitimate uses of social media by young people.

American teens already spend an average of about nine hours a day online, a statistic cited by advocates to illustrate the scale of exposure and the challenges of self-regulation. Parents and educators have long argued that tech companies must shoulder responsibility for the content and environments their platforms create, not just provide tools that allow users to opt out of certain features after harm has occurred.

The witnesses stressed that the conversation is not about banning online platforms but about enforcing accountability and transparency. Sattizahn testified that Meta’s own lawyers often blocked researchers from asking questions that might yield answers shareholders would not like, even as the company claimed to be pursuing safer products. Savage described a culture in which engagement metrics trumped child welfare, and where researchers faced obstacles when trying to study the effects of platforms on youth.

Beyond the policy fight, advocates argued that the experiences of families like Deb Schmill’s underscore the urgent need for guardrails that can limit the reach of harmful content and prevent predators from exploiting online networks. They urged lawmakers to enact and strengthen KOSA, arguing that the current legal framework gives tech companies too much latitude to prioritize revenue over safety. "Becca’s death is just one reason I wish we could get kids off social media," Deb Schmill said, underscoring the human cost that policy experts say is often obscured by corporate dashboards and quarterly earnings.

If Congress can combine robust safety standards with practical, enforceable rules, supporters say, platforms would still be capable of innovation while being more accountable for harms experienced by minors. The debate continues as lawmakers weigh the trade-offs between free expression, consumer choice, and the protection of vulnerable users who increasingly populate every corner of the online world.

Photo: Deb Schmill with Becca at a robotics competition.
