Mother sues Roblox and Discord over 15-year-old son’s suicide, alleging online sexual coercion
Lawsuit says gaps in safety features allowed an adult to groom and coerce an autistic teenager into sending explicit images before his death

A San Diego mother has sued Roblox Corporation and Discord Inc., alleging the platforms’ safety failures allowed an adult predator to groom and coerce her 15-year-old son into sending explicit images and contributed to his suicide.
The wrongful-death complaint, filed last week in San Francisco County Superior Court, says the boy, identified in court filings as Ethan Dallas, met a user who called himself “Nate” while playing Roblox, a world-building online game popular with children. The suit alleges the user taught the boy to disable parental controls and moved their conversations to Discord, where the adult allegedly demanded sexually explicit photos and threatened to share their messages if the teenager refused. The complaint seeks unspecified damages for emotional distress and other harms.
According to the lawsuit and media reporting, Ethan, described by his mother as autistic and a high-school baseball pitcher, began playing Roblox around 2015 under parental controls that limited play time and required approval for friend requests. The suit says those controls did not stop private messaging from adults. The complaint recounts that late in 2023 Ethan confided in his mother about the online exchanges; his parents had earlier placed him in a residential treatment center for a year after bouts of intense anger. He was found dead at home about four months after he disclosed the online contact, the suit says.
Plaintiff Becca Dallas told reporters she believed Roblox was a children’s game and assumed the company monitored conversations because her son had been temporarily banned for profanity. She said that about a year after the suicide, Florida law enforcement identified the alleged predator as a 37-year-old man, Timothy O’Connor, who had been arrested on unrelated charges of possessing child pornography and transmitting harmful material to minors. Public records cited in reporting say O’Connor was deemed mentally unfit to stand trial in December 2023.
Roblox has long been a focus of child-safety scrutiny. The company reports that about 40 million of its users are under age 13, more than a third of its player base. While the platform is oriented toward younger users, it also permits older players and provides private chat and voice features that critics say can facilitate contact between adults and children. Roblox in July introduced a face-scanning age-verification feature aimed at reducing adult access to child accounts; safety experts have said such measures can be circumvented if predators use another person’s account.
A Roblox spokesperson said the company was “deeply saddened by this tragic loss” and added in a statement that, while it could not comment on litigation, the company strives to hold itself to “the highest safety standards” and has rolled out more than 100 safety features this year. A Discord spokesperson said the company uses “a combination of advanced technology and trained safety teams to proactively find and remove content that violates our policies,” noting that Discord requires users to be at least 13.
The lawsuit against Roblox and Discord is part of a wider wave of legal action and government scrutiny over platform safety. More than 20 lawsuits alleging that Roblox enabled sexual exploitation have been filed in federal courts this year, and a coalition of about a dozen law firms is working on child-safety litigation that aims to establish legal precedent holding tech platforms accountable for enabling contact between predators and minors, according to attorneys involved in those cases. State attorneys general have also opened investigations: Florida’s attorney general launched a child-safety inquiry in April, and Louisiana’s attorney general sued Roblox last month, calling it “the perfect place for pedophiles.”
Becca Dallas said she has formed a foundation in her son’s name to support children with mental-health needs and said she filed the lawsuit to raise awareness and warn other parents. In media interviews she described her son as a class clown and an athlete who loved the game that, she said, he had played since childhood.
The complaint alleges that Roblox and Discord failed to provide reasonable safeguards and adequate monitoring tools that could have prevented the grooming and coercion. Those claims will be weighed against the legal protections technology companies often cite for user-generated content, notably Section 230 of the Communications Decency Act, which generally shields platforms from liability for material posted by their users.
The family’s suit, and the growing number of related lawsuits and investigations, underscores a persistent tension: online gaming and communication platforms are built to be accessible to children, yet preventing exploitation remains difficult on services that mix young and adult users. The companies involved say they continue to develop technological and human-review measures to detect and remove predatory behavior, while plaintiffs and state officials push for stronger accountability and oversight.
