Discord under scrutiny as platform grows and faces safety challenges
As the chat app expands to hundreds of millions of users, critics warn that anonymity and rapid messaging can enable sextortion, child exploitation, and extremism, highlighting a widening safety gap.

Discord, a popular chat platform with hundreds of millions of users, is attracting renewed scrutiny over how anonymity and server-based group chats can enable illicit content and online coercion. In one high-profile case, a suspect later identified as Tyler Robinson posted gloating messages on Discord about the murder of political activist Charlie Kirk hours before his arrest, illustrating how quickly such acts can be celebrated or amplified on the service.
Founded in 2015 by Jason Citron, Discord began as a way for gamers to discuss tactics without interrupting play. The platform now hosts about 200 million monthly users, including roughly 5 million in the United Kingdom, and has been valued at about $15 billion. Chats, known as servers, can range from a handful of friends to communities with thousands or millions of members. While many servers cover innocuous topics—video games, bands, or study spaces—Discord has also become a magnet for groups that exploit anonymity to share illicit content and coordinate wrongdoing. The platform has long faced criticism over its role as a hub for extremist activity and criminal networks.
The platform’s dual nature is not new. The Charlottesville white-supremacy riots in 2017 were organized in part on Discord, and by 2021 the company said it had removed about 2,000 groups affiliated with political extremism. The UK-based Institute for Strategic Dialogue, publishing in 2023, found that despite those crackdowns Discord remained a hub for extremism, with evidence of hard-right Catholic extremism, Islamic extremism, and expressions of support for banned neo-Nazi groups. While many users join for legitimate reasons, researchers and law enforcement note that the platform’s architecture—open, expansive servers, user anonymity, and rapid message delivery—can facilitate harmful behavior, including sexual exploitation and blackmail.
Discord’s growth has outpaced the company’s ability to police content. In 2024, the platform published a transparency report detailing safety concerns in the first half of the year: more than 200,000 accounts were warned over child-safety issues; 27,000 were flagged for deceptive practices; 33,017 for sending exploitative or unsolicited content; 56,042 for harassment and bullying; and 1,842 for violent extremism. During the same period, Discord received nearly 30 million user reports of wrongful behavior, and more than 3 million users were connected to servers accused of enabling or hosting such content. The figures underscore a widening safety gap as the platform hosts a mosaic of communities—from legitimate study spaces to highly problematic groups.
The platform has also featured in individual criminal cases, some of them widely reported. Earlier this year, Richard Ehiemere, arrested in the London area, faced charges for downloading indecent images of children and distributing stolen email addresses and passwords, crimes carried out on Discord. Under the alias “Retaliate#1337,” he reportedly logged into a server operated by a neo-Nazi group on hundreds of occasions. The case is one of several in which UK authorities have linked Discord activity to sexual exploitation and intimidation. Other British prosecutions in recent years include the 2024 sentencing of Syed Ali, who entered a private Discord chat with a 13-year-old girl and sought explicit images and location details, and the 2023 sentencing of Calum Lacey for using Discord’s video chat to stalk and pressure underage girls for explicit material. In 2022, PC Will Scott-Barrett of the Metropolitan Police admitted to sending sexual messages to a 15-year-old over Discord.
Not all of the platform’s headlines have involved crime on the service itself. In 2023, Discord drew attention when leaked documents related to the Ukraine war circulated on a server nicknamed Thug Shaker Central before proliferating to other platforms. In a more unexpected use of its reach, Discord also played a role in a political development in Nepal this year: protestors turned to the service after authorities banned many other apps, and the country’s parliament subsequently faced upheaval linked to the prime minister’s resignation. The Daily Mail contacted Discord for comment but did not receive a response.
Industry observers say the speed and breadth of Discord’s growth require more robust, scalable moderation and clearer policies to protect younger users while preserving legitimate communities. The company has defended its moderation work, pointing to its transparency reports and policy updates, but critics argue that the scale of abuse—and the diverse range of communities that thrive on the platform—makes comprehensive policing exceptionally difficult.
As technology platforms like Discord connect millions around the world, the tension between openness and safety grows more acute. Experts say the challenge is not merely one of policing content after it appears, but of designing systems—potentially with more automated detection, context-aware moderation, and stronger age-verification standards—that can anticipate and interrupt harmful activity without stifling legitimate expression. With the platform continuing to host everything from group study sessions to networks that propagate violent extremism or sexual exploitation, the onus is increasingly on the company, policymakers, and users alike to push for safer digital spaces while preserving the benefits of a community-driven communication tool.