Discord under scrutiny as safety gaps persist
Technology & AI — As the popular chat platform grows into a universe of servers, safety concerns mount over illicit content, sextortion and extremism taking root in plain sight.

Discord, the popular chat service built around 'servers' that host communities ranging from gaming to study groups, is under growing scrutiny after authorities reported a surge in illicit activity on the platform. Officials say its design, with anonymous usernames, large group chats, and a permissive culture, has made it attractive to criminals and predators. The platform reports roughly 200 million monthly users, about 5 million of them in the United Kingdom, and is valued at around $15 billion.
Discord was launched in 2015 by Jason Citron, who had built his first game by age 13 and later started a software company that was sold to a Japanese conglomerate. The idea grew out of gamers' need to talk to one another during play. Today the service hosts about 19 million active communities, known as servers, catering to a wide range of interests. The basic design, servers divided into channels, functions like a vast, asynchronous chat room. The anonymity afforded by avatars has also drawn users who exploit the platform to share illicit material, perpetrate sextortion, or coordinate extremist activity. While widely used to connect gamers, the service has faced sustained criticism for enabling harmful conduct and for periods when moderation lagged behind activity on the platform.
Discord's role in extremist movements has drawn particular scrutiny. The platform was linked to the 2017 Charlottesville white-supremacist rally, and by 2021 the company said it had removed about 2,000 groups affiliated with political extremism. The Institute for Strategic Dialogue, a British think tank, reported in 2023 that Discord continued to function as a hub for extreme-right socializing and community-building, with evidence of hard-right Catholic extremism, Islamic extremism, and even expressions of support for the banned neo-Nazi group Atomwaffen Division.
In the United Kingdom and elsewhere, individuals have used Discord to facilitate illegal activity. Police and prosecutors have cited episodes in which users downloaded or distributed indecent images of children, or used aliases to coordinate wrongdoing. In one 2025 update, investigators described a user who had logged into a Discord server operated by a violent neo-Nazi group hundreds of times, complicating law enforcement efforts. Across Western Europe and North America, cases have illustrated a pattern of grooming, coercion, and blackmail conducted on the platform.
Beyond individual crimes, Discord has been associated with broader safety concerns. In 2023, the platform drew attention when secret US military documents were leaked via a server nicknamed 'Thug Shaker Central' and then circulated on other social networks. Earlier this year, Discord became a focal point in discussions about how social platforms influence political mobilization when Nepal briefly blocked several apps and protesters turned to the service to organize, unrest that culminated in a violent crackdown and dozens of deaths. The events underscored the platform's reach beyond entertainment and into real-world consequences.
Discord’s own transparency reports offer a quantitative snapshot of the challenge. For the first half of 2024, the company said it issued safety warnings to more than 200,000 individual accounts: about 27,000 for deceptive practices, 33,017 for sending exploitative and unsolicited content, 56,042 for harassment and bullying, and 1,842 for violent extremism. During the same six-month period, Discord received just under 30 million user reports of wrongful behavior, and more than 3 million users were connected to servers accused of enabling or hosting such content. In 2021 the company acknowledged an extreme case of abuse in which a user targeted a 15-year-old, and that year also brought a rising tide of reports about sexual content on the platform. While the numbers illustrate the scale of the problem, critics say they only scratch the surface of Discord’s safety gaps.
Advocates and security researchers argue that the platform’s architecture—open servers, anonymous avatars, and a culture that prioritizes privacy—can inadvertently shield wrongdoing. They call for stronger default protections, improved age verification, faster enforcement, and more aggressive use of automated detection powered by artificial intelligence, as well as closer cooperation with law enforcement. Discord has said it will continue refining its enforcement tools and policies, even as critics contend that progress remains uneven and timelines for meaningful change are uncertain. The Daily Mail, which first highlighted the allegations about illicit content on the platform, contacted Discord for comment but did not receive a reply.