New AI App Sway.ly Aims to Let Families 'Retrain' Social Media Algorithms
Developers say the app filters dozens of harmful content types, alerts parents and offers guidance to young users after a survey found widespread physical and emotional harm linked to social feeds

A new app called Sway.ly launched Monday with the stated aim of letting parents and young people exert greater control over the content delivered by social media algorithms. The developer says the tool uses artificial intelligence to identify and filter 36 categories of harmful material, notify parents and suggest actions for children, such as unfollowing or blocking accounts and pursuing educational resources.
The launch follows a company survey of 1,400 children and teenagers aged nine to 19 that found more than three-quarters believe social media is damaging their physical or emotional health and that most see hundreds to thousands of posts a day. The survey reported that 72 percent of respondents encountered “uncomfortable” content that left them upset, sad or angry, and that many experienced symptoms including sore eyes, tiredness and sleep problems.
Mike Bennett, chief executive of Sway.ly and a co-founder of the company, said in a statement that the app was designed to “retrain the algorithms” rather than rely on blanket bans, which he and his team said often lead to family conflict and can be circumvented by tech-savvy children. “Children are being overexposed to a relentless feed of toxic narratives about violence, misogyny, beauty, self-harm and more,” Bennett said. “This is not content they go looking for — it’s what finds them.”
Sway.ly’s developers said the system continuously monitors youth digital culture, including changing trends and coded language, to remain current in identifying content that may harm young users. When the app detects material in one of its flagged categories, it reportedly filters the content on the child’s device, issues an alert to a parent or guardian and provides tailored guidance to the child about safer online choices.
Daniela Fernandez, the company’s chief strategy officer, said that many parents struggle to keep up with rapidly shifting social media language and trends. “We wanted to build an app to keep up and help decode what kids are really seeing, and to give parents the insight and tools they need to respond with confidence,” Fernandez said. The company said Sway.ly is priced at £2.60 per user per month.
The internal survey cited by Sway.ly found that more than two-thirds of respondents reported at least one physical or emotional symptom they attributed to social media use. More than a quarter reported sore eyes and tiredness, about one in five said they experienced sleep problems, and 17 percent described themselves as “addicted” to their mobile phone. Approximately 14 percent said they felt sad, anxious or depressed because of what they had seen online.
The survey also reported higher rates of harm among children with neurodivergent conditions such as attention deficit hyperactivity disorder or dyslexia. According to the company’s data, 35 percent of neurodivergent respondents reported being targets of cyberbullying, versus 20 percent of their peers.
Psychotherapist and online-harms consultant Dr. Catherine Knibbs, who contributed commentary to the app’s supporting materials, said removing harmful content entirely is difficult because it is "varied, relentless and adaptive." She called education and open family discussion the most powerful tools for helping young people process what they see online.
Sway.ly’s developers framed the app as complementary to regulation efforts such as the Online Safety Act, arguing that legislative approaches focused on outright bans may be insufficient to address the subtler, persistent exposure that their survey participants described. The company said parental controls that rely on restriction alone can be bypassed by some young users, who may use virtual private networks, secondary devices or alternate accounts.
The company said the app was developed using peer-reviewed research alongside input from psychotherapists and AI specialists. The materials distributed at launch, however, did not include independent verification of the survey’s methodology or its underlying data.
Sway.ly becomes one of several technology products that position algorithmic curation as a controllable variable in online safety strategies. The company said the app is available starting Monday, and that it will update its detection models as youth trends evolve. Independent researchers and child-welfare advocates have called for continued scrutiny of tools that mediate young people’s online experience and for transparency about how automated decisions are made and reviewed.