The Express Gazette
Wednesday, December 31, 2025

London borough expands facial recognition, AI surveillance and drones to curb fly-tipping

Hammersmith & Fulham to retrofit 500 cameras for retrospective monitoring and deploy live facial recognition at priority sites; critics warn of privacy risk and potential overreach

The west London borough of Hammersmith &amp; Fulham has announced a sweeping expansion of its surveillance program, saying it will deploy live facial recognition cameras, AI-powered video analysis and a fleet of drones to curb fly-tipping and related anti-social behaviour. The roughly £3.2 million plan would see 500 existing cameras retrofitted for retrospective facial recognition, with fixed live facial recognition cameras installed at ten priority locations across the borough, from Shepherd’s Bush to Westfield.

Officials said the investment would help identify offenders who would otherwise evade detection, with access to the facial recognition data restricted to the police. The data, they argued, would be held by the police and used to secure arrests that would not otherwise happen. The system would also aim to identify objects such as knives and guns, and drone patrols would target fly-tipping, with cameras equipped with speakers to issue warnings. The council said the technology would be used to pursue not only serious crime but also anti-social behaviour in public spaces.

Council leader Stephen Cowan described the project as a way to give families peace of mind and to ensure criminals know there is nowhere to hide. The borough already operates more than 2,500 CCTV cameras, and officials say the existing network has contributed to hundreds of arrests this year. Supporters say the expanded system could close gaps in enforcement and deter wrongdoing in high-footfall areas around transport hubs and shopping streets.

Yet critics warn the move could erode civil liberties and lead to mission creep. Civil-rights group Big Brother Watch argues that while facial recognition may have legitimate security uses, placing such capabilities in the hands of a local council raises serious questions about oversight, misuse, and racial bias. The group notes that facial recognition systems have shown higher error rates for people with darker skin and calls for parliamentary scrutiny of how the technology is deployed and controlled at municipal level.

Residents and workers offered mixed views. A market stall owner near Shepherd’s Bush Market said cameras could enhance safety but feared a drift toward a China-style surveillance state if the program expands. A local resident who has concerns about privacy but supports targeted policing said the debate should focus on proportionality and clear safeguards. Advocates for the plan cite the network’s record of arrests as evidence of its effectiveness and argue that the risk of crime should outweigh concerns about incidental surveillance.

In the course of reporting on the plan, a journalist covering the project encountered a tense moment on a busy corridor near a railway station. Plainclothes police officers and a security guard approached to verify the journalist’s purpose and credentials, illustrating how closely the new regime of constant monitoring could touch ordinary people in everyday settings. The episode underscored the perceived line between public safety measures and intrusive oversight, a line that many residents fear local authorities may cross as the program expands.

The council has a track record of aggressive enforcement. It has been criticized for swift penalties in other contexts, including a £1,000 fine issued for putting rubbish out too early ahead of a holiday, and for heavy-handed traffic-enforcement practices tied to low-traffic neighbourhood schemes. Proponents argue the fines reflect firm but lawful enforcement designed to keep streets safer and cleaner, while opponents point to disproportionate impacts on ordinary residents and small businesses during busy times.

Supporters also highlight the borough’s higher-than-average camera density and its plans to extend the network further. They say the combination of live facial recognition, retrospective matching against stored footage, and public-address capabilities could deter crime, help victims seek justice and ensure criminals have nowhere to hide. Detractors worry about the balance between security and personal privacy, the potential for racial bias in automated systems, and the risk that local authorities could extend surveillance into petty offences or non-criminal behaviour.

Analysts and advocacy groups say the expansion warrants thorough parliamentary scrutiny and clear, transparent safeguards. Big Brother Watch has urged lawmakers to initiate a proper debate on facial recognition technology, noting that the topic has received limited formal attention in Parliament and should be subject to robust oversight if municipal agencies are to wield such tools. With the ongoing debate about how to balance safety, accountability and civil liberties, the Hammersmith & Fulham plan is likely to fuel discussions about the proper scope of surveillance in cities and the role of local government in deploying advanced monitoring technologies.

As the borough proceeds, residents will weigh the potential benefits of rapid response and deterrence against concerns about privacy, bias, and overreach. The broader question facing cities across the world remains how to harness powerful new technologies to improve public safety without compromising the rights of individuals or normalizing pervasive state and municipal surveillance.
