The Express Gazette
Saturday, December 27, 2025

UK Labour aide says AI giants will never have to pay for Britain's creative talent, prompting backlash

An adviser to the Technology Secretary suggested that publishers and artists would not be legally entitled to compensation for work used to train AI, a stance that has drawn criticism from creators and lawmakers.

Technology & AI

Artificial intelligence companies will never have to pay publishers and artists for using their work to train their systems, an aide to the UK Technology Secretary asserted on social media. The remark, attributed to Kirsty Innes, a former Tony Blair Institute staffer who now advises Liz Kendall, has quickly drawn the ire of musicians, writers and other creatives who say it would undermine rights and fair pay in a rapidly expanding sector. The claim comes amid ongoing policy debates over how AI training should interact with copyright rules and the incentives creators rely on to fund new work.

Ms Innes’s posts, reported by The Guardian, argued that regardless of personal beliefs about compensation, large AI firms may never be legally required to pay for the content used to train their models. She suggested that even with copyright laws in place, firms could operate under exemptions or loopholes that would sidestep royalties. The posts were written seven months before her appointment as a ministerial adviser, and she has since deleted them. Innes, who previously worked at the Tony Blair Institute for Global Change, has not commented further on the matter.

The Daily Mail has been campaigning to cast a spotlight on Britain’s creative industries amid debates over how to regulate AI and the copyright rules governing training data. The Guardian’s reporting on the deleted posts raised questions about the stance of the government’s technology team and its approach to balancing innovation with creators’ rights. The Tony Blair Institute, where Innes previously worked, has in the past received significant donations from Oracle founder Larry Ellison, a factor widely cited in broader conversations about AI infrastructure and corporate influence on policy. Oracle is a backer of a large, cross-border AI infrastructure initiative led by OpenAI and SoftBank, among others.

Kirsty Innes and Liz Kendall have not publicly commented on the reports, and the Technology Secretary’s office has announced no formal policy shift. Still, the episode brings into focus a central tension in Britain’s AI policy: whether training data should be treated as a resource that creators are paid for, or as material that can be used under broad exemptions as AI systems improve. Critics say an opt-out regime or de facto free use would undermine the incentive structure for authors, musicians, and other creators who rely on licensing revenues to fund new work.

Advocates for robust copyright protections argue that requiring compensation for the use of creative works in AI training is essential to sustain Britain’s creative economy, valued at roughly £126 billion across its industries. They warn that moving away from compensation could reduce the returns creators receive for their labour and discourage investment in new artistic projects. Proponents of a flexible framework, by contrast, contend that overly strict rules could hinder innovation and slow the development of AI technologies that rely on large-scale data for training. Policymakers have signalled a willingness to explore scalable solutions that respect creators’ rights while still supporting advances in AI capabilities, but concrete measures remain under discussion.

The broader global conversation about AI training data and copyright has intensified as tech giants seek to balance access to vast reservoirs of content with the legal and ethical obligations to identify and compensate rights holders. In Britain, the debate has grown more charged as lawmakers and industry groups press for timely regulations that clarify when and how content can be used to train models, and how compensation should be structured if at all. The current reporting underscores the partisan and strategic sensitivities at play as ministers confront a rapidly evolving technology landscape and a public that is increasingly vigilant about the value of its own creative output.

In the meantime, rights organizations, creators, and unions have urged policymakers to adopt clear, enforceable rules that deter unauthorized use and ensure fair pay. They point to international examples and ongoing negotiations in other jurisdictions as benchmarks for possible approaches in the United Kingdom. The situation remains fluid, with no official statement from Innes or Kendall and no announced changes to UK copyright policy related to AI training as of this writing.
