AI and IP
Independent journalism is an essential pillar of our democratic society – it debunks fake news, holds power to account, and informs and engages the public. News media publishers invest considerable time, money, and resources in creating high-quality content that adheres to journalistic standards and ethics.
However, the rapid advancement of AI has raised growing concerns over copyright. Generative AI can easily replicate, alter, and distribute journalistic content, leading to potential copyright infringement that is detrimental to publishers who invest heavily in creating original content. There are also fears that AI technology could accelerate the spread of misinformation and disinformation – a growing problem in today’s society.
Current situation
“Both artificial intelligence and the creative industries – which includes news and media – are central to this government’s driving mission on economic growth. To strike a balance in our industrial policy, we are working closely with both sectors. We recognise the basic principle that publishers should have control over and seek payment for their work, including when thinking about the role of AI.”
Prime Minister Sir Keir Starmer in his op-ed for Journalism Matters week
The current IP regime generally strikes the right balance in terms of protecting the rights of content creators and the rights of consumers or users. Any significant new exceptions could be highly damaging to the news media industry.
AI’s rapid advance in recent years has caused serious concern over the role it will play in spreading dis- and misinformation online and beyond. Journalism will play as vital a role as ever in combatting the dangers AI poses to our society. Journalists, unlike AI, work under an editorial code and are committed to upholding the truth and the public interest.
New research from the News Media Association and Newsworks – published as global leaders gathered at Bletchley Park for the inaugural AI Safety Summit – showed that the spread of misinformation and fake news is the public’s main concern with AI technology. Additionally, a YouGov poll commissioned by the NMA and canvassing MP opinion showed that three-quarters of MPs agree that trusted journalism created by news publishers is critical in minimising the risk of misinformation ahead of a potential general election. A similar survey from the NMA found that 97 per cent of news brand editors agreed that the risk to the public from AI-generated misinformation ahead of a general election is greater than ever before.
AI is often marketed as “generative”, but this is a misnomer. In reality, it is a tool that extracts large amounts of information, typically from news media, without a commercial licence, and then regurgitates that information in response to a query. Users should also be informed that AI-powered tools are not capable of producing journalism, and that the information they provide is a best guess based on crawled data.
Rightsholders of intellectual property should be the ones deciding who may use their content, and how.
The NMA is a signatory to the Global Principles for AI, which set out how developers should work with rightsholders in news media.
What action do we want to see?
The NMA works to safeguard the intellectual property rights of its member news media publishers. This involves working with stakeholders such as the Creative Rights in AI Coalition, the Alliance for Intellectual Property, and the Publishers’ Content Forum to protect publisher copyright and intellectual property and to ensure the industry’s voice is clearly heard.
In December 2024, the government launched its consultation on AI and copyright, proposing an ‘opt-out’ approach as its preferred way forward. Yet the government’s consultation and its favoured policy on AI and copyright fail to address the core issue. The UK’s gold-standard copyright law is already clear; what is missing are strong enforcement mechanisms and transparency requirements that would enable creative businesses to enforce their rights effectively.
Currently, there is no ambiguity in the law, but these proposals risk creating confusion and allowing generative AI companies to evade their obligations. News publishers must retain control over how and when their content is used, and must receive fair compensation when their work is exploited by generative AI.
Rather than introducing impractical systems like the ‘rights reservations’ (or ‘opt-out’) regime, the government should prioritise embedding transparency requirements within the existing copyright framework.
This approach is the only way to ensure that both creative businesses and the generative AI companies that rely on their high-quality data can flourish and continue to innovate together.