AI & IP


To read about the impact of AI on journalism, download our briefing document

Independent journalism is an essential pillar of our democratic society – it debunks fake news, holds power to account, and informs and engages the public. News media publishers invest considerable time, money, and resources in creating high-quality content that adheres to journalistic standards and ethics.

However, with the rapid advancement of AI, there are growing concerns over copyright issues. Generative AI can easily replicate, change, and distribute journalistic content, leading to potential copyright infringements, which can be detrimental to publishers who invest heavily in creating original content. There are also fears that AI technology could accelerate the spread of misinformation and disinformation – a growing problem in today’s society.

Current situation

"Everything our journalists have written in the history of digital publication has been used to train these tools without permission. That is quite a scary phenomenon and none of us have given permission for this to happen, and we’re all trying to catch up with the greatest heist of intellectual property the world has ever seen."

Matt Rogerson, director of public policy, Guardian Media Group

The current IP regime generally strikes the right balance between the rights of content creators and the rights of consumers or users. Any significant new exceptions could be highly damaging to the news media industry.

The rapid advancement of AI in recent years has caused serious concern over the role it will play in spreading dis- and misinformation online and beyond. Journalism will play as vital a role as ever in combatting the dangers AI poses to our society. Journalists, unlike AI, work under an editorial code and are committed to upholding the truth and public interest.

New research from the News Media Association and Newsworks – published as global leaders gathered at Bletchley Park for the inaugural AI Safety Summit – showed that the spread of misinformation and fake news is the public’s main concern about AI technology. Additionally, a YouGov poll commissioned by the NMA to canvass MP opinion showed that three-quarters of MPs agree that trusted journalism created by news publishers is critical to minimising the risk of misinformation ahead of a potential general election. A separate NMA survey found that 97 per cent of news brand editors agreed that the risk to the public from AI-generated misinformation ahead of a general election is greater than ever before.

AI is often marketed as “generative”, but this is a misnomer. In reality, it is a tool that extracts large amounts of information, typically from news media and without a commercial licence, and then regurgitates that information in response to a query.

Rightsholders of intellectual property should be the ones deciding who may use their content, and how.

The NMA is a signatory to the Global Principles for AI, which set out how developers should work with rightsholders in news media.

What action do we want to see?

The NMA works to safeguard the intellectual property rights of its member news media publishers. This involves working with stakeholders such as the Alliance for Intellectual Property and Publishers’ Content Forum to protect publisher copyright and intellectual property and to ensure the industry’s voice is clearly heard.

As government and regulators adapt to the growth of AI, it is vital that copyright laws are not weakened for news publishers. The rights of content creators must be recognised as the development of generative AI models continues, and there must be full transparency about the limitations of AI. Users should be informed that AI-powered tools are not capable of producing journalism and that the information they provide is a best guess based on crawled data.
