Editorial 

The Guardian view on AI and copyright law: big tech must pay

Editorial: Elton John, Paul McCartney and thousands of other artists have called for protection from data crawling. The government must listen
  
  

‘Sir Elton John has joined Sir Paul McCartney in calling for new rules to prevent tech companies from riding “roughshod over the traditional copyright laws that protect artists’ livelihoods”.’ Photograph: PA

Asked for a definition of intellectual property, ChatGPT answered: “IP refers to creations of the mind that are legally recognized and protected from unauthorized use by others.” The bot’s summary helpfully ended: “Intellectual property laws exist to encourage innovation and creativity by granting creators exclusive rights over their work for a certain period.”

Too bad that the tech giants behind such artificial intelligence (AI) tools have chosen to play fast and loose with the rules. As Silicon Valley bosses were expressing concerns about the “distillation” of their models following the launch of the Chinese DeepSeek chatbot earlier this week, the House of Lords was debating legislation to protect against unfettered data crawling by AI.

Sir Elton John has joined Sir Paul McCartney in calling for new rules to prevent tech companies from riding “roughshod over the traditional copyright laws that protect artists’ livelihoods”. A petition against the unlicensed use of creative works for training generative AI now has more than 40,000 signatories including Julianne Moore, Kazuo Ishiguro, Kate Bush and Sir Simon Rattle. It is a battle that has united artists of every kind.

They are outraged at UK government plans to weaken copyright laws, and are urging greater transparency, control and financial remuneration to counter the challenge of AI.

The UK has a “gold standard” copyright regime, the oldest in the world. The Statute of Anne, passed in 1710, gave authors the right to control their work. Now the government is seeking to turn this law on its head by proposing exemptions under which the onus would fall on rights holders to opt out of having their content taken free of charge, and to trace how it is being used.

Kate Mosse, the bestselling author and co-founder of the Women’s prize for fiction, likens the AI model to a thief stealing all the Mars bars from a shop on the grounds that no one told them they weren’t allowed to do so. The peer, film‑maker and digital rights campaigner Beeban Kidron has said we should not “redefine the notion of theft”. Her amendment to the government’s data bill, stating that data is licensed by default, was accepted by the House of Lords on Tuesday.

In the UK, adapting a novel for TV or film needs permission and payment unless the work is in the public domain. Covering a song? No permission needed, but royalties apply – unless the lyrics or melody are changed, which requires consent. Even reproducing lyrics requires both permission and a fee, unless fair dealing applies. ChatGPT “knows” this: it cannot reproduce song lyrics “due to copyright restrictions”. As Philip Pullman notes: “The principle is simple ... if we want to enjoy the work that someone does, we should pay for it.”

Labour must not be dazzled by the shininess of the tech industry and the promise of endless growth. To favour these businesses at the expense of the creative sector, the second most lucrative sector in the UK, generating more than £126bn annually, would impoverish more than our economy.

Both industries are at an impasse: tech insisting that enforcing copyright laws on AI is unfeasible and regressive, creatives that opting out is unworkable and unjust. It is up to government to foster a fair agreement in which both flourish. This is a pivotal moment for our cultural future. The artistic community has spoken. Now ministers must listen, take their concerns seriously and respond. We must protect our creators at all costs.


 
