If you had asked someone a few years ago which party would form a British government intent on exempting AI firms from copyright law, and on joining the Trump administration in refusing to sign an international declaration calling for AI to be, among other things, ethical, they would almost certainly have assumed it was a Conservative government, not a Labour one.
The reality is that this is exactly what Keir Starmer’s Labour government is doing.
Both of these moves should concern us all, but for Liberal Democrats they should ring particular alarm bells. The government seems intent on handing the majority of the value of the UK’s vital creative industries, estimated to be worth over £120 billion, to unaccountable US tech firms headed by the wealthiest men on the planet, with precious few safeguards for authors, artists and creators.
For those of us hoping that the 2024 General Election would see a wholesale change in how the country is governed, this is a huge disappointment. The Labour party of mere months ago would have balked at these plans, and yet in government it has ceded crucial ground to the far-right tech oligarchs with enthusiasm. It is hard to imagine the Conservatives or Reform taking a sufficiently different stance on these issues, so we must do so ourselves.
The notion that creators should be paid fairly for their work, and should retain control over it for a limited time after publication, has been the cornerstone of the UK’s copyright laws for centuries. Those who dismiss it as gold plating miss the point of gold plating, which any chemist will tell you is to stop the things underneath it from corroding. If Keir Starmer gets his way and our copyright laws are holed below the waterline, we will see not only the collapse of a critical industry, but a flight from the UK of creative talent and works – we will also find that creatives, long assumed to be among the last victims of AI, will in fact be the first, as authors, artists and others conclude it is no longer financially sustainable for them to devote significant time to their craft.
A far better approach to AI use of copyrighted works is a licence model, under which AI firms must compensate creators fairly when their works are used in models. This would not only allow creators to opt into the use of their work for model training fairly, rather than having to opt out of it bureaucratically, but also ensure that the models themselves are lawfully built. Those bursting to cry that the poor AI companies would have to pay through the nose for this would do well to remember that tech companies are among the most valuable on the planet, and tech oligarchs are the wealthiest people in history – and almost all are American.
Liberal Democrat Parliamentarians everywhere should be robustly opposing these plans – which can only serve to beggar the UK and its world-beating creative industries, for the benefit of a hyper-wealthy few ushering the world down a very dark path indeed.
* John Grout is a Lib Dem activist and lives in Reading.
Peter Kyle (Technology Secretary) was quoted in the Guardian yesterday saying “I’m working really carefully with the tech sector so that we can produce the reassurance they need, that there will be technical solutions to things like transparency and making sure that those remarkable people, who create remarkable pieces of art, are respected for it.”
So he’s offering “transparency” but not *control*, and people who create art will be “respected” but not *paid*.
Musicians who sample others’ work without permission regularly get sued. But now that deep-pocketed Tech Bros are on the scene, Kyle says the current situation is “not tenable”.
John Grout, thank you for your vital article.
For 20 years I have been researching and writing ten non-fiction books, which are still selling. Book 11, ‘From Setting Sun to Rising Sun’, will not be published as it includes a chapter on travelling across northern Israel. Copyright has protected me.
We see today Starmer cosying up to Trump and Musk by joining with the USA in not supporting the agreement on AI at the Paris Conference.
LibDems are in a unique position to fight for the rights of the creative industries. Other parties will not.
I’m sympathetic to much of this article, but why does the fact that tech oligarchs are mainly American have any relevance to whether their companies should have to pay for using creative artists’ work in their AI models? Would the argument be any different if they were French, or Dutch, or Swedish, or British?
Also worth pointing out that it’s likely to be very hard to detect what data goes into AI models, especially as AI improves and it becomes less and less likely to simply regurgitate its input data without modification. You’re largely going to be relying on companies themselves to be honest about what data they are using and to pay up. Companies like Google probably will do so, but what about smaller companies dabbling in AI, which I suspect will pop up in their thousands as AI tech becomes cheaper?
> why does the fact that tech oligarchs are mainly American have any relevance
Starmer needs the US (and thus the Trump supporting tech oligarchs) to invest in building the UK located AI datacentres he naively believes will actually do anything more than enable the oligarchs to collect rent money. Hence why he is jumping to the MAGA tune.
We can expect that a weakening of UK copyright laws will also lead to a weakening of GDPR and the protection of UK medical data….
So whilst you are right, Simon, that it should not matter, it does. The laugh about copyright weakening is that we can expect Hollywood and the music industry to want exemptions that enable them to keep tight control over their own copyrights…
The release of ‘DeepSeek’ as open source by the Chinese has altered the balance in the AI world. Using standard chips instead of the exclusive (and expensive) Nvidia ones, their development costs were far lower than the Americans’ (whether subsidised or not). The Americans always think they can achieve superiority by throwing money at things, often without thinking things through, particularly the systems implications of getting things to work reliably together. Maybe Keir Starmer should hedge his bets, and not be taken for a ride by the billionaire oligarchs in the White House.
@David – The laugh is you could see this coming. Decades back the US limited the power of computers sold to the USSR; after the Wall came down we discovered that in many areas, e.g. cryptology, the Russians had developed better algorithms and approaches than the US, who had been able to simply throw ever larger amounts of computing power at the problem…
Given the speed at which people and businesses are discovering the limitations of “AI”, I expect the wheels to have fallen off this articulated hype wagon before any of Starmer’s AI datacentres have gone live. Which is why we need to be careful about making exceptions for “AI”, as we can expect such exceptions to be enlarged and used/misused to support whatever other fads business can convince government warrant a reduced “regulatory burden”…
Aside: If your email service is running a spam checker (old school name for it), in modern language it is “AI enabled”.
During the week I had to go to A&E, where I found that the triage system had been digitalised since I’d last been there. It may not strictly speaking be an AI system, but you can see how AI could be used to further enhance the service. So AI is definitely not empty hype. On the other hand, I’ve found that copyright law in this country prevents me from digitising books unless the author has been dead for at least 70 years. I don’t know the details of the present proposals, but the idea that our intellectual property legislation is sacrosanct cannot be supported.
@Mark Frankel – I first studied AI at Imperial College over 30 years ago. Forerunners of the current models were already being embedded in systems then, though at that stage they were more limited in scope. We have been living with AI for ages. The current AI systems that we are all now familiar with have taken a couple of small steps further forward in a) being publicly accessible and b) drawing on larger databases than ever before. We should not fear it or treat it as a new phenomenon, but we should make sure that it adheres to copyright laws.
As a writer, with several musicians in the family, I believe it is essential that creatives are paid for the work they do.
@Roland – *this*. You have it right.
As to AI being hype, the term originated at the 1956 Dartmouth Summer Research Project on Artificial Intelligence, organised by John McCarthy, which aimed “to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it”. This new field would seek to become investible by US Big Business and Big Science, using taxpayer money to mitigate risk in the usual way. Eric Schmidt stated at a talk at Stanford – which he thought was unfilmed – that he believed a further $300bn would be needed to reach useful “AGI”. The US state will fund Project Stargate so that the taxpayer can chip in.
What is happening is precisely a Gartner Hype Cycle. This is needed to drive investment intensity.
The UK government is trying for a junior seat at the table. Quite how HMG would get equity or good jobs out of this is not explained. An act of faith.
Excellent article from a fellow Berkshire LD!
AI observations 1: Google is possibly killing its own golden goose with AI ‘overviews’ that answer questions without people having to look at any of the results below.
People paying for domain names etc and creating content may as well pack up, if ‘overviews’ mean they aren’t getting any traffic.
2. If real people give up, the internet will ultimately just be bots scraping old content that other bots previously scraped…
3. AI ‘hallucinates’. And makes mistakes. Examples being Apple’s summaries of BBC and other news reports being factually plain wrong.
And there are plenty of horror stories (US so far) of AI ‘making stuff up’ when summarising patients’ medical notes.
@Mark. Copyright can be left in a person’s will. (Or assigned by them while living). So the rights to a best-selling book (and financial benefits, such as royalties, licensing merchandise etc) can be enjoyed by their chosen heir.
That is why copyright lasts for 70 years after the author’s death.
@Mark,
The copyright situation in the USA is rather simpler, being 95 years from the date of publication. Interestingly, Google digitised many books that are still in copyright but operates under the ‘fair use’ rule that allows them to publish snippets of text from a copyrighted book (or full pages with the copyright owner’s permission). You can also find book page images that are still in copyright on the Internet Archive, but you have to register with them to formally “borrow” these. We have a similar system here with the Libby app where you can borrow ebooks, audiobooks and e-magazines from libraries that use OverDrive. Some libraries use BorrowBox or RB Digital instead. So copyright is certainly moving with the times.
I agree in principle with everything you said, but I don’t have the first idea of how something like this would be rigorously or effectively enforced with the massively increasing sophistication of AI models. It’s like asking me to acknowledge and pay for copyright every time I am inspired even subconsciously by an idea I read in a book or saw on TV.
If anyone believed AI was a reality, they’d be asking the AI to draft the laws on AI! The fact that no one is doing that just shows it is hype, which largely involves copying/stealing original work (by humans) and rehashing it to make it “new”.