A future written by Generative AI looks miserable

I’ll start by thanking Hugh Andrew for his excellent LDV post from the 23rd April – ‘A thief in the night’ – which I completely agree with. I’m old enough to remember the Napster file-sharing era, when ordinary people started downloading music over the internet for free. This mightily offended big business in the form of the music industry, which, pretending to care about the artists it profited from, declared this was stealing and successfully lobbied Governments to change the law and make it easier to prosecute file-sharers.

Fast-forward 20 years, and now other big companies are downloading creative works over the internet for free – works often created by ordinary people who are aspiring or actual artists, writers or musicians. This is also stealing, but those big companies are once again lobbying Governments to change the law, weaken copyright in their favour and legitimise what they are already doing anyway. And Governments, forever in thrall to the lure of the ‘next big thing’, are listening to them.

Where does this leave creatives such as artists, musicians, writers and academics? An aspiring musician might now put their work on Spotify, which typically pays the princely sum of $0.004 per stream. A new author self-publishing on Amazon might earn a couple of quid per Kindle download of their book. A talented or lucky few may create a buzz, go viral or build a following that allows them to make a living doing what they love. However, the vast majority will earn peanuts – but at least their work is out there to take pride in and get credit for, and those who enjoy it will know the creator’s name.

Or so we thought. Now their creative work could be swallowed by a machine and regurgitated without credit by anyone who can type the right prompt into an AI model.

Right now, it is estimated that only one in five website visitors is actually human, and over half of all internet traffic is now AI bots and crawlers, relentlessly sniffing out and ingesting human creative works. They are collectively perpetrating theft on a scale that the largest organised crime syndicates can only dream of.

Why? Well, those loveable management consultants at McKinsey estimate that Generative AI “could add trillions of dollars in value to the global economy” by “automating work activities that absorb 60 to 70 percent of employees’ time today”. A cynic might suggest that what it will actually generate is hugely profitable consultancy contracts for McKinsey and their ilk to help large organisations lay off 60 to 70 percent of their staff.

Meanwhile the IEA estimates electricity demand from data centres worldwide is set to more than double by 2030 to around 945 terawatt-hours, mostly driven by AI. Most of that power will be turned into heat that needs to be dissipated, and in some parts of the world vast amounts of scarce fresh water will be consumed to do that. To get here, around US$750 billion of hype-driven investment has already been swallowed up, often by companies that are yet to turn a profit.

So is this where thousands of years of evolution and progress have led mankind? Vast sums of money invested in environmentally disastrous technology, consuming more electricity than the entire nation of Japan, with the explicit aim of destroying jobs while devaluing the creative essence that makes us human? Just to benefit a handful of large companies, which are unlikely to be British no matter how accommodating our Government is?

The Government recently ran a consultation seeking “views on how the government can ensure the UK’s legal framework for AI and copyright supports the UK creative industries and AI sector together”, but it’s difficult to see how the latter can gain without the former losing out. History has shown that the concept of ‘copyright’ is malleable, and protecting corporate profits is likely to take precedence over protecting the rights and livelihoods of individual creatives. The best the Government has offered so far is ‘transparency’, although it will be of little comfort to know your work has been stolen if you can neither stop it nor profit from it. The Government may yet take the enormous gamble that Generative AI will somehow create more jobs than it destroys.

For the Lib Dems, Tim Clement-Jones in the Lords and Cheltenham MP Max Wilkinson have spoken out in support of the need to defend copyright and protect the UK’s creative industries, but it feels as though changes in favour of Big Tech are already baked into the conversation by Labour. I trust the Lib Dems will continue to defend the human urge to create, and to take credit for and profit from that creativity, because we are already seeing glimpses of what a world full of AI-generated slop and clickbait is like, and it looks miserable.

* Nick Baird is a Lib Dem activist and Chair of the Liberal Democrats in Cheltenham.

5 Comments

  • I should add that while most forms of ‘computers doing stuff’ are being rebranded as AI, I do recognise the potentially significant benefits of using ‘AI’ to process and find patterns in large datasets in fields such as medical science and diagnosis, process optimisation, and fraud prevention.

  • Peter Chambers 1st May '25 - 2:39pm

    Good article, Nick. It takes forward the conversation around IP theft from legacy creative effort. Of course this always happens. When the US had a Cable Boom in the 90s, the regulations about residual payments (back catalogue) allowed a large subsidy to the new business at the expense of legacy content libraries. The nascent Information Superhighway was boosted by section 230 of the CDA, making websites Not-Publishers.
    At least one famous semiconductor firm started by acting as if IP law did not exist, until they had a library of their own chip designs – then they grew a legal department.
    The willow patterns on the then-new pottery industry around Stoke … But you get the idea. Outcomes vary. Textiles caused widespread Machine Riots. My nephew’s regiment was founded around that time – as a gentleman’s militia – to put them down.

    With so-called AI – why can’t we be precise and say ML, automation, expert systems, genetic algorithms and so on? – there are additional hazards. Slop is one of them. There are also generative collapse and pre-poisoning of content – check out Harmony Clock, Glaze and Nightshade. It should be possible to encourage generative collapse by looking for techniques that defeat slop detection in ingester functions, perhaps by researching the state of watermarking. There should be a PhD or two in there.

  • ‘Right now, it is estimated that only one in five website visitors is actually human, and over half of all internet traffic is now AI bots and crawlers’.
    That’s interesting. The internet started out as basically ‘shopping’ and ‘emails’. If actual people give up creating fresh content, as only bots are seeing it, maybe it will go back to that?
    Given the evidence of AI hallucinating, and its rubbish efforts at transcribing things like old newspapers and handwriting, I wouldn’t use it for anything important.

  • @Cassiec: “The internet started out as basically ‘shopping’ and ‘emails’.”
    Not quite – the Internet was developed for military and security purposes in the 1960s. Email was an early application of the technology. It quickly found a home in academia and facilitated the free exchange of research between universities across the world. I do remember some early attempts at shopping via bulletin boards in the early 80s, but they never took off.
    Unfortunately too many people believe the internet is synonymous with the world wide web, which was famously invented by Tim Berners-Lee in the early 90s. In its early days it was markedly non-commercial, but Amazon was launched as an online-only book store in 1995.
    And AI is already embedded in so many of our familiar technologies, from GPS tracking to doorbell CCTV. The current chat models that we all play with are only a tiny window into the world of AI.

  • Peter Hirst 5th May '25 - 3:29pm

    One possible solution is to enforce transparency about the use of AI. Knowing that what you are reading is AI-generated will help the reader to judge its merit. In contrast, knowing that it is human-generated will renew trust in our creative capacity.
