Labour has an incredibly relaxed attitude to British property. As the Chagos islanders found out, it wants to give away as much as it can, as quickly as possible. Next, it’s the turn of Britain’s creative industries and publishers.
The Government is mulling a radical change in copyright law to please large technology companies such as Microsoft and Google. Their generative AI systems are worth very little without training data, so they need high-quality material to feed them.
For the machines to create their knock-offs, they must swallow up art, stories, movies and news articles. But almost anything worth ingesting is private property – it belongs to other people.
Under UK law and international conventions, the owners are entitled to refuse a licence if the use would harm their business prospects. The whole point of erecting that barrier is to create a tollbooth through which commercial transactions take place.
But while Big Tech knows it must pay billions of dollars for computing power, and millions more for skilled AI developers, it thinks it can obtain this training material for free. And in the UK Government, it has found an easy mark.
The Government is preparing to allow AI companies to take what they want, without compensation, according to sources familiar with its thinking.
From this, they can create their clones in perpetuity. Opening a loophole of this nature would go far beyond what is being proposed in the United States, or implemented anywhere else today, according to Professor Daniel Gervais, former head of copyright at WIPO and now a professor of law at Vanderbilt University in Nashville.
Singapore and Switzerland have opened up material for scraping, but not to everyone. “They’ve made AI exemptions for training, but imposed strict limits on the type of organisation and its uses,” he tells me. So non-profits or scientific researchers can use the material, but not giant multinationals like Microsoft and Google. These companies certainly have the means to pay – they just don’t want to.
Not surprisingly, the prospect has dismayed those whose livelihoods depend on strong property rights. Around 26,000 professionals have signed a petition declaring: “The unlicensed use of creative works for training generative AI is a major, unjust threat to the livelihoods of the people behind those works, and must not be permitted.”
Signatories include actor Kevin Bacon and musician Robert Smith, as well as many authors and poets. The petition was organised by Ed Newton-Rex, a former AI developer, who told me he was baffled by the prospect of such a one-sided, giant loophole.
“If the Government legalises AI training without compensation, then companies will jump on it immediately and it’ll be too late,” he says. “They’ll then use that to make synthetic data, and we’ll lose the provenance chains, and the Crown Jewels will have been pilfered.”
Nor is Newton-Rex convinced by the argument that giving AI firms free rein would boost growth. “What you’re doing is taking content from creators and giving it to AI companies and that’s putting 5pc of GDP in peril.”
It’s not a zero-sum game, he argues. He urges AI start-ups to tap into emerging commercial marketplaces for training material, such as Human Native AI or America’s Copyright Clearance Center. Or perhaps they could act like grown-ups, rather than spoiled brats, and cut a deal. That is exactly what other governments encourage.