Ex-Meta Executive Says Requiring Artist Consent to Train AI Could 'Kill' UK's Tech Sector Overnight
While agreeing artists should be able to opt out, Clegg finds pre-training permission 'implausible'

Britain's artificial intelligence sector could be 'killed overnight' if tech companies are forced to ask artists for permission before using their work to train AI models, former Meta executive and ex-deputy prime minister Nick Clegg has warned.
Clegg made the comments at the Charleston Festival during a discussion of his forthcoming book, How to Save the Internet, and they have ignited a fierce debate over AI's future and creative rights. Having left Meta in January, after serving as its president of global affairs since 2022, Clegg is once again at the centre of a national conversation, this time about the collision between code and creativity.
The Cost Of Consent
During the event, Clegg addressed concerns from the artistic community demanding stronger copyright safeguards against AI. He acknowledged that creators deserve transparency but pushed back on proposals requiring prior consent.
'It would be fair to allow artists to "opt out of having their creativity, their products, what they've worked on indefinitely modelled",' he said. But, responding to more stringent suggestions, he noted: 'Quite a lot of voices say, "You can only train on my content, [if you] first ask." And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data.'
'I just don't know how you go around asking everyone first. I just don't see how that would work.'
He added: 'If you did it in Britain and no one else did it, you would basically kill the AI industry in this country overnight.'
Lawmakers Split On Artist Protections
Clegg's warning comes amid heated debate in Parliament over the Data (Use and Access) Bill, which aims to give creators greater insight into how their copyrighted work is used in AI training. A proposed amendment would force companies to disclose datasets used for model development, a move many artists believe is critical to protecting intellectual property.
Backers of the amendment include a roster of creative heavyweights such as Paul McCartney, Dua Lipa, Elton John, and Andrew Lloyd Webber. In a BBC interview, Elton John called the government's plans 'thievery on a high scale' and threatened legal action: 'It's criminal, in that I feel incredibly betrayed.'
Government Pushes Back On Transparency Clause
Despite public and industry support, the proposed amendment was rejected by Parliament on Thursday. Technology Secretary Peter Kyle defended the decision, saying: 'Britain's economy needs both [AI and creative] sectors to succeed and to prosper.'
Baroness Beeban Kidron, who introduced the amendment, argued that enforcing transparency is the only way to ensure copyright laws are upheld. In a recent op-ed, she vowed that 'the fight isn't over yet'; the bill now heads back to the House of Lords in June.
A Fight Over Innovation Versus Integrity
The debate captures a growing divide between the tech sector's demand for data and creators' rights to control their work. Clegg's comments reflect the urgency within the tech community to avoid regulatory red tape, while creators see disclosure as the only path to accountability.
With billions in future industry value—and the integrity of British creativity—on the line, this standoff could define not only who controls AI's development, but what ethical boundaries govern it.