A House of Lords committee has warned that the UK must not sacrifice its “powerhouse” creative sector by allowing “opaque” AI models to be trained unrestricted on human work.
The Lords’ Communications and Digital Committee has been running an inquiry into AI, copyright and the creative industries since last year.
In its latest report, published on Friday, the committee said the UK “faces a choice between two futures”, one in which the UK is a world-leader in responsible licence-based AI development and the other in which the UK “continues to drift towards tacit acceptance of large-scale, unlicensed use of creative content and long-term dependence on opaque models trained overseas, with most benefits accruing to a small number of US-based firms while harms to UK creators grow”.
The committee described the creative sector as an “economic powerhouse” that contributes billions to the UK economy and has been supported by a “gold-standard copyright framework”.
It said that in the age of generative AI, limited transparency from developers and the absence of specific protections for “digital likeness” pose severe risks to the livelihoods of creatives.
The report said: “Meanwhile, technology sector stakeholders are pressing for the introduction in the UK of a broad new exception for commercial text and data mining that would legitimise large-scale AI training on copyright-protected works. Without this, they argue, the growth of the UK’s AI sector will be stunted.”
The committee therefore recommended that the UK rule out a commercial data mining exception with an opt-out model, close gaps in protections for digital replicas, make transparency about AI training data a requirement, and prioritise the development and adoption of sovereign AI models to reduce dependence on the US.