UK AI Copyright Rules May Backfire, Causing Biased Models & Low Creator Returns
Briefly

Policy experts are warning that barring major tech companies from using copyrighted materials for AI training in the UK could lead to reduced model effectiveness and economic repercussions. The UK government initiated a consultation aimed at protecting artists’ rights, proposing an opt-out system where creators would need to exclude their works from AI training. This has faced backlash from both creative industry representatives and tech companies, who argue the proposal is impractical and restrictive. Experts suggest that failing to implement a comprehensive text and data mining exemption could stifle innovation and produce less effective AI systems.
Barring companies like OpenAI, Google, and Meta from training AI on copyrighted material risks degrading model quality and reducing economic impact, according to policy experts.
The UK government opened a consultation to explore ways to protect the rights of artists, yet the proposed opt-out system was largely rejected by creative industry representatives.
Policy experts warn that solutions short of a full text and data mining exemption in UK copyright law may lead to ineffective AI systems and stifle innovation.
Benjamin White stressed that restrictions on AI training affect many sectors, stating, 'The rules that affect singers affect scientists, and the rules that affect clinicians affect composers as well.'
Read at TechRepublic