The Society of Authors (SoA) has welcomed the House of Lords Communications and Digital Committee’s recommendations published today.
The thirty-eight recommendations calling for Government action are the result of an inquiry examining AI, copyright and the creative industries. The report's publication comes shortly before the Government is expected to publish further economic evidence and directions ahead of the 18 March deadline imposed by the Data (Use and Access) Act.
The Committee’s report prioritises a licensing and transparency approach to AI training, urging the Government to make a clear public statement that commercial AI developers operating in the UK must obtain appropriate licences when using copyright-protected works. This is something the SoA has long campaigned for.
It also calls for stronger transparency requirements, statutory duties for major AI developers, and regulations that support rightsholders.
In her comments on the report, Baroness Keeley, Chair of the Communications and Digital Committee, said:
“Our creative industries face a clear and present danger from uncredited and unremunerated use of copyrighted material to train AI models. Photographers, musicians, authors and publishers are seeing their work fed into AI models which then produce imitations that take employment and earning opportunities from the original creators.
“AI may contribute to our future economic growth, but the UK creative industries create jobs and economic value now. In 2023 the creative industries delivered £124 billion of economic value to the UK and this is set to grow to £141 billion by 2030. Watering down the protections in our existing copyright regime to lure the biggest US tech companies is a race to the bottom that does not serve UK interests. We should not sacrifice our creative industries for AI jam tomorrow.”
The SoA strongly supports the Committee’s recommendation to rule out a new commercial text and data mining (TDM) exception, warning that such an exception would undermine creators’ rights and destabilise the UK’s creative economy. The Committee also emphasised the need for robust labelling standards to help users identify AI‑generated content, which is an important step in safeguarding public trust and protecting human creativity in the age of AI.
SoA Chief Executive Anna Ganley, who gave evidence to the Committee in November, welcomed the report:
“We wholeheartedly agree with the Committee’s recommendations. It is a thorough examination of the issues informed by evidence from across the tech and creative sectors. Authors and creators cannot accept the current standstill from Government while their livelihoods disappear.”
She added:
“The SoA is particularly pleased to see calls for the Government to clearly rule out the previously proposed commercial TDM exception, and to commit to meaningful transparency around model training and clear labelling of AI‑generated content. Authors want fairness, transparency and licensing. Hesitancy from the Government on this continues to leave the gate wide open for the unscrupulous theft of authors’ copyright-protected work for AI training.”
In its evidence to the Committee, the SoA stressed that creators are not opposed to AI. Instead, they want a fair and functioning market in which they can participate, and be paid, without having their work scraped or replicated without consent.
The SoA has made it clear through campaigns and lobbying that, without urgent policy intervention, authors face a perfect storm of shrinking opportunities and rising barriers to entry. The Government’s response so far has been mixed at best and slow at worst, and the Committee’s report highlights the lack of movement from both DCMS and DSIT.
The Committee’s report recognises the precarious situation facing authors and creators and calls for immediate Government action. The SoA says it now awaits the Government’s announcement due on or by 18 March and expects a clear plan to move forward.
CLEAR Framework: A roadmap for fair AI
Alongside partners in the Brave New World? report, the SoA continues to advocate for the CLEAR Framework to set a minimum standard for a fair, sustainable creative economy in the age of AI:
Consent before work is used to train or power AI (with credit and compensation where chosen)
Licensing, not scraping
Ethical use of training data—from lawfully licensed sources
Accountability through transparent datasets and clear labelling of AI outputs
Remuneration standards and contracts that ensure creators understand their rights and are paid fairly
Human Authored scheme
The SoA will shortly launch its Human Authored scheme, supporting creators and helping readers, platforms and publishers identify and champion human‑created work.