The SoA has been furthering its work to raise awareness, provide guidance, challenge poor practice, and argue for the recognition of creators’ rights with regard to artificial intelligence (AI). In addition to our recently closed survey on writers’, illustrators’ and translators’ use of generative AI, we have taken part in discussions around AI regulation, provided evidence for, and responded to, government reports, and continued to support, advise and lobby on behalf of our members.
Intellectual Property Office (IPO) Statement on voluntary AI Code of Conduct
We were disappointed to learn from the Intellectual Property Office (IPO) this week that it has shelved plans for a proposed code of practice that would set clear guidelines for the development of AI models on copyright-protected works. Citing a failure to reach consensus between rightsholders and AI developers, the IPO has returned responsibility for future work on this to the Department for Science, Innovation and Technology (DSIT) and the Department for Culture, Media and Sport (DCMS), which will engage with stakeholders from the creative and AI sectors. The IPO states that it will focus on an approach that allows the AI and creative sectors to ‘grow together in partnership’.
While we appreciate the need to accommodate the interests of multiple parties, alongside the opportunities and risks of new technologies, we must continue working together on processes and policies that ensure existing intellectual property law is upheld.
The SoA, and Nicola Solomon as Chair of the Creators’ Rights Alliance, attended many meetings on this code. We were dismayed at the IPO’s failure to state decisively that copying authors’ works to develop Large Language Models (LLMs) is an infringement of UK copyright law.
However, in those discussions there was broad agreement on transparency measures: on the obligation of tech companies to publish the works they have used to develop their models, and to label any AI-generated products clearly. We urge the IPO, DCMS and DSIT to continue working with tech companies and the creative industries in the hope of agreeing a voluntary code on this aspect.
We are pleased that Instagram and Facebook will label all AI-generated images – a small first step – but more must be done to identify the images made by creators on which those AI copies are based.
Publication of the House of Lords report on its inquiry into Large Language Models
In contrast, we applaud the recent House of Lords report – to which the SoA provided evidence – which says there is a role for Government in ensuring AI developers comply with their obligations under UK law, including seeking permission before use – not after! – and remunerating creators for, and properly attributing, any uses they make of protected works.
The report says:
The point of copyright is to reward creators for their efforts, prevent others from using works without permission, and incentivise innovation. The current legal framework is failing to ensure these outcomes occur and the Government has a duty to act. It cannot sit on its hands for the next decade until sufficient case law has emerged.
The report calls for greater transparency for rightsholders to see if their work has been used without consent and for investment in new datasets to ensure tech firms pay for licensed works, noting there is ‘compelling evidence’ that the UK benefits economically, politically and societally from its ‘globally respected’ copyright framework.
The Lords quote the SoA in text (and a few times in footnotes):
231. The Society of Authors noted that AI systems “would simply collapse” if they did not have access to creators’ works for training and believed tech firms should reward creators fairly.
The whole report is well worth a read for its robust restatement of authors’ rights, and we urge that these measures be taken forward.
Publication of the Government’s plan for artificial intelligence regulation
The Government has published its response to the public consultation ‘A pro-innovation approach to AI regulation’ to which the SoA responded.
We were pleased to see Government recognition of human-centred principles that had been missing in their White Paper principles last year. The work of our members is not just ‘data’, and we welcome the Government’s recognition of ‘the importance of ensuring AI development supports, rather than undermines, human creativity, innovation, and the provision of trustworthy information’.
However, as the House of Lords stressed in its report, the Government must do more to provide appropriate and sufficient incentive for compliance by AI suppliers and users. As such, we also welcome the Government’s response that it wants AI developers to be transparent about the information that is used to develop their systems, so that rightsholders can determine whether their work was used to develop AI models.
This is in line with our own calls for IPO, DCMS and DSIT to continue to work with the creative industries and the technology sector to implement transparency mechanisms – as has happened in the music industry – and to respect creators’ rights, to stop infringing their works and to pay for all uses.
It is essential that the Government makes more explicit its commitment to hold AI developers to account – including in relation to retrospective uses they have made of copyright-protected works. As recognised in the House of Lords report, the creative industries are united in calling for compensation for these uses which have occurred without consent, remuneration or attribution, in direct contravention of copyright law.
If the Government is to honour its commitment ‘to the growth of our world-leading creative industries’, it must do more than express a desire for responsible and safe AI systems. It must take active steps to hold AI developers to account and not allow them to continue to ignore and undermine copyright laws.
DSIT governance report
The need for policy to protect the rights of rightsholders and enable these rights to be enforced has also been emphasised by DSIT, in a welcome report published on the back of the Government’s response. It mentions as one of ten important AI challenges ‘The Intellectual Property and Copyright Challenge’. Some AI models and tools make use of other people’s content, data, and copyright-protected works: clear policies of payment and attribution must be established so that the rights regimes are respected and enforced.
AI in the EU
The European Union (EU) has adopted the AI Act, the first comprehensive legal framework for AI. The Act introduces basic obligations in the field of copyright: providers of general-purpose AI systems (such as generative AI) must respect copyright law and have policies in place to this effect. It will also require these AI systems to be transparent about the data used for their development, including whether it contains copyright-protected works.
The Committee of Permanent Representatives in the European Union also voted in favour of the EU AI Act last week. The vote sees EU countries agree on the technical details of the Act, which now needs sign-off from EU lawmakers before the rules enter into force.
Next steps
The SoA’s staff continue to lobby and engage with stakeholders while sharing best practice with members and keeping them up to date with developments.
We will shortly be publishing new guidance on how to prevent your website from being copied for use in developing Large Language Models. If you have any queries or suggestions, please feel free to contact publicaffairs@societyofauthors.org.
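In the meantime, one widely used mechanism – a sketch only, not a substitute for the forthcoming guidance – is a robots.txt file at the root of your website asking known AI crawlers not to collect your pages. GPTBot (OpenAI), CCBot (Common Crawl) and Google-Extended (Google’s AI-training token) are real, documented crawler names; note that compliance with robots.txt is voluntary on the crawler’s part.

```
# robots.txt – placed at https://your-site.example/robots.txt
# Ask known AI-training crawlers not to fetch any page on this site.

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

This does not affect ordinary search indexing, since regular search crawlers (such as Googlebot) are not listed.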
Please also see our previously published guidance on generative AI: