The emergence of artificial intelligence (AI) systems is already having a direct impact on SoA members. We have seen AI companies advertise editing and book review services and others selling machine-translations. AI narration is used in the creation of audiobooks, and the publishing industry is already using AI-generated images on book covers and in promotions. Some AI-generated content may be low-grade at present, but it is good enough in many situations and it is improving daily.
AI systems learn by accessing content and making a temporary copy of it; although the copy is then deleted, the system retains what it has learned from it. If that copying takes place without the permission of the copyright holder, it is an infringement of copyright under UK law, even though the copy is held only briefly before being deleted.
It is important to ensure that you try to control whether and how AI has access to your work, your style, and your voice.
- The SoA’s policy position on AI
- Our response to the Government’s AI white paper
- The Creators’ Rights Alliance response to the Government’s AI white paper
The practical steps below are designed to help you safeguard yourself and your work. These are fast-evolving technologies, so we will review this guidance regularly.
As with all our published guidance, this information is general. For specific advice on your own situation on this rapidly evolving topic, please get in touch.
We can give you general advice and guidance and spot anything missing or concerning in your contract. It is also important, even if you are happy with your contract or situation, that we see your contract so we can know as much about industry developments as possible. This allows us to monitor situations and AI applications as they evolve so we can effectively lobby and advocate on behalf of all members.
We would like to thank the many publishers who have agreed to include some variation of our recommended AI safeguarding wording in their contracts.
Contracts for the exploitation of your work
If you are signing a contract with a publisher, producer or anyone else for the exploitation of your work, seek SoA advice and consider the following:
- Usage – make sure it is clear in your contract how the work will be used by the publisher. Try to limit licensing to third parties.
- Term – limit how long you are granting rights for, and in what territories.
- Payment – is the payment reasonable for the amount of work, and the likely use?
- Moral rights – will you be credited in all uses of your work?
- Editing/manipulation – ensure that your work cannot be amended without your consent.
- Termination with reversion of rights – is there a mechanism for taking action if things were to go wrong?
And when it comes to AI specifically:
- Publishers and developers can monetise your work by using it to assist machine learning, so ensure the contract contains a clause forbidding such use. For example, ‘The [Publisher] may not use, or allow access to, the Work in any manner which could help the learning/training of artificial intelligence technologies.’
- You might want to prevent AI technologies being used in connection with the creation or exploitation of your work – for instance, forbidding AI-rendered translation, editing, cover design, indexing, and audio recordings. See ‘Your publisher’s use of AI’ below.
- Your publisher may likewise ask for assurances from you as to whether or not you have relied on AI to, for example, render a draft or a rough translation. It is in everyone’s interest to be transparent and to protect the authenticity and originality of human creators.
Your voice
If you earn your living from your voice – as a performer or an audiobook narrator, for example – be sure to weigh up the pros and cons of AI carefully.
It might be tempting to sell rights allowing copying of your voice to an AI company, but bear in mind that doing so will help systems learn to imitate human speech generally, and you personally. Money today could simply hasten a future in which there is no such work at all.
Performers’ union Equity advises you to be particularly careful if you are asked to record or perform in unusual ways, for example over the phone. It is not unheard of for unethical AI companies to develop voice models in this way.
Your publisher’s use of AI
Ask your publisher to confirm that it will not make substantial use of AI for any purpose in connection with your work – such as proof-reading, editing (including authenticity reads and fact-checking), indexing, legal vetting, design and layout, or anything else without your consent. You may wish to forbid audiobook narration, translation, and cover design rendered by AI.
This is important because:
- there are questions over the current level of expertise afforded by AI
- many AI programmes have been trained on pirated content
- for the AI company to undertake any such services it needs access to your work – and it could gain far more from accessing the work than you or your publisher gain from using the automated service. The better AI becomes at simulating these services, the harder it will be for humans to continue to work on, or bring any influence to bear on, what is created.
Your use of AI as a tool
Use an AI system as a tool only after considering what it can learn from you in return for what it gives you, and bearing in mind the unethical way in which many AI programmes have been trained on pirated content. The adage that ‘if something is free, you are the product’ holds true in the AI development race.
If you do use AI as a tool for initial editing and research, check carefully for accuracy. Generative systems often fill gaps in their learning with information that seems credible to the reader but which is fictional, derived from assumed patterns of information.
You should also be transparent in labelling and identifying any AI-generated or AI-assisted content.
If you are sourcing images for your work, check where they came from and how they were created. Most stock image libraries such as Adobe Stock and Shutterstock now list AI-generated images alongside human-created ones by default and some provide text-to-image generation tools. You can usually exclude AI-generated artwork from results while searching these platforms – look for a filter that says ‘Exclude Generative AI’ or ‘Exclude AI images from results’.
If you are working with a designer – for instance to produce a cover for a self-published book – set a clear brief that images created or sourced should not be AI-generated.
Protecting your website
Much of the information used to develop generative AI systems is ‘scraped’ from publicly available websites by automated software.
If you do not want your work and other information to be used in this way, make clear in your website terms and conditions – and perhaps also in your email footer – that you do not permit text and data mining of the information you share online. This will not block unauthorised scraping but it will show the government and others that you do not want companies using your work without permission.
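For example – this is an illustrative form of words only, not legal advice – your terms and conditions might state: ‘We expressly reserve all rights in the content of this website. Text and data mining, web scraping and any use of the content for the training or development of artificial intelligence systems are not permitted without our prior written consent.’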
Some generative AI developers – including OpenAI (which makes ChatGPT), Google (which makes Google Bard) and Common Crawl (which operates the CCBot crawler) – now give the option of excluding all or part of a website from being used in their systems.
To enable this, ask your website designer to add the following lines to your website’s ‘robots.txt’ file:
User-agent: GPTBot
Disallow: /
User-agent: CCBot
Disallow: /
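The lines above cover OpenAI and Common Crawl. To the best of our knowledge at the time of writing, Google’s equivalent crawler token for excluding content from its AI products is ‘Google-Extended’ (these names can change, so check each developer’s current documentation); it follows the same pattern:
User-agent: Google-Extended
Disallow: /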
Unfortunately, this will not apply to material already scraped from your website, and since these lines of code are requests, not impenetrable blocks, we have no way of knowing to what extent they are respected.
Clearly, an ‘opt-out’ (instead of an ‘opt-in’) and the technical nature of this process are far from ideal, but we hope that other generative AI developers will start to offer a similar level of control to content owners in the near future.
Your website on third-party platforms
If your website is hosted on a third-party platform, such as Substack or Patreon, you are unlikely to be able to make the changes mentioned above.
At the time of writing (updated 14 February 2024), Substack does not include ‘opt-out’ code on its platform, but Patreon does. Other platforms will have their own approaches. Any platform could change its approach at any time based on user feedback, commercial concerns and other considerations, so we recommend you ask its support team about this directly.
Uploading your work to cloud services
Check the terms and privacy settings of cloud services you use to store or develop your work. Some services by default reserve the right to analyse your text and images for development purposes. While not always explicitly stated, this could include using your work to train AI systems. For instance, in 2022 Adobe introduced a Content Analysis privacy setting to its Creative Cloud and Document Cloud services. This gives it permission to use the work you store in its cloud for the training and development of its own suite of generative AI products.
Awareness
A common theme in the points above is the need for transparency and for you to understand the risks as well as the opportunities in the systems that you, and those you work with, use. Before entering into agreements, be aware of what you are agreeing to – whether you are about to sign a contract for publication or you are signing up to an online service.
But we live in a tick-box world, in which we rarely think twice before clicking to accept a website’s terms during a sign-up process.
Now more than ever, as companies implement new ways to monetise and manipulate our information, behaviour and creativity, we need to pause at these moments of consent to ensure we clearly understand what we are signing away.
Update 4 June 2024: Meta’s new privacy policy
We have received messages of concern from our members about Meta’s new privacy policy, which comes into effect on 26 June 2024.
Meta is the parent company of Instagram and Facebook.
Under the new privacy policy, by agreeing to Meta’s terms and conditions (which you do either by setting up a new account or by having an existing account on Instagram or Facebook), you automatically agree to Meta using your information to develop its artificial intelligence (AI) models.
Your information includes any photos, posts and captions as well as any copyright-protected works (illustrations, photographs, blog posts, etc) shared on Instagram or Facebook.
We know that many of our members use Meta platforms to showcase their creative work and are rightly concerned about Meta using their information (and by extension, their works) to develop AI models.
The Design and Artists Copyright Society (DACS) say that the updated privacy policy ‘represents a significant shift in how user data will be handled’ and that they will be following this closely. We will also be keeping a close eye on this situation and will provide updates as and when we can.
How do I opt out?
Unfortunately, Meta have complicated the process of opting out of this development.
According to DACS:
In cases where Meta has obtained your personal information from third parties, you have the right to request access to that information and object to it being used to develop Meta’s AI models. However, Meta requires users to fill out forms, provide evidence, and even enter a one-time code received via email.
For information you have shared on Meta products like Instagram or Facebook, you again have the right to object to the information being used to develop Meta’s AI models. There is another form for making this request, but there is no guarantee that Meta will honour your request. You must provide reasons why Meta’s processing impacts you, which Meta must consider before making its decision.
DACS have published a step-by-step guide to objecting to Meta’s use of personal information to develop AI models, which you can find here.
We encourage SoA members to get in touch with us if they have queries or if they want to share their experience.
Clear and thought-provoking advice. Thank you.
Really useful information. Thank you.
Excellent advice. The more creators challenge AI theft, the more chance of maintaining human livelihoods. Lobbying needs to wake up UK government, the latter touting AI as simply a one-sided economic positive and denying any human-cost negatives.
Isn’t the only way to prevent web scraping to put content behind a paywall?
I have resisted using ChatGPT while revising my novel since I believe there will be copyright issues, and it feels like cheating. But I tossed it a paragraph of a short story out of curiosity. It spat out three redundant literary paragraphs that did not resemble my voice at all. Whose voice was it? One of the authors of the 7000 books it stole from Smashwords, no doubt. What will writers, who have used AI as an assistant, do when restrictions are in place, and they have to write on their own? I foresee an explosion of imposter syndrome in…
The problem at the moment is people continue to assume techie people are the pinnacle of intelligence. The narcissists of Silicon Valley see the gadgets they create as awesome. And they look down on arty people as somehow having inferior brains. So they assume their machines will be able to write novels – but of course they won’t. The machines, of course, can scan millions of novels and maybe write something coherent about, say, love, but they can’t experience love. Therefore what they write will never touch the soul. Like you say, the ‘voice’ is all wrong. They will also…
I also doubt that AI can offer wit or humour.
I already know authors who welcome AI as a useful ‘tool’ for assisting with drafting, word choice etc. Their ‘creative input’ may be as simple as just fact-checking and approving the AI’s choices. At the other end, the humble spellchecker will soon be incorporating AI to distinguish its from it’s. I think the only way we’ll be able to separate our work from AI is by its hand-made quality. As AI matures (it’s still a toddler!) it will surely be replacing almost all human-created prose.
If I could persuade other people to join me, I would like to set up a movement like those that emerged in the nineteenth century against everything being factory-made (the Arts and Crafts Movement for example). This group would act as a backlash against our new technologies: none of our works would be published online, all writing would be done with, at most, a type-writer; none of our works would be sold on Amazon. I doubt the movement would stop all AI developments, just as nineteenth century movements didn’t stop factory-produced goods. But the point of these types of movements…
Given all the warnings that are coming out about Artificial Intelligence, including, I believe, by people who have been involved deeply in its development – Elon Musk for example, shouldn’t we just press pause for a while and not let any new technologies into our lives until we’ve done ‘health checks’ on the ones we already have? Three years ago the vast majority of people in the UK willingly accepted endless weeks of house arrest to calm down a disease which had an average age of death of 80. If a whole nation can do something like that for a…
I’m finding this all very troubling. Reading the above I realised that probably one of the editing tools I use is AI and I was trying to be very careful and keep clear of it all. I will go and double check and delete it if I think it’s a problem. Would it be possible to have a list of on-line ‘author tools’ that are AI?
I think as a matter of perception, it would be helpful for the SoA to state that they will not represent, or grant membership to, authors found to have generated writing through AI. It’s important to recognise that the use of AI does not make you a writer – just someone who can use software.