AI and Intellectual Property – The Latest UK and EU Law
Partner, Temple Bright LLP
Jeremy advises UK and international business clients on brand strategy, intellectual property rights and disputes, and data protection law, at UK law firm Temple Bright.
AI can write music, understand natural language, analyse vast data lakes and make inventions. Legislators and policy makers around the world are starting to grapple with what this means for copyright, patents and other intellectual property, and for developers and users of AI systems.
At the heart of these developments lies the adaptive and autonomous nature of AI. ‘Adaptive’ means that AI does not rely on pre-programmed instructions: it learns as it goes along, based on training data, and the methods it develops are not always transparent. ‘Autonomous’ means that AI can execute decisions without human command and control.
Access to Data for Training
Data for AI training includes text, images and other content likely to be protected by copyright and (in Europe) database rights. Loading these into computer memory requires permission from the underlying rights holders and may be a breach of access terms. That is so whether data have been scraped from web pages or licensed to subscribers. In the latter case, the licence does not necessarily allow uploading for AI training purposes.
Currently, in the UK, there are limited exceptions to copyright where copying is for non-commercial research purposes, and no exception at all for database right (which protects non-copyright collections of data). Even the express copyright exception is unhelpful here, because AI training will usually be commercial.
EU law deals more generously with this kind of ‘text and data mining’. Commercial mining is allowed, as an exception to both copyright and database right, although the organisation must already have lawful access to the material (for example, under a subscription arrangement). Rights holders can exclude materials from access, except in the case of research organisations or cultural heritage institutions conducting scientific research.
A recently proposed EU Data Act also makes clear that data produced from operation of machinery of any kind will not be protected by database right, and must be made available to aftermarket service providers on fair, reasonable and non-discriminatory terms.
To redress the imbalance in the UK, the government recently finished a consultation exercise, concluding that a new exception for text and data mining should be introduced. The new exception will cover both copyright and database right, and apply for any purpose. Rights holders will not be able to opt out. Having said that, there will be a requirement for lawful access to the material in the first place, so rights holders can choose not to make their content available at all, or to make it available only on payment of a fee.
Outputs of AI – Creative Content
In 2011, photographer David Slater set up a camera in the Indonesian jungle, a monkey took a great selfie that went viral, and an unusually exotic copyright dispute flared up. Fast forward to the age of AI and we see similar issues.
Under European law, the necessary originality for copyright purposes can come from making creative choices (photographic subject-matter, angle, lighting, and so on) or just ‘being in the right place at the right time’, and it does not matter who (or what) pressed the button. On the other hand, mere ownership of the camera (or other equipment) does not confer ownership of copyright, unless an applicable contract says so.
UK copyright law has long recognised that computer-generated works with no human author can attract copyright protection. The legislation deems the ‘author’ or ‘designer’ to be the person who made the arrangements necessary for the creation of the work. ‘Person’ here includes a corporate entity, and is quite likely to be different from the person who designed the software or system.
Computer-generated works have to be distinguished from works made using a computer system as an aid, where the human is indeed the true author or designer. These distinctions will not always be straightforward when it comes to using AI systems for creative projects, however.
Outside the UK, most countries view computer-generated copyright works as contrary to the essential principle of ‘originality’. As a result, they are much less likely to be protected by copyright. An example is Dr Stephen Thaler’s failed attempt to register US copyright for a work authored solely by the ‘Creativity Machine’ AI system. (The relevant US Copyright Office guidelines also, incidentally, clarify that works supposedly created by divine or supernatural beings will be refused…)
The UK government has concluded that the current protection for computer-generated works incentivises investment in AI, and there are no plans to change it.
Deepfakes and Marketplaces
AI that controls presentation of sales offers, via online marketplaces or IoT, could have an impact on trade mark law constructs such as the ‘average consumer’ and ‘likelihood of confusion’. These are complex points, and the government feels that AI is not yet developed enough to have a meaningful impact in this area.
AI also opens the possibility for simulated likenesses of deceased or retired performers, and false attribution of speech and actions to non-consenting individuals. The UK government has not drawn clear conclusions on how to deal with this, although it says it may not be best left to intellectual property laws to resolve.
Inventions by AI
Dr Thaler (see Outputs of AI, above) is also famous for attempting to obtain patent protection for AI-generated inventions, with the AI system named as sole inventor. DABUS (Device for the Autonomous Bootstrapping of Unified Sentience), an AI system, autonomously developed inventions that included a container lid and a warning light.
No patent office has denied that, if Dr Thaler had filed the patent application naming himself as inventor and applicant, all would have been fine. In other words, there is nothing preventing the patenting of inventions made by humans using AI tools. What they rejected was the suggestion that an AI system could properly be cited as sole inventor, and that Dr Thaler could then claim title merely through owning the system that, he said, made the invention without his involvement. Appeal courts in England, the EU, the US and Australia all took a similar view.
Whilst the outcome was perhaps not a surprise to many patent lawyers, there remain some difficulties in fitting inventions by AI systems into patent law. First, what kind of involvement in the activity of the AI system should entitle a business to file a patent application in its name? In practice, where multiple entities are involved, this should be dealt with by contract. Second, if a system would inevitably have produced that output, is the output actually inventive at all?
The UK government recently published its conclusions on patents for AI-generated inventions, following public consultation. There was concern that a proliferation of AI-generated inventions, concentrated among a few dominant industry players, could disadvantage SMEs. The government saw no need to change the requirement for a human inventor to be named on patents. It also considered, but rejected, expanding ‘inventors’ to include those who perform programming, input data or select outputs based on commercial value. Most respondents to the consultation felt that any changes in this area would need to be harmonised internationally, and that AI was not yet developed enough to make a real impact on the concept of inventorship. No doubt this will be reviewed in the future.