AI as an inventor – Implications of the DABUS Decision

Published
20 August 2021
Authors

Akanksha Dahiya

Principal, Sydney | BEng (Elec & Telecom), MEng (Telecom), MIP Law

Implications of the Federal Court’s decision that an AI system can be an inventor

Following our recent legal update on Thaler v Commissioner of Patents [2021] FCA 879, we provide in this article a more detailed analysis of the judgment and a consideration of its possible implications. The decision, delivered on 30 July 2021, found that an artificial intelligence (AI) system can be an “inventor” for the purposes of the Australian Patents Act 1990. This development has already been the subject of significant media attention globally.

As we previously reported, the Patent Office has the option to appeal the decision to the Full Court of the Federal Court of Australia. Irrespective of the outcome of any appeal, we do not see the decision as impacting Australia’s status as a relatively friendly jurisdiction for protecting human-devised inventions that utilise AI (as discussed in our earlier article on this topic). However (and, in particular, if confirmed on appeal), the decision may have significant implications for other aspects of Australian patent law.

Implications

The decision to recognise non-humans as inventors under the Australian Patents Act 1990 may well have implications beyond applications for which an AI system is purportedly the inventor. In particular, we see the decision as potentially impacting any assessment made under the Act that involves a “person skilled in the art” (PSA). These assessments include issues of inventive step and support for claims, as well as the requirement to provide a clear and complete disclosure of an invention.

Inventive step is assessed in Australia from the perspective of a PSA equipped with the relevant common general knowledge (CGK). The PSA is the “hypothetical addressee” of the patent specification1, possessing a “practical interest in the subject matter” of the invention2.

Given the Court’s recognition that an AI system can be trained to “invent”, there is a significant question as to whether an AI system could also have a “practical interest in the subject matter” of a patent application, and could thus itself be considered a PSA. This would have consequential implications for assessing inventive step, as questions of how AI systems solve problems, and of what is obvious to an AI system, may need to be considered.

In addition, it is relevant that the human beings behind AI systems may be experts in computer science, but not necessarily in the field of the particular invention that the AI system is trained to generate. In the case of the DABUS application, for example, the invention itself was unrelated to AI or computer science, being concerned with a food container having a particular geometry. This could lead to uncertainty in identifying exactly who the PSA is for an AI-generated invention, and the skills that the PSA should be deemed to possess.

Another implication for inventive step arises when the PSA is considered to be a human being assisted by an AI system. This was discussed by Beach J at [145] of the decision, where his Honour noted that (our emphasis):

“the threshold for inventiveness might rise if the “person skilled in the relevant art” can be taken, in the future, to be assisted by or has access to artificial intelligence or is taken as part of the common general knowledge to have knowledge of developments produced by artificial intelligence in the relevant field. But that is a separate question.”

It will be very interesting to see whether any of these issues (or indeed, any other tantalising questions that flow from this decision) arise should the application that is the subject of the decision undergo substantive examination before the Australian Patent Office (or face any other challenge).

Thaler v Commissioner of Patents

The decision was concerned with an Australian patent application entitled “Food container and devices and methods for attracting enhanced attention” (the DABUS application). The application listed “DABUS” (or Device for the Autonomous Bootstrapping of Unified Sentience) as the inventor, as well as noting that “The invention was autonomously generated by an artificial intelligence”. The Applicant of the DABUS application was Dr Stephen Thaler.

The AI system that purportedly generated the inventions claimed in the DABUS application is not described in the patent specification. However, Dr Thaler gave evidence that he was the developer of the AI system and the owner of copyright in the system’s source code. Dr Thaler described the system in evidence as involving two neural networks arranged to interact with each other. According to Dr Thaler, the first neural network initially performed supervised learning and required a “human-in-the-loop” to identify links between different fundamental topics input to the neural network. These links were “perturbed” in order to create new inventions. The second neural network implemented reinforcement learning that recognised the perturbations as new inventions and assessed the utility or benefit thereof.
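
The specification itself therefore gives no technical detail, but purely as an illustration of the two-network arrangement described in Dr Thaler’s evidence, a toy sketch of a “generator” that perturbs known links between topics and a “critic” that scores the resulting candidates might look as follows. The class names, scoring logic and example data below are our own hypothetical constructions for illustration only; they are not drawn from, and do not represent, the actual DABUS system.

    import random

    # Hypothetical toy sketch only: the real DABUS architecture is not
    # disclosed in the patent specification, and these names are invented
    # purely for illustration.

    class GeneratorNetwork:
        """Stands in for the first network: holds human-supervised links
        between topics and 'perturbs' them to propose new combinations."""

        def __init__(self, links):
            self.links = links  # e.g. [("container", "fractal wall profile"), ...]

        def perturb(self):
            # Randomly recombine elements of known links into a candidate idea.
            first = random.choice(self.links)
            second = random.choice(self.links)
            return (first[0], second[1])

    class CriticNetwork:
        """Stands in for the second network: flags candidates that differ
        from the known links and assigns them a notional utility score."""

        def __init__(self, known_links):
            self.known = set(known_links)

        def score(self, candidate):
            novelty = 0.0 if candidate in self.known else 1.0
            utility = random.random()  # placeholder for a learned utility estimate
            return novelty * utility

    if __name__ == "__main__":
        links = [("container", "fractal wall profile"), ("beacon", "pulsed light signal")]
        generator = GeneratorNetwork(links)
        critic = CriticNetwork(links)
        candidates = [generator.perturb() for _ in range(20)]
        best = max(candidates, key=critic.score)
        print("Highest-scoring candidate combination:", best)

On Dr Thaler’s description, the significance lies in the interaction between the two networks (one proposing, one evaluating) rather than in any particular implementation detail.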

The key issue the Court was required to decide was whether an “artificial intelligence” could be validly named as an inventor on an Australian patent application. However, we see the Court’s considerations as discussed below as applying to any entity or agent that is capable of “invention”.

Considerations

The Act and Regulations do not preclude non-human inventors

The Court observed that the word “inventor” is not defined in the Act or in the Patents Regulations 1991 and considered that it should be given its ordinary meaning. When considering the ordinary meaning of the word “inventor”, the Court did not accept the Commissioner’s submission that the word requires the involvement of a human being. Rather, the Court found that the word is an “agent noun” which can refer to a person or a thing:

In this respect then, the word “inventor” is an agent noun. In agent nouns, the suffix “or” or “er” indicates that the noun describes the agent that does the act referred to by the verb to which the suffix is attached. “Computer”, “controller”, “regulator”, “distributor”, “collector”, “lawnmower” and “dishwasher” are all agent nouns. As each example demonstrates, the agent can be a person or a thing. Accordingly, if an artificial intelligence system is the agent which invents, it can be described as an “inventor”.

(at [120], emphasis added)

The interpretation of the word “inventor” as an agent noun was a critical component in the Court’s reasoning that an AI system can be an inventor for the purposes of the Patents Act 1990.

The flexible concept of an inventor

In reaching its decision, the Court also found that the concept of an “inventor” should be viewed in a similarly flexible and evolutionary way as what is considered to be patentable subject matter (or a “manner of manufacture”). In this regard, the Court observed that:

in considering the scheme of the Act, it has been said that “a widening conception of “manner of manufacture” is a necessary feature of the development of patent law in the twentieth and twenty-first centuries as scientific discoveries inspire new technologies” (D’Arcy v Myriad Genetics Inc (2015) 258 CLR 334 at [18] per French CJ, Kiefel, Bell and Keane JJ). I see no reason why the concept of “inventor” should not also be seen in an analogously flexible and evolutionary way. After all the expressions “manner of [new] manufacture” and “inventor” derive from the 21 Ja 1 c 3 (Statute of Monopolies) 1623 (Imp) s 6. There is a synergy if not a symmetry in both being flexibly treated. Indeed, it makes little sense to be flexible about one and not the other. Tension is created if you give flexibility to “manner of manufacture” and then restrict “inventor”. You would be recognising an otherwise patentable invention and then saying that as there is no inventor it cannot be patented.

(at [121])

AI as an inventor aligns with the object of the Act

Pursuant to recent legislation3, the Patents Act 1990 now includes an object section. The object of the Act is set out in s2A as follows:

The object of this Act is to provide a patent system in Australia that promotes economic wellbeing through technological innovation and the transfer and dissemination of technology. In doing so, the patent system balances over time the interests of producers, owners and users of technology and the public.

The Court referred to the object of the Act in reaching its decision, finding that allowing AI systems to be named as inventors would incentivise technological innovation in computer science and other fields, encouraging the development of creative machines and the use of their output (at [124]-[125]).

Positions in other jurisdictions

As we have reported previously, the Court’s decision in Australia is directly at odds with the findings of Patent Offices and Courts in other jurisdictions. Since our report, the corresponding application has been granted in South Africa4; however, it is relevant that this jurisdiction does not have a substantive patent examination system.

In the United States, the United States Patent and Trademark Office (USPTO) refused to allow DABUS to be named as an inventor, on the basis that this would be contrary to the applicable legislation. In this regard, 35 U.S.C. § 100(f) defines an “inventor” as “the individual or, if a joint invention, the individuals collectively who invented or discovered the subject matter of the invention”. In addition, 35 U.S.C. § 115 refers to an “individual” who must swear an oath that he or she “believes himself or herself to be the original inventor or an original joint inventor of a claimed invention in the application” (emphasis added).

The European Patent Office (EPO) also found that DABUS could not be validly named as an inventor on a patent application. In reaching its decision, the EPO found it to be a relevant consideration that a non-human cannot exercise the various rights conferred by inventorship under the European Patent Convention.

Similarly, both the UK Intellectual Property Office (UKIPO) and the England and Wales High Court5 found that an “artificial intelligence” is not an “inventor” for the purpose of the Patents Act 1977 (UK) and cannot be named as such in a patent application. The English Court’s conclusion was partly based on a finding that a non-human cannot exercise legal rights and fulfil responsibilities.

Although the Australian Court came to a similar conclusion about a non-human’s inability to exercise legal rights, this did not prevent it from ultimately deciding that an AI system can be an inventor for the purposes of the Australian Patents Act 1990.

It should be noted that corresponding applications are currently pending before a variety of other Patent Offices, so it will be interesting to see whether any other countries follow Australia’s trailblazing example.

Conclusions

As we have explained previously, Australia is a relatively friendly jurisdiction for protecting AI inventions. As the Court’s decision was not concerned with human-devised inventions that involve AI (as opposed to inventions that are generated by AI), we do not see it as impacting on Australia’s favourable status as a filing destination.

It remains to be seen whether the decision, if confirmed on appeal, will result in more applications being filed in Australia that list an AI as the inventor. However, given that such inventions probably cannot be protected in other major jurisdictions, any follow-on patenting activity may be very limited.

At the same time, if confirmed on appeal, and in the absence of legislative action, the decision could potentially impact how other assessments are made under the Act.

1 Ranbaxy Australia Pty Ltd v Warner-Lambert Company LLC (No 2) [2006] FCA 1787.
2 Catnic Components Ltd v Hill & Smith Ltd [1982] RPC 183.
3 Intellectual Property Laws Amendment (Productivity Commission Response Part 2 and Other Measures) Act 2020.
4 South African application 2021/03242.
5 Thaler v The Comptroller-General of Patents, Designs and Trade Marks [2020] EWHC 2412 (Pat).

About the Author

Akanksha Dahiya

Principal, Sydney | BEng (Elec & Telecom), MEng (Telecom), MIP Law

Akanksha’s focus: electronics, telecommunications, and software engineering technologies.
