20 June 2023
AI litigation to hit Australian shores before regulation, IP lawyers say
This article was originally published by Sam Matthews of Lawyerly on 20 June 2023. We thank Lawyerly for allowing us to republish it.
Copyright Lawyerly Media. Unauthorised reproduction, distribution or sharing of this article is prohibited.
The growing use of generative AI tools such as ChatGPT could shake up the landscape of intellectual property laws in Australia, and novel questions posed by the technology are likely to be answered in the courts before regulators step in, lawyers say.
The increasing ubiquity of artificial intelligence tools such as chatbot ChatGPT and image generators like DALL-E and Midjourney – which can instantly generate material ranging from sonnets and song lyrics to summaries of Stoic philosophy and Picasso-esque portraits – poses a number of novel questions in the field of intellectual property law, experts told Lawyerly.
One of the most pressing legal issues will be whether there is infringement of the intellectual property for the data used in the machine learning process, says Ashurst IP and media partner Nina Fitzgerald.
“This includes consideration of rights to use training data – for example, did training involve copyright infringement or breach of website terms and conditions? – and how to ascertain how a machine learning system has been trained – for example, lack of transparency around the training process and whether infringement might have occurred,” she said.
Fitzgerald said the AI training process also raises confidentiality concerns.
“Most generative AI systems use the inputs or data provided to [them] to continuously train the AI system, with a risk that confidential inputs from one user may form part of the output for another.”
The intellectual property rights that attach to materials created by generative AI are another area of concern, Fitzgerald said.
“It will be difficult for a human in Australia to own the output of an AI system given Australia's copyright laws which require a human author.
“In this context, consideration is also being given to the terms and conditions of use as many generative AI tools purport to assign ownership of the output to the user (subject to compliance with the terms) but then also reserve the right to create identical or similar outputs for other users.”
Bartier Perry partner Jason Sprague said some of the IP questions swirling around AI are more easily answered than others.
“Do the AI companies have the right to use the information they’re using to train [the tools]? Do they own the creations of the AI? …What rights does the end user have to the work that’s been generated?
“If you’re an end user, read the terms of use, it’s as simple as that, because AI companies are saying ‘well, I own it’,” he said.
“If you want to say otherwise… do you want to go and take that to court? That’s the uncertain piece.
“Do we want AI companies to control all the works generated by AI, which can generate works far more quickly than human beings can do it?...Will it stifle human endeavour?”
The federal government recently flagged possible regulatory action in the field, announcing a consultation process and committing millions to assist businesses with the rollout of AI technologies.
However, a discussion paper on ‘safe and responsible AI’ released in June makes only passing mention of the technology's implications in the field of intellectual property.
According to the discussion paper, IP Australia’s policy team has formed an ‘AI Working Group’ that is “exploring issues at the intersection of AI and IP”, including whether the government “could consider any potential changes”.
An IP Australia spokesperson told Lawyerly that government agencies forming part of the group – including the Attorney-General’s Department, DFAT, DISR and the DTA – were looking at a broad range of issues “including in relation to both AI inputs and outputs,” but that there is “currently no firm timeline for any law reform or stakeholder consultation process coming out of the working group’s collaborations.”
“However, some individual agencies are working on AI-related issues specific to particular IP rights,” the spokesperson said.
“For example, IP Australia has recently concluded a 12-week sprint exploring the possible implications of generative AI for registered IP rights, and will soon be publishing some provocations to stimulate discussion on the issues.
“In addition, the implications of AI for copyright law were among matters raised at the inaugural Ministerial Roundtable on Copyright, and will be discussed further at an additional roundtable to be hosted by the Attorney-General’s Department in the coming months.”
As test cases in the UK and US begin to pile up against generative AI companies including Stability AI, Midjourney and OpenAI, the question is “who will blink first”, Sprague said.
“The courts I think don’t have the luxury of sitting back and waiting,” he said.
“Historically, the legislature has been less keen to take the lead on these things, because they might be concerned they’ll get it wrong, and it’s often then hard to reverse it – they’re looking for guidance from the judicial system about how to apply some of these things.
“But it’s still very much front of mind, there’s a need to deal with it.”
Fitzgerald agreed that litigators will beat regulators to the punch. She predicted “the training of AI systems and the rights to use the training data will be the first area of litigation” in Australia.
“I think there will be a series of test cases regarding AI systems [in Australia] before the IP aspects of AI systems are regulated,” she said.
She pointed to the recent test cases in Australia and other countries over whether AI can be named as an inventor on a patent application.
In November last year, the High Court declined to enter the fray and rule on whether an inventor must be a natural person under the Patents Act, letting stand a Full Court judgment that overturned a landmark victory for AI pioneer Dr Stephen Thaler.
The High Court rejected Thaler’s special leave application, which asked it to overturn a unanimous judgment by a five-judge panel of the Full Federal Court. That judgment rejected findings by Justice Jonathan Beach that DABUS, an AI created by Thaler, could be listed as the inventor on a patent application for inventions that use fractals to create interlocking containers and attention-grabbing light patterns.
While the US Court of Appeals for the Federal Circuit declined Thaler’s application for a rehearing en banc in October, the UK’s highest court agreed in September to hear an appeal by Thaler.
'Piracy on an unprecedented scale'
Overseas, AI company Stability AI has been hit with multiple suits, including copyright infringement cases in the US and UK brought by stock photo giant Getty Images, which accuse Stability of scraping millions of its photos without a licence to train its Stable Diffusion image generator.
Meanwhile, a class action filed in California in January on behalf of artists alleges that Stability, without the artists’ consent, “downloaded or otherwise acquired copies of billions of copyrighted images without permission to create Stable Diffusion”. The suit argues that “AI image generators are 21st-century collage tools that violate the rights of millions of artists”.
“When used to produce images from prompts by its users, Stable Diffusion uses the training images to produce seemingly new images through a mathematical software process. These 'new' images are based entirely on the training images and are derivative works of the particular images Stable Diffusion draws from when assembling a given output,” the suit says.
“These resulting derived images compete in the marketplace with the original images.
"Until now, when a purchaser seeks a new image 'in the style' of a given artist, they must pay to commission or license an original image from that artist. Now, those purchasers can use the artist’s works contained in Stable Diffusion along with the artist’s name to generate new works in the artist’s style without compensating the artist at all.”
The class action also targets Midjourney and DeviantArt, which operate their own image generators.
Another class action, filed against Microsoft and OpenAI, the developer of ChatGPT and DALL-E, alleges their AI coding tools – GitHub Copilot and OpenAI Codex – are “accomplishing software piracy on an unprecedented scale”.
Fitzgerald and Sprague noted that while the outcomes of these overseas test cases may be informative, Australian IP experts will be considering them in light of key legal differences between jurisdictions.
“In particular, in the US, there is a broad fair use defence and the UK has a database right and protections for computer-generated works,” Fitzgerald said.
“The legislative differences mean overseas decisions must be considered with caution as an Australian court may reach a different result based on the same facts.”
Sprague said overseas test cases could spark legislative changes, but noted that while issues raised in the US cases could inspire similar litigation in Australia, the relevant law is well-settled here.
“If there becomes an impetus for change where we look at these other jurisdictions which we’re very similar to, the question will be – Do we need to adapt? Does the law need to catch up?
“Information that is being accessed or scraped is still subject to standard intellectual property laws we’ve dealt with previously…the US cases are all about copyright infringement because they’ve scraped images without permission – that’s not new law.”
Sprague further noted that while AI companies could get themselves into hot water by using images and other works to train AI without a licence, copyright does not protect an artist’s style – only an actual work.
“Copyright is the embodiment of the work itself, just because you’ve imitated a certain style doesn’t mean you’ll breach copyright.”
Could AI inspire a change to authorship laws?
Sprague said there was “some merit” to the idea that AI-generated works should belong to the AI companies – as many of them profess in their terms and conditions of use – but that “the question is whether the work itself will have the essential elements to get copyright protection.”
“One thing you might look at is a change in position about authorship – UK legislation recognises that the author of a computer-generated work is the person who has put in the commands and all things required to generate the work…we haven’t done that here in Australia yet.
“Ultimately it still comes back to that human element…the whole notion of copyright is tied to the creation of something with your hands – whether you write it, paint it, sculpt it or build it, there’s an inherent connection to that human process of creation.
“AI takes it beyond that, it’s almost like commissioning a work…if you commission an artwork, the artist is still the author of the work.”
Sprague said we’re unlikely to see litigation seeking to enforce rights in an AI-created work in Australia anytime soon.
"Our law is very clear that unless you are a human being, there’s no copyright in the works….whether someone will challenge that, we’ll wait and see.”
Sprague said arguments about whether end users should receive some kind of inferred licence rights to use the works generated from their prompts would involve a question of “how much direction must be given” to the AI.
Legal developments will have to strike a balance between the “public interests of all” who benefit from the use of generative AI against the “private rights of individuals to not have their work copied,” he said.
“Clearly, as with the internet, [AI] has become an integral piece of how the world functions – we couldn’t have cases running every day,” he said.
“Do you stifle creativity if you don’t protect the rights of the individual enough? There’s always going to be that sort of competing interests, and that’s been around since copyright first existed…the amount of information that can be copied, that’s the major issue here.”
While legal questions continue to swirl, Sprague said we haven’t yet seen a “fundamental shift that requires immediate action by the legislature.”
“Why do we really need to change something unless it’s for greater public benefit? Maybe we need to relax some laws [around AI] like when the internet came on board,” he said.
“We’re going to use AI, it’s very useful…it’s here to stay, and we have to work out how best we can manage it and handle it.”