While AI technology brings immense benefits to society, there are circumstances where AI-driven products or services fail. In these circumstances, it is important to consider issues of liability, particularly given the rapid expansion in the use of AI since the onset of the COVID-19 pandemic.
In Australia, the government has given some attention to liability for specific AI applications, such as driverless vehicles. In 2017, the Standing Committee on Industry, Innovation, Science and Resources completed its Inquiry into the Social Issues Relating to Land-Based Driverless Vehicles in Australia, which discussed the uncertainty of legal responsibility and insurance in the case of car accidents involving some degree of automation. The Committee recognised that the introduction of driverless vehicles may require a change ‘in the way vehicles are insured and in the current understanding of legal liability’.
Developments are also occurring in relation to AI in the area of intellectual property protection in Australia. For example, there are questions as to whether AI creations can be protected under copyright, since a work must be original and originate from an author, and Australian law requires that author to be human.* This contrasts with the position in the UK, where computer-generated works are protected even if there is no human creator.
On appeal in Commissioner of Patents v Thaler [2022] FCAFC 62, the Full Court of the Federal Court of Australia unanimously found that an AI machine cannot be named as an inventor in a patent application, and that the inventor listed in a patent application must be a natural person. The High Court of Australia subsequently refused Dr Stephen Thaler’s application for special leave to appeal this decision.
While liability has been explored in the context of automated vehicles, there is no general framework for AI liability. Liability for AI may therefore arise under tort, product liability, or contract law. In relation to tortious liability, the range of potentially liable parties could be extensive and might include, for example, the manufacturer, operator, creator, or owner of the AI.
In many cases, liability for AI products and services is likely to be straightforward. However, more sophisticated uses of AI could challenge existing liability regimes, and different forms of liability for AI may need to be considered. Some proposals suggest conferring legal personhood or personality on AI, or introducing strict liability regimes for AI.
*IceTV Pty Ltd v Nine Network Australia Pty Ltd (2009) 239 CLR 458
Given the uncertainties around liability for AI products and services, this is an area that is likely to attract more attention in coming years. As AI products and services develop and become more complex, traditional liability regimes are likely to be challenged, and businesses should stay alert to potential changes in this area.
*Information is accurate up to 27 November 2023