This chapter points out the significant challenges in holding foundation model developers and deployers clearly responsible for the uses and outputs of their creations under US law. Scienter requirements and difficulties of proof make it challenging to establish liability under many statutes with civil penalties and under tort law. Constitutional protections for speech may shield model-generated outputs, or the models themselves, from some forms of regulation, though legal scholars are divided over the extent of these protections. And legal challenges to agencies’ authority over AI systems could hamstring regulators’ ability to proactively address foundation models’ risks. All is not lost, though: each of these doctrines offers potential pathways to liability and recourse. In all cases, however, there will likely be protracted battles over liability involving the issues described in this chapter.
In this chapter, the legal scholar Christine Wendehorst analyses the different potential risks posed by AI under two main categories: safety risks and fundamental rights risks. On this basis, she considers why AI challenges existing liability regimes, spells out the main solutions put forward so far, and evaluates them. The chapter highlights the fact that liability for fundamental rights risks, while AI-specific, remains largely uncharted. Such risks are now being addressed at the level of AI safety law, by way of prohibiting certain AI practices and by imposing strict legal requirements concerning data governance, transparency, and human oversight. Wendehorst argues, however, that a number of changes must be made if the emerging AI safety regime is to serve as a ‘backbone’ for a future AI liability regime that helps address liability for fundamental rights risks. She therefore suggests that further negotiations on the AI Act proposed by the European Commission should be closely aligned with the preparatory work on a future AI liability regime.
Since the second AI revolution began around 2009, society has witnessed new and steadily more impressive applications of AI. The growth of practical applications has in turn produced a growing body of literature devoted to the liability issues raised by AI. The present text attempts to assess the current state of the law and to advance the discussion.
The rapid development of robotics and intelligent systems raises the question of how to adapt the legal framework to accidents arising from devices based on artificial intelligence (AI) and machine learning. In light of the numerous legal studies published in recent years, it is clear that ‘tort law and AI’ has become one of the hot topics of legal scholarship in both national and comparative contexts.
Tort law offers a number of systems and structures that can be used to address the challenges posed by AI technologies. It will not be necessary to significantly alter our understanding of tort law’s foundations to be ready for AI, although tort doctrine may nonetheless significantly affect AI innovation and utilization.