Legal decisions and AI—are judges really that predictable?
Published on: 01 December 2016
Published by: LexisPSL
  • An AI project recently predicted the outcomes of hundreds of European Court of Human Rights cases with an accuracy of 79%. How was the AI project able to achieve this accuracy?
  • Is there anything particular about human rights cases that allowed the AI software to make predictions with such accuracy? Are these cases, for example, simpler or more complex than other kinds of cases?
  • The authors of this experiment suggest that law firms are increasingly turning to AI to wade through vast amounts of legal data. What does this tell us about the role of AI and automation more generally in the future of the legal system? Can you imagine, for example, AI representation, AI juries or even AI judges?
  • What are the advantages of the increasing role of automation in the legal system?
  • Are there any downsides to a more extensive role for AI and automation within the legal system?
  • Are there any other developments that practitioners interested in the future role of automation should be aware of?

Article summary

IP&IT analysis: A study recently found that artificial intelligence (AI) software can predict almost 80% of outcomes in human rights cases. Joanne Frears, partner at Blandy & Blandy, considers the implications of this study and the rising use of AI within law more generally, with contributions from Fern Tawera, an LLM (human rights) student currently writing a thesis on AI and human rights.