Using AI to improve the administration of justice

4th November 2024

'Using AI in the Justice System' was a well-received session on Day 2 of the Society’s 2024 Annual Conference.

Paul Mosson, Executive Director of Member Services & Engagement at the Society, introduced Ellen Lefley, senior lawyer at JUSTICE, who provided an insightful overview of the complexities and implications of integrating AI into legal and judicial processes.

The session outlined both the opportunities and the risks presented by AI, particularly in enhancing access to justice and the impartiality of decision-making, and covered the following questions:

  1. What are the key opportunities for using AI in the justice system?
  2. How do we navigate the risks? Do we need new frameworks to help us? Whose responsibility is it?
  3. How important is the human element in decision-making? Could or should decision makers in the justice system (e.g. police, judges or juries) be replaced by AI?

JUSTICE's mission

JUSTICE is an independent law reform charity, founded in England and Wales in 1957, with its Scotland branch established in 2012.

JUSTICE operates as an independent body advising policymakers within the judiciary and the Ministry of Justice, advocating for a system that upholds the rule of law and human rights. The organisation's work spans reactive measures, such as responding to government consultations and legislative reviews, and proactive projects aimed at identifying and promoting necessary reforms. Ellen highlighted that, “it’s within this proactive stream of work that we’re working on artificial intelligence, human rights and the law”. The current focus on AI forms part of the ongoing commitment to ensure the justice system evolves in a manner that benefits society comprehensively.

The role of technology in justice

Technology's influence on the justice system is not new. Ellen commented that JUSTICE "has been engaged in questions about video technology and the way in which that could improve how our justice system does what it does" for some time. As many will remember, during the COVID-19 pandemic video technology was scrutinised for its potential to improve procedural efficiency while posing new risks, such as digital exclusion. The AI work stream, initiated in 2024, seeks to build on this by exploring both the potential and the perils of AI applications in legal contexts.

You can read more about JUSTICE's AI, human rights and the law guide here.

Defining AI

Whilst not a computer scientist herself, Ellen highlighted early in her presentation that if attendees were to take away one thing from her session, it should be "that there is no agreed upon, authoritative, ultimate and singular definition of AI".

However, the Organisation for Economic Co-operation and Development (OECD) defines it as “a machine-based system that, for explicit or implicit objectives, infers from the input it receives how to generate outputs such as predictions, content recommendations, or decisions that can influence physical or virtual environments”. The UK government’s 2023 consultation also emphasised ‘autonomy’ and ‘adaptiveness’ as essential features, but Ellen noted, “the adaptivity of AI can make it difficult to explain... the intent or logic of the system”.

Key opportunities in AI for justice

Ellen’s discussion highlighted three primary areas where AI could positively impact the justice system:

  1. Equal and Effective Access to Justice: The current system struggles with accessibility, notably due to austerity measures that have limited public funding. “There are examples of AI chatbots... delivering legal advice to those who can’t afford a lawyer”. High translation costs and complex procedures further disadvantage individuals, particularly litigants in person. AI tools such as chatbots powered by large language models (LLMs) and automatic translation services could ameliorate these challenges by providing more accessible legal advice and reducing procedural barriers.
  2. Independent, Impartial, and Competent Decision-Making: Existing disparities in criminal justice outcomes, particularly affecting Black and ethnic minority groups, underscore the need for unbiased judicial processes. Ellen mentioned, “AI might be able to improve the training and make training more personalised for decision makers”. However, it must be ensured that AI systems operate effectively and do not entrench existing biases.
  3. Openness to Scrutiny and Public Engagement: The current system's opacity is marked by unpublished judgments and limited public access to case details. AI could facilitate automatic transcription and data analysis, increasing transparency and enabling better scrutiny. “AI can help demystify everything that goes on in our justice system”. This would help ensure that justice is not only done but seen to be done.

Notable risks

Ellen emphasised that while AI offers significant promise, it carries inherent risks that must be navigated to prevent undermining the justice system's goals. "Will this make things worse rather than better?" is a question JUSTICE asks when assessing new AI initiatives. Examples of the risks to consider include:

  1. Accuracy and reliability: AI-powered legal advice, while potentially beneficial, could disseminate incorrect or misleading information. The risks are especially pronounced if users, particularly litigants in person, lack the expertise to verify AI outputs. “If the user... has fewer resources on which they can pull to assess that accuracy... that’s when we get into high-risk scenarios”.
  2. Bias in decision-making: The example of the COMPAS system in the United States, used for sentencing guidance, illustrated how AI can reinforce discriminatory outcomes due to biases embedded in training data. “The lack of impartiality... was just data-derived rather than explicitly human-derived, because there were societal biases embedded in it”. This highlighted that AI, if not carefully designed and regulated, could replicate and magnify societal biases.
  3. Privacy concerns: The use of technologies such as facial recognition, already implemented by police forces in South Wales and London, raises significant privacy issues. “There’s definitely a risk to privacy rights” that needs to be carefully weighed.

Mitigation strategies

To address these risks, JUSTICE proposes a tripartite framework for evaluating AI's role in the justice system. It is important to note that the framework is currently a draft, but it will hopefully be put to policymakers in due course. It currently includes:

  1. Outcome-focused approach: Policymakers should start with clearly defined objectives, aligning AI applications with the broader goals of ensuring equal access, impartial decision-making, and transparency.
  2. Risk assessment: Each AI initiative should be rigorously assessed for potential risks to these objectives. This includes evaluating the severity and likelihood of adverse impacts, along with appropriate safeguards.
  3. Responsible adaptation: Policymakers must be prepared to halt AI applications that present unacceptable risks. “Any response to AI innovation must be willing to stop... and say no” when necessary. A “tick-box” approach to risk assessment is insufficient; genuine willingness to forgo AI when necessary is essential to maintain public trust and uphold the justice system's integrity.

Broader reflections and engagement

Attendees at the session shared their examples of how they are currently utilising AI, noting applications such as diary management for improved efficiency. Questions were understandably raised about the use of tools like ChatGPT, especially in terms of confidentiality and data security, reflecting ongoing concerns about how legal professionals can leverage AI responsibly.

Ellen’s discussion reinforced that while AI holds great potential to revolutionise the justice system, it must be approached with caution and responsibility. “The key is ensuring that any technological integration upholds the rule of law and enhances the protection of human rights,” she stated. By adopting a thoughtful and measured approach, the legal community can harness AI's strengths while mitigating its risks, paving the way for a more accessible, fair, and transparent justice system. The conversation around AI and justice is far from over, and continued dialogue and vigilance will be essential in shaping a future where technology serves as an ally, not an obstacle, to justice.

You can find out more about JUSTICE and the Scottish branch here.

The Society, in partnership with Wordsmith (the 2024 Annual Conference main sponsor), recently launched its AI Guide, which you can access here.

Written by Rebecca Morgan, Editor of the Journal, Law Society of Scotland

