Artificial intelligence technologies within the justice system can have serious implications for human rights and civil liberties, and may not be properly understood by those using them, a House of Lords committee warns today.

In its report Technology rules? The advent of new technologies in the justice system, the Justice and Home Affairs Committee explores the use of AI and other advanced algorithmic tools to discover, deter, rehabilitate, or punish people who break the law in England and Wales. The report acknowledges the potential of these technologies but warns against the pace of their deployment and the absence of appropriate safeguards.

The peers said they were taken aback by the proliferation of AI tools being used without proper oversight, particularly by police forces across the country.

Although it regards informed scrutiny as essential, the committee instead uncovered a landscape in which new technologies are "developing at a pace that public awareness, Government and legislation have not kept up with – a new Wild West".

It describes the market as "worryingly opaque", noting: "We were told that public bodies often do not know much about the systems they are buying and will be implementing, due to the seller’s insistence on commercial confidentiality — despite the fact that many of these systems will be harvesting, and relying on, data from the general public.

"This is particularly concerning in light of evidence we heard of dubious selling practices and claims made by vendors as to their products’ effectiveness which are often untested and unproven."

Committee chair Baroness Hamwee commented: "We had a strong impression that these new tools are being used without questioning whether they always produce a justified outcome. Is 'the computer' always right? It was different technology, but look at what happened to hundreds of Post Office managers."

Launching the report, she asked: "What would it be like to be convicted and imprisoned on the basis of AI which you don’t understand and which you can’t challenge?"

In addition, although AI offers a "huge opportunity" to better prevent crime, there is also a risk it could exacerbate discrimination. The committee heard repeated concerns about the dangers of human bias contained in the original data being reflected, and further embedded, in decisions made by algorithms.

In order to achieve better scrutiny, the report calls for the establishment of a mandatory central register of algorithms used by the police and in the justice system. It also recommends the introduction of a duty of candour on the police.

To clarify governance and guarantee the integrity of the technologies being deployed, it calls for a national body to be established to set strict scientific, validity, and quality standards and to "kitemark" new technological solutions against those standards.

And to secure the good use and close monitoring of new technologies, it recommends the mandatory training of technology users and the development of local ethics committees within police forces.

The committee argues that, as the use of new technologies is becoming routine, these proposed reforms will maximise their potential while minimising the associated risks. They would reverse the status quo in which a culture of deference towards new technologies means the benefits are being minimised, and the risks maximised.

Calling on the Government to "take control" and bring forward legislation to establish clear principles that provide a basis for more detailed regulation, Baroness Hamwee added: "Without proper safeguards, advanced technologies may affect human rights, undermine the fairness of trials, worsen inequalities and weaken the rule of law. The tools available must be fit for purpose, and not be used unchecked."

She concluded: "We welcome the advantages AI can bring to our justice system, but not without adequate oversight. Humans must be the ultimate decision makers, knowing how to question the tools they are using and how to challenge their outcomes."
