
House of Lords Report questions technology’s role in the justice system

On 30 March 2022, the House of Lords Justice and Home Affairs Committee published its report on the advent of new technology in the justice system.

The report, 'Technology rules? The advent of new technology in the justice system', highlights how public awareness, government and legislation are not keeping up with the new technology being developed, and that appropriate steps to ensure the technology is safe, necessary, proportionate and effective are not always taken. In addition, without sufficient safeguards, supervision and caution, advanced technologies used across the justice landscape could undermine human rights, risk the fairness of trials and damage the rule of law. This includes the use of facial recognition by police, and the use of data analytics and artificial intelligence (AI) by the law enforcement community.

The report acknowledges the benefits of advanced technologies for preventing crime, increasing efficiency and generating new insights that feed into the criminal justice system, but explores how new technological developments are moving at pace while controls have not kept up.

The Rt Hon Kit Malthouse MP, Minister for Crime and Policing at the Home Office and Ministry of Justice, highlights in the report his excitement about the use of artificial intelligence (AI) and machine learning in policing, with advanced tools providing substantial assistance towards “enacting the crucial duties of the police to protect and prevent harm”. He adds that new technologies, if used appropriately, have the potential to “increase trust in the rule of law”.

However, the Committee highlights its concern about the lack of mandatory training for users of AI tools, such as facial recognition, and the need for individuals using these tools to understand their limitations and to know how to question the tools and challenge their outcomes. Consistency in training is also vital.

The report also warns of the risk of exacerbating discrimination, with human bias contained in the original data being reflected, and further embedded, in algorithmic outcomes. It raises further concern about selling practices and about products whose effectiveness is often untested and unproven.

The Committee’s report has a number of recommendations. It is calling for:

  • the establishment of a mandatory register of algorithms used in relevant tools.
  • the introduction of a duty of candour on the police to ensure full transparency over their use of AI. AI can have huge impacts on people's lives, particularly those in marginalised communities. Without transparency, there can be no scrutiny and no accountability when things go wrong. 
  • the establishment of a proper governance structure with the ability to carry out regular inspections.
  • urgent streamlining and reforms to governance to be supported by a strong legal framework.
  • legislation to be introduced to “establish clear principles applicable to the use of new technologies, as the basis for detailed supporting regulation which should specify how these principles must be applied in practice”.

In addition, the report highlights that most public bodies lack the expertise and resources to carry out evaluations, and that procurement guidelines do not address their needs. It recommends that a national body be established to set strict scientific validity and quality standards and to certify new technological solutions against those standards. No tool should be introduced without first receiving certification, allowing police forces to procure the technological solutions of their choice among those ‘kitemarked’.

Improved multi-agency working and breaking down silos

With over 30 public bodies, programmes and initiatives playing a role in the governance of new technologies in the application of the law, the landscape can be confusing. The report suggests the system needs streamlining and that there needs to be coordination between Government departments, as roles are unclear, functions overlap and joint working is ‘patchy’.

The Committee decided to examine the use of these tools in order to highlight where change was needed, identifying some principles for the safe and ethical use of such tools. While its findings indicated this had largely been done – academia and civil society alike have produced plenty of work recommending worthy principles – ‘what not one has quite done yet is to suggest how these principles should be brought together and put into practice’.

In launching the report, Baroness Hamwee, Chair of the Justice and Home Affairs Committee, said:

“What would it be like to be convicted and imprisoned on the basis of AI which you don't understand and which you can't challenge?  

“Government must take control. Legislation to establish clear principles would provide a basis for more detailed regulation. A “kitemark” to certify quality and a register of algorithms used in relevant tools would give confidence to everyone – users and citizens. 

“We welcome the advantages AI can bring to our justice system, but not if there is no adequate oversight. Humans must be the ultimate decision makers, knowing how to question the tools they are using and how to challenge their outcome.” 

If you would like to read the report in full, it is available on the Justice and Home Affairs Committee's website.

Channel website: http://www.techuk.org/

Original article link: https://www.techuk.org/resource/house-of-lords-report-questions-technology-s-role-in-the-justice-system.html
