Improving productivity in policing through technology and digital skills can create the equivalent of 30,000 more police officers and staff, and could free up as many as 60 million policing hours over a five-year period, the Policing productivity review has claimed.
Concluded on 30 September 2024, the review was commissioned by the Home Office in summer 2022 to identify how police forces in the UK can boost their productivity and improve outcomes.
Led by Alan Pughsley QPM, a former chief constable at Kent Police, a significant portion of the independent review focuses on how police can better leverage technology to improve criminal justice outcomes, free up time for both frontline officers and administrative staff, and boost dwindling trust in policing as an institution.
“Policing productivity matters because it means getting the best possible service from the resources available,” he said. “It’s about having more officers on the street, more officers and staff responding to incidents, and investigating crime, all of which mean better outcomes for the public.
“The review team has spent time with officers and staff across the country, exploring the challenges they face and seeing examples of good practice in action that could have a huge impact if adopted more widely,” said Pughsley. “The size of the potential savings may raise eyebrows, but our recommendations are practical and grounded in the realities of policing.”
While the three-phase review has now officially concluded, the only material to be published so far is a report based on work from the first phase, which focused on identifying barriers to police productivity.
These barriers include intense budget pressures, a lack of digital skills among the workforce, poor data quality, consistency and sharing, and the patchy pooling of resources and efforts that mean innovation is not being diffused throughout the UK’s federated policing system.
Data issues
On the data issues, for example, it noted the ways in which data are captured, managed, shared and used can vary greatly from force to force; that there is a lack consistency in how data is defined and interpreted; and that forces are largely unable to link their disparately held data.
However, the first-phase report also outlined where the review team believes technology has already been deployed to achieve a range of operational and administrative policing outcomes.
This includes Bedfordshire’s use of an AI-powered auto-redaction tool to remove content from files being sent to the CPS, which is claimed to have achieved an 80% saving in time efficiency; Dyfed-Powys, Leicestershire and Sussex police using video calling to attend low-risk 999 calls; and the increasing use of retrospective facial recognition as a capability in the police national database (PND), which it noted “underpins results in over 100 cases per month in South Wales Police alone”.
Police legitimacy
The review further noted that outside of improving productivity and outcomes, data and technology can also be used as a way of clawing back the diminishing legitimacy of policing. However, it also warned that rolling out new technologies without meaningful public engagement could reduce legitimacy even further.
It highlighted, for example, that “low legitimacy and levels of trust” are impacting police effectiveness by making it difficult for officers to obtain basic information such as witness statements (which in turn means officers have to work longer and harder to get results, decreasing overall productivity).
“The Peelian principle of policing by consent places an important requirement on policing’s adoption and use of emerging technologies,” it said. “Policing has a duty to demonstrate and explain to the public what a technology is doing, and that its use is proportionate, lawful, accountable and necessary. The Home Office also has an important role to play in providing a framework to enable adoption of new technologies. This context has a significant bearing on policing productivity because inappropriate use, or a failure to use technology when appropriate, can have a deleterious effect on justice and legitimacy.”
Giving the example of retrospective facial recognition use, the report noted the legal landscape underpinning its use “is delivered through a ‘tapestry’ approach which includes primary legislation, codes of practice and local policy, whose complexity can raise legitimacy concerns”.
It added that a lack of clarity on the use of various technologies and their proposed use or governance “may be compounded by low levels of trust in the force”, and that “an independent national ethics function is required” to gain public confidence in police technology.
“While the factors that contribute to police legitimacy and public trust are multi-faceted, greater engagement, communications with the public on how technology is to be used, and the ethical considerations that have informed its deployment will help to build this trust,” it said.
The review added that a confluence of factors – including the current economic context, declining public trust and rapid technological change – means “inaction, muddling through or making incremental tweaks carries far more risk to police legitimacy and productivity than can be afforded.”
Future investment and legislative change
However, the review said that for police to harness the full range of benefits of data and technology, there needs to be a revamp in digital skills, investment and coordination.
“Innovation is poorly shared across the sector,” it said. “Forces’ efforts and use of resources overlap or duplicate unnecessarily. Many investments made by forces are not as clearly baselined, measured, identified or evaluated as they should be.
“Patchy evaluations mean that innovations (operational or structural) remain underexploited. This impacts the sustainability of these pilots or investments, weakens future resourcing bids, and importantly it leaves funders, oversight bodies and partners unclear as to the value provided to citizens.”
It added that to keep pace with technological change, police also need people with skills that are much in demand: “This is reliant on forces being able to either fund internal specialist posts, train people in-house (who get “regularly ‘head hunted’ for commercial roles”) or bring in external specialists.”
Pace of innovation
Elsewhere, the review said the pace of innovation and the complexity of new technologies means legislation, authorised professional practice or guidance often lag behind. “Forces therefore lack the provision of a clear framework within which to operate some innovations,” it said. “In their absence, the legal landscape is fragmented and includes primary legislation, codes of practice and local policy. This is challenging for forces and the public to fully comprehend, which hinders deployment or fuels legitimacy concerns.”
Owen Sayers, an independent security consultant and enterprise architect with over 25 years of experience in delivering secure solutions to policing, called this position “absolute tosh”, noting the legal framework for policing (contained in Part Three of the Data Protection Act 2018) is clear, unambiguous and fully equipped for all police technology use.
“Suggesting that technology outpaces legislation and it is therefore legislation that must be ignored or worked around is a dangerous step towards digital anarchy, and is the favoured soundbite of privacy-breaking technologists the world over,” he said.
“Laws exist for good reason, adherence to those laws is vital for the operation of safe societies, and the Police (of all public sector bodies) should be the first to recognise this,” said Sayers. “If we need to abandon or ignore UK laws to deploy the police’s favoured technology, this principally shows they’ve picked the wrong tech; not that the laws are wrong.”
Sayers added that the only issue with the UK’s police-specific data protection legislation is that it “forms a valid constraint to the adoption of policing’s preferred technologies – and specifically to the engagement of a small number of cloud providers – with whom the police wish to work and have an altogether too cosy relationship with”.
On the use of auto-redaction technologies, he said the report has “ignored” a number of key issues, including that automated redaction of personal data “struggles when put against the section 49 rights for a data subject against automated “significant decision-making”, and that most of the solutions are run on public cloud platforms: “These platforms are illegal for police to process law enforcement data on – yet the report does not mention this. It’s effectively endorsing illegal data processing practice. How does this improve police efficiency and public trust?”
Computer Weekly contacted the Home Office about the issues identified by Sayers, but received no on-the-record response.
However, in its official response to the first-phase report, the government said it would create a new Centre for Police Productivity by Autumn 2024 based in the College of Policing, which “will set the foundations necessary for policing to deliver the 38 million police officer hours identified by the independent review”.
It added: “This Centre will inclrude a new Policing Data Hub to support police forces’ use of data and ensure they can deploy and get the benefits from new technology, including AI. It will also deliver new model processes into policing trialled during this review. Adopting ‘what works’ throughmodel processes will mean better outcomes for the public at less cost.”
Comentarios recientes