Find out how AI technology is being used in policing today to identify patterns and provide faster, more accurate insight to support the reduction of crime.
Just a few years ago, artificial intelligence (AI) was confined to the realms of science fiction, but that is no longer the case. AI is having a profound impact on many sectors of industry by taking on tasks which the human brain is not designed to do.
Opportunities for AI in policing are evolving at a rapid pace – fortunately not in the form of RoboCop or The Terminator but as a way to help human officers make better informed decisions.
AI can support policing by sifting through large volumes of data, identifying patterns and enhancing human decision-making rather than replacing it.
We’ve looked at some key areas where AI is already adding value by informing decisions and mitigating risk.
Police forces spend valuable time and resources processing evidence – time which could be better spent on the investigations themselves.
AI can offer a new dimension to these investigations by spotting patterns based on data from past cases and the circumstances around them. By making connections between these factors, the technology can provide police officers with additional insight on how to proceed with a case.
Kent Police has been trialling the Evidence Based Investigation Tool (EBIT), an algorithm which produces a probability score of a crime’s solvability. This information is factored into the overall decision-making process around a case. The tool has already saved hours of police time.
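EBIT's actual model and features have not been published, but the idea of a solvability probability can be sketched as a simple logistic score over case features. Everything below – the feature names, weights and bias – is purely illustrative, not taken from EBIT.

```python
import math

# Hypothetical illustration only: EBIT's real model and features are not public.
# A solvability score can be sketched as a logistic function of case features.
WEIGHTS = {
    "cctv_available": 1.4,     # illustrative weights, not from EBIT
    "witness_present": 1.1,
    "suspect_named": 2.0,
    "forensic_evidence": 1.6,
}
BIAS = -2.5  # baseline: most cases start with a low solvability score

def solvability_score(case: dict) -> float:
    """Return a probability (0-1) that the case is solvable."""
    z = BIAS + sum(w for k, w in WEIGHTS.items() if case.get(k))
    return 1 / (1 + math.exp(-z))

case = {"cctv_available": True, "suspect_named": True}
print(round(solvability_score(case), 2))  # -> 0.71
```

In a real system the weights would be learned from historical case outcomes rather than set by hand; the point is only that the output is a probability fed into, not substituted for, an officer's decision.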
The success of tools such as EBIT relies on good quality data, but data capture can be a challenge in a busy control room or when police officers are at the scene of a crime. That’s where innovations like language processing – voice to text and video to text – along with police body worn cameras come into play.
With good quality data on previous cases, AI enables police forces to take better informed courses of action on their current caseloads.
AI can also be effective in guiding police officers on how to deal with an offender once they have been arrested. There is a lot of legwork involved in considering whether to grant pre-charge bail: assessing how likely a person is to reoffend, whether they will attempt to contact witnesses and whether they pose a threat to the community.
Police officers gathering information on a suspect often have to trawl through paper files or scroll through databases. AI can sift through vital information much more quickly while making connections from the data to predict if someone is at risk of doing harm if they are released.
Durham Constabulary has been testing the Harm Assessment Risk Tool (Hart) to inform pre-charge decisions. The program takes five years’ worth of historical data on people taken into custody in Durham and makes predictions based on 34 different metrics including previous offence history, age and postcode of the offender. Using machine learning, the tool identifies whether a suspect has a low, moderate or high risk of reoffending.
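Hart itself is built on five years of custody data and 34 metrics; as a toy illustration of the underlying idea, a risk band can be predicted by comparing a new suspect to the most similar historical record. The records, metrics and distance weighting below are all made up for the sketch.

```python
# Illustrative sketch only: Hart uses far richer data and a more sophisticated
# machine-learning model. This toy 1-nearest-neighbour version just shows the
# idea of predicting a risk band from past outcomes.
HISTORY = [  # (age, prior_offences) -> observed risk band (made-up records)
    ((19, 6), "high"), ((45, 0), "low"), ((23, 2), "moderate"),
    ((31, 1), "low"), ((20, 4), "high"), ((28, 3), "moderate"),
]

def predict_risk(age: int, priors: int) -> str:
    """Classify by the single closest historical record."""
    def dist(rec):
        (a, p), _band = rec
        # Weight prior offences more heavily than age (arbitrary choice here).
        return (a - age) ** 2 + ((p - priors) * 5) ** 2
    return min(HISTORY, key=dist)[1]

print(predict_risk(age=21, priors=5))  # nearest record is (20, 4) -> "high"
```

As the article notes, the output is advisory: it adds one signal to an officer's existing risk assessment rather than making the decision.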
The constabulary does not rely on the algorithm to make the decisions, instead officers use it to gain additional insight into their existing risk assessments.
Offender behaviour programmes are an effective way to reduce offending. However, the success of community justice hinges on keeping people engaged in the schemes and ensuring they do not drop out.
It’s not always easy to predict who is likely to engage with a programme and who is not, but AI can detect patterns which might identify someone who is at risk of quitting a course. Data could come from a range of sources, such as reasons given for non-attendance, relationships with others on the course and feedback from the course instructor on how much a person contributes.
Lack of engagement in an offender management course could even come down to something as basic as the course schedule. Sessions might cause childcare issues, for instance, or clash with shift patterns. AI can help to optimise programme timetables, making it as easy as possible for someone to attend a course without disrupting other aspects of their life.
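At its simplest, timetable optimisation of this kind means choosing the session slot that conflicts with the fewest attendees' known commitments. The slots and availability data below are invented for illustration.

```python
# Hypothetical sketch: pick the session slot that clashes with the fewest
# attendees' known commitments (shift patterns, childcare hours, etc.).
CANDIDATE_SLOTS = ["Mon 10:00", "Wed 14:00", "Sat 10:00"]

UNAVAILABLE = {  # made-up data: slots each attendee cannot make
    "attendee_1": {"Mon 10:00"},               # morning shift
    "attendee_2": {"Mon 10:00", "Wed 14:00"},  # school pick-up clash
    "attendee_3": {"Sat 10:00"},
}

def best_slot(slots, unavailable):
    """Return the slot that excludes the fewest attendees."""
    return min(slots, key=lambda s: sum(s in u for u in unavailable.values()))

print(best_slot(CANDIDATE_SLOTS, UNAVAILABLE))  # -> "Wed 14:00"
```

Real scheduling systems would weigh many more constraints (venue, instructor availability, travel time), but the objective is the same: minimise the reasons for someone to drop out.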
It’s an approach which has been adopted by the National Institute of Justice in the US, which is trialling AI-based systems to help offenders re-enter society after a spell in prison. The systems can remind, encourage or warn offenders to attend scheduled sessions through their smartphones or wearable devices.
While AI opens up possibilities for targeting resources more effectively in law enforcement, it is important to avoid AI being perceived as a ‘black box’ process in which data is fed in and results churned out. Making the technology explainable – for example outlining why the AI has deemed a case solvable – helps to demystify the process.
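One simple way to avoid the 'black box' perception with a linear scoring model is to report each feature's contribution alongside the overall result, so an officer can see why a case was flagged. The feature names and weights below are hypothetical.

```python
# Illustrative only: making a simple linear scoring model explainable by
# listing each feature's contribution, largest first.
WEIGHTS = {"suspect_named": 2.0, "forensic_evidence": 1.6, "cctv_available": 1.4}

def explain(case: dict) -> list:
    """Return (feature, contribution) pairs, largest contribution first."""
    contribs = [(k, w) for k, w in WEIGHTS.items() if case.get(k)]
    return sorted(contribs, key=lambda kv: -kv[1])

for feature, weight in explain({"suspect_named": True, "cctv_available": True}):
    print(f"{feature}: +{weight}")
# suspect_named: +2.0
# cctv_available: +1.4
```

More complex models need dedicated explanation techniques, but the principle is the same: show the reasoning, not just the verdict.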
With so many exciting possibilities to enhance human decision-making in policing, the age of AI will see the very best of humanity and technology working together for a fairer and safer world.
Find out more about AI technology in policing
As AI technology becomes more widely used, we at NEC are taking a deeper look at its opportunities in policing.
Find out why Steve Ainsworth, executive director of Public Safety at NEC Software Solutions UK, believes that transparency is key when it comes to embedding artificial intelligence into operational policing.