Facial recognition software is already in common use. Director of Safety, Steve Ainsworth, explains how regulation will make it more effective.
Mention automatic facial recognition (AFR) and the words ‘Big Brother’ will often come back at you. Critics of the surveillance state point out the tension between individual privacy and public protection, and that’s part of the debate we should be having right now. AFR is already in use all over the world, in both the public and private sectors. So if the genie won’t go back in the bottle, we’d better learn to control it.
Technology has always set the pace and left legislation to catch up. With AFR, the pace of change is so great that we need to get ahead now and define the regulatory approach before it’s out of reach.
Balancing the public’s desire for privacy with their need for police forces to protect them has always been hard. Debates about CCTV, ANPR and stop and search haven’t gone away, even though they’ve proved their worth as public safety tools. So how should we talk about AFR?
I think a lack of regulation is holding us back. What’s the proper threshold for inclusion on a watch list? Should it hold images of those arrested but not charged? Should it keep images of people with spent convictions? Will images of law-abiding citizens stay on a police database? Without accurate and consistent answers to these questions, suspicion will grow.
In reality, monitoring and tracking happens everywhere we go. Footage captured on CCTV in shops, stations and city centres lasts far longer than the millisecond lifespan of a non-matched image in AFR. However, the long-standing protocols that govern CCTV use make it largely accepted as an important tool.
When it comes to automated facial recognition we need a transparent process and the right privacy tools: for example, blurring the faces around the ‘matched’ face of a person of interest, and ensuring that faces which don’t match anyone on a ‘watch list’ are deleted immediately rather than stored on a police database.
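To make that concrete, here is a minimal sketch of such a redaction step in Python, using OpenCV’s bundled Haar-cascade face detector. The embed helper, the 0.6 threshold and the precomputed watch-list embeddings are illustrative assumptions of mine, not any real AFR product’s design; a deployed system would use a trained face-embedding model and a calibrated operating point.

```python
import cv2
import numpy as np

MATCH_THRESHOLD = 0.6  # illustrative operating point, not a real standard

def embed(face_pixels: np.ndarray) -> np.ndarray:
    # Toy stand-in for a learned face-embedding model: downscale,
    # flatten and L2-normalise so dot products behave like cosine
    # similarity. Real systems use trained networks here.
    v = cv2.resize(face_pixels, (32, 32)).astype(np.float32).ravel()
    return v / (np.linalg.norm(v) + 1e-9)

def redact_frame(frame: np.ndarray, watchlist: list) -> np.ndarray:
    """Blur every detected face except those matching a watch-list entry.

    `watchlist` holds precomputed embeddings of watch-list images.
    Non-matched faces are blurred in place; their pixels and embeddings
    are never written anywhere, mirroring the immediate-deletion
    principle described above.
    """
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
        face = frame[y:y + h, x:x + w]
        best = max((float(np.dot(embed(face), e)) for e in watchlist),
                   default=0.0)
        if best < MATCH_THRESHOLD:
            # No watch-list match: blur the region and retain nothing.
            frame[y:y + h, x:x + w] = cv2.GaussianBlur(face, (51, 51), 0)
    return frame
```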
Bias in AFR remains a concern. The quality of the provided and captured images has improved, but we need a standardised approach to compiling, testing and curating the data sets that train the algorithms.
That’s why we need an ethical approach from the tech companies compiling the control data sets to ensure they are fully representative of gender and ethnicity. A recognised, uniform approach would go a long way to reducing the opportunity for bias.
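What might a recognised, uniform approach look like in practice? One building block is a representation audit run over every training set before use. The sketch below is a hypothetical illustration: the tolerance rule (flagging any group whose share falls below 80 per cent of an even split) is my own assumption, not an established standard.

```python
from collections import Counter

def audit_representation(group_labels, tolerance=0.8):
    """Flag demographic groups under-represented in a training set.

    `group_labels` carries one tag per training image (e.g. a
    self-reported gender or ethnicity category). A group is flagged
    when its share falls below `tolerance` times an even split across
    all groups (an illustrative rule, not a mandated standard).
    """
    counts = Counter(group_labels)
    even_share = 1.0 / len(counts)
    total = len(group_labels)
    return {group: n / total for group, n in counts.items()
            if n / total < tolerance * even_share}

# A set that skews heavily toward one group gets flagged:
print(audit_representation(["a"] * 80 + ["b"] * 15 + ["c"] * 5))
# -> {'b': 0.15, 'c': 0.05}
```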
You’ll often see the need for better accuracy quoted as a reason to pull back. However, this overlooks how flexibly the technology is applied. AFR systems are fine-tuned to meet different objectives. A counter-terrorism unit with intelligence on a known individual planning an attack on a specific area will set a very different threshold for what constitutes a match from a team looking for a vulnerable person. That’s because the technology isn’t there to outsmart officers; it’s there to help them narrow the field.
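As a rough sketch of what ‘fine-tuned to meet different objectives’ means: the same similarity score can trigger an alert in one operation and not in another, purely because the operating threshold differs. The objectives and values below are illustrative assumptions, not policy; which way a team moves the threshold depends on the relative cost of a missed match versus a false alert.

```python
# Illustrative operating points only; real deployments calibrate
# thresholds against measured false-match and false-non-match rates.
OPERATING_POINTS = {
    "urgent_threat": 0.55,       # a miss is catastrophic, tolerate false alerts
    "routine_watch_list": 0.85,  # minimise needless stops and reviews
}

def alert(similarity: float, objective: str) -> bool:
    """Return True if a candidate match should be surfaced to an officer.

    The system only narrows the field; acting on an alert remains a
    human decision.
    """
    return similarity >= OPERATING_POINTS[objective]

score = 0.72  # similarity between a probe image and one watch-list entry
print(alert(score, "urgent_threat"))       # True: above 0.55
print(alert(score, "routine_watch_list"))  # False: below 0.85
```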
The benefits for officers are potentially enormous. AFR can help in pre-planned operations, to support safeguarding or to provide a lead in time-critical situations. In New York, officers identified and arrested a suspect within hours of a major panic after a man entered the rail network with a suspect device.
The Crime and Security Research Institute at Cardiff University has suggested that AFR should be renamed ‘assisted’ facial recognition, because the decision to arrest subjects is made by humans, not machines. I have sympathy with that view. Technology didn’t double-check the match or dispatch resources; that was done by NYPD officers.
The potential to apply AFR to body-worn video could deliver huge benefits to society, helping officers to identify those who wish to cause harm as well as those at risk.
So is it possible to balance civil liberties with AFR? I think so. But we’ll need effective regulation and oversight – and ‘self-policing’ in the interim – to build and sustain public trust.