Formerly the preserve of Minority Report and dystopian novels, facial recognition technology (FRT) is now a burgeoning part of the tech industry, and one being trialled by police forces in the UK. But as news comes that San Francisco has become the first big city to ban the use of the technology, should we be worried?
The chances are, you’ve already encountered FRT. It’s there whenever Facebook suggests you tag a specific person in your photos. South Wales Police used it during the Champions League final in Cardiff in 2017. It’s the technology that powers FaceApp, the mobile app that lets users see what they will look like when they get old. The Metropolitan Police used it on shoppers in London in December 2018. In America, Taylor Swift used it at a concert to spot any of her hundreds of known stalkers. In Australia and New Zealand, there are billboards in shopping centres that use facial detection to record shoppers’ moods and reactions. There were even theories that social media’s #10YearChallenge could be used by Facebook to help train facial recognition software. (Facebook denied this.) Privacy concerns have also been raised about the Russian-owned FaceApp.
Now, police forces across the UK are trialling it. Using software called NeoFace, the cameras scan the faces of people who pass them and compare the images against a list of offenders wanted by the police. The potential benefits of the technology are clear. In a crowd of people, the cameras could spot suspects on police watch lists and help ensure their arrest. According to the BBC, there were three arrests on the first day of the trial alone. Think about the crimes it could prevent, or, in the case of potential terror suspects, the attacks.
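To make the idea concrete, here is a minimal sketch of watchlist-style face matching. NeoFace itself is proprietary, so this sketch uses the open-source face_recognition library instead, and the file names are placeholders; it illustrates the general technique, not what police systems actually run.

```python
# A minimal sketch of watchlist-style face matching, assuming the
# open-source face_recognition library. File names are placeholders.
import face_recognition

# Build the "watchlist": one 128-dimension encoding per known face.
watchlist_image = face_recognition.load_image_file("wanted_person.jpg")
watchlist_encodings = face_recognition.face_encodings(watchlist_image)

# Scan one frame from a camera feed and compare every face it contains.
frame = face_recognition.load_image_file("crowd_frame.jpg")
for face in face_recognition.face_encodings(frame):
    # tolerance is a distance threshold: lower means stricter matching,
    # with fewer false alarms but more missed faces.
    matches = face_recognition.compare_faces(
        watchlist_encodings, face, tolerance=0.6
    )
    if any(matches):
        print("Possible watchlist match flagged for human review")
```

That tolerance threshold is exactly the trade-off between catching suspects and falsely flagging innocent passers-by that the rest of this piece turns on.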
But it is facing opposition from those who argue that the use of the technology is eroding our civil liberties and turning the UK into a surveillance state (or Big Brother Britain, to quote one headline). With the benefits come serious implications around issues like privacy. Are we in danger of losing ours?
When it comes to surveillance, the UK already has more CCTV cameras per person than almost anywhere else in the world. Add in the new technology and there is the worry that people could be traced and identified wherever they go, eroding our fundamental right to privacy (Article 8 of the Human Rights Act enshrines the Right to a Private Life). The non-profit Big Brother Watch has been campaigning against the use of FRT for this reason, arguing it will take away our civil liberties, as its director, Silkie Carlo, wrote in a piece for Time.
If this sounds hyperbolic, it’s not. If you want a look at the possible future use of this technology, you only need to look to China, where it’s being used to monitor its citizens, tracking everything from crime, via FRT smart glasses worn by police, to shopping habits and travel. It’s even entering schools, to monitor children’s engagement.
We’re already seeing the possible implications here in the UK. The Metropolitan Police said that those covering their faces from the cameras in the trials wouldn’t necessarily be viewed more suspiciously. But one man was stopped and then fined £90 after doing just that. His photo was taken by the police anyway.
At the moment, there are few legal restrictions in place to protect us. Research from the University of East Anglia and Monash University has shown that the trials are unregulated and have been 'operating in a legal vacuum', with the academics behind the research calling for tighter restrictions on the use of the tech.
There’s also the concern over reliability. The technology has already been shown to be inaccurate in trials: 96% of the matches generated between 2016 and 2018 were false, meaning innocent people were being flagged as criminals. Some 2,000 people were wrongly matched at the Champions League final. Despite this, photos of some of these false matches have been kept and stored on police databases for a month. Its accuracy worsens further when identifying women and black and minority ethnic people, adding to concerns about false matches and racist technology. Big Brother Watch reported seeing a black 14-year-old in school uniform stopped by police and fingerprinted after the cameras misidentified him.
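Part of the explanation for figures like that is simple arithmetic: when almost everyone walking past a camera is innocent, even a system that rarely misfires will produce mostly false alarms. The short Python sketch below works through one hypothetical scenario; every number is an assumption chosen for illustration, not a figure from the trials.

```python
# A hypothetical illustration of the base-rate effect behind false-match
# rates. All numbers here are assumed for illustration only.
crowd_size = 100_000          # faces scanned (assumed)
on_watchlist = 10             # watchlisted people actually present (assumed)
true_positive_rate = 0.99     # chance a watchlisted face is flagged (assumed)
false_positive_rate = 0.001   # chance an innocent face is flagged (assumed)

true_alerts = on_watchlist * true_positive_rate                    # ~9.9
false_alerts = (crowd_size - on_watchlist) * false_positive_rate   # ~100.0
share_false = false_alerts / (true_alerts + false_alerts)
print(f"{share_false:.0%} of alerts point at innocent people")     # ~91%
```

Even with a one-in-a-thousand error rate, roughly nine in ten alerts in this scenario would point at innocent people, which is why campaigners focus on what happens to those wrongly flagged.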
The pushback has now led to its first UK court case, taking place in Cardiff over South Wales Police’s use of the cameras. Ed Bridges claims he was scanned without his consent at least twice: at a peaceful anti-arms protest and while doing his Christmas shopping. He will argue that this breaches his human rights. The civil rights group Liberty, which is representing him, has compared it to taking DNA or fingerprints without consent. The case could prove important in defining the legalities around using the technology.
While the police argue that FRT is an important tool, it’s hard not to be concerned about the implications of its use and what it could mean for the future. One thing is clear: there need to be greater restrictions and laws in place to protect our basic human rights, including our right to privacy.