{"id":31774,"date":"2022-11-04T05:15:00","date_gmt":"2022-11-04T05:15:00","guid":{"rendered":"https:\/\/www.lifeandnews.com\/articles\/?p=31774"},"modified":"2022-11-06T05:01:00","modified_gmt":"2022-11-06T05:01:00","slug":"facial-recognition-why-we-shouldnt-ban-the-police-from-using-it-altogether","status":"publish","type":"post","link":"https:\/\/www.lifeandnews.com\/articles\/facial-recognition-why-we-shouldnt-ban-the-police-from-using-it-altogether\/","title":{"rendered":"Facial recognition: why we shouldn\u2019t ban the police from using it\u00a0altogether"},"content":{"rendered":"\n<p><a href=\"https:\/\/theconversation.com\/profiles\/asress-adimi-gikay-1309509\">Asress Adimi Gikay<\/a>, <em><a href=\"https:\/\/theconversation.com\/institutions\/brunel-university-london-1685\">Brunel University London<\/a><\/em><\/p>\n\n\n\n<p>The UK police are being accused of breaking ethical standards by using live facial recognition technology to help fight crime. <a href=\"https:\/\/www.mctd.ac.uk\/join-calls-to-ban-police-use-of-facial-recognition-says-minderoo-centre-researchers\/\">A recent report<\/a> by the University of Cambridge into trials of the technology by forces in London and south Wales was particularly concerned about the \u201clack of robust redress\u201d for anyone suffering harm. It spoke of the need to \u201cprotect human rights and improve accountability\u201d before facial recognition is used more widely.<\/p>\n\n\n\n<p>The Cambridge team wants a broad ban on police using the technology, and they are not alone. UK civil liberties group Big Brother Watch has been running a \u201c<a href=\"https:\/\/bigbrotherwatch.org.uk\/campaigns\/stop-facial-recognition\/\">stop facial recognition<\/a>\u201d campaign as the government mulls how to <a href=\"https:\/\/www.gov.uk\/government\/news\/uk-sets-out-proposals-for-new-ai-rulebook-to-unleash-innovation-and-boost-public-trust-in-the-technology\">regulate AI technologies<\/a>. 
Meanwhile, 12 NGOs <a href=\"https:\/\/edri.org\/wp-content\/uploads\/2022\/10\/CZ-Minister-Digitalisation-letter-AI-act.pdf\">recently called on<\/a> EU legislators to completely ban it, along with various other forms of biometric identification, in their upcoming <a href=\"https:\/\/eur-lex.europa.eu\/resource.html?uri=cellar:e0649735-a372-11eb-9585-01aa75ed71a1.0001.02\/DOC_1&amp;format=PDF\">AI Act<\/a>.<\/p>\n\n\n\n<p>Simply banning this technology would be a mistake, however. In my view, there\u2019s a good case for a more measured approach.<\/p>\n\n\n\n<h2>Growing police use<\/h2>\n\n\n\n<p>The police forces in London and south Wales appear to be the only two in the UK currently using live facial recognition, which uses <a href=\"https:\/\/ico.org.uk\/media\/2619985\/ico-opinion-the-use-of-lfr-in-public-places-20210618.pdf\">artificial intelligence software<\/a> to compare an individual\u2019s digital facial image with an existing facial image to estimate similarity. Manchester Police trialled it but were <a href=\"https:\/\/www.express.co.uk\/news\/uk\/1031939\/manchester-news-police-surveillance-technology-trafford-centre-manchester\">forced to pause<\/a> by the <a href=\"https:\/\/www.gov.uk\/government\/organisations\/surveillance-camera-commissioner\">surveillance camera commissioner<\/a> in 2018 for not obtaining the necessary approvals.<\/p>\n\n\n\n<p>In 2020 an appellate court also <a href=\"https:\/\/www.judiciary.uk\/wp-content\/uploads\/2020\/08\/R-Bridges-v-CC-South-Wales-ors-Judgment.pdf\">ruled against<\/a> south Wales\u2019 use of the technology, concluding the force\u2019s legal framework for deployment effectively gave them unlimited discretion to do so. 
It made no difference to the court that the police had notified the public (known as overt operational deployment).<\/p>\n\n\n\n<p>Despite this ruling, facial recognition can still broadly be used by police, although numerous <a href=\"https:\/\/www.psni.police.uk\/sites\/default\/files\/2022-10\/02158%20Facial%20Recognition%20Technology.pdf\">other forces<\/a> have said <a href=\"https:\/\/www.gmp.police.uk\/foi-ai\/greater-manchester-police\/disclosure-2019\/april\/gsa-45619\/\">they are not<\/a> doing so at present.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><a href=\"https:\/\/images.theconversation.com\/files\/493451\/original\/file-20221104-11-ngyqww.jpeg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip\"><img src=\"https:\/\/images.theconversation.com\/files\/493451\/original\/file-20221104-11-ngyqww.jpeg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip\" alt=\"Woman on phone while numerous people behind her are being scanned by facial recognition technology\"\/><\/a><figcaption>Any UK police force can use facial recognition under the current legal framework. <a href=\"https:\/\/www.shutterstock.com\/image-photo\/facial-recognition-search-surveillance-person-modern-1481376347\">Trismegist San<\/a><\/figcaption><\/figure>\n\n\n\n<p>The London Metropolitan Police increasingly use facial recognition to locate missing persons, suspects, <a href=\"https:\/\/www.met.police.uk\/SysSiteAssets\/media\/downloads\/force-content\/met\/advice\/lfr\/policy-documents\/lfr-sop.pdf\">witnesses<\/a> and victims. They have scanned individuals\u2019 faces in city squares and at public events, using a <a href=\"https:\/\/www.judiciary.uk\/wp-content\/uploads\/2020\/08\/R-Bridges-v-CC-South-Wales-ors-Judgment.pdf\">facial recognition camera<\/a> typically placed on a police vehicle or street pole. 
The <a href=\"https:\/\/www.met.police.uk\/SysSiteAssets\/media\/downloads\/force-content\/met\/advice\/lfr\/policy-documents\/lfr-sop.pdf\">public are alerted<\/a> to the deployment through notices as they enter the recognition zone \u2013 unless that compromises policing tactics or deployment is urgent.<\/p>\n\n\n\n<p>Between February 2020 and July 2022, the Met deployed the technology in eight locations including <a href=\"https:\/\/www.met.police.uk\/SysSiteAssets\/media\/downloads\/force-content\/met\/advice\/lfr\/deployment-records\/lfr-deployment-grid.pdf\">Piccadilly Circus<\/a>. They <a href=\"https:\/\/www.met.police.uk\/SysSiteAssets\/media\/downloads\/force-content\/met\/advice\/lfr\/deployment-records\/lfr-deployment-grid.pdf\">are estimated<\/a> to have viewed more than 150,000 faces, leading to nine arrests but also eight occasions where they targeted the wrong person.<\/p>\n\n\n\n<h2>The pros and cons<\/h2>\n\n\n\n<p>Facial recognition has evolved in recent years, for instance to work in real time, but inaccuracies and errors remain. In New Jersey, <a href=\"https:\/\/incidentdatabase.ai\/cite\/288\">228 wrongful arrests<\/a> were reportedly made using (non-real time) facial recognition between January 2019 and April 2021. One <a href=\"https:\/\/edition.cnn.com\/2021\/04\/29\/tech\/nijeer-parks-facial-recognition-police-arrest\/index.html\">black American<\/a> spent 11 days in jail after being wrongly identified. False identifications can also lead to everything from missed flights to distressing police interrogations.<\/p>\n\n\n\n<p>Specific groups are disproportionately affected. <a href=\"https:\/\/nvlpubs.nist.gov\/nistpubs\/ir\/2019\/NIST.IR.8280.pdf#page=4\">A 2019 US study<\/a> found that women are two-to-five times more likely to be falsely identified, while the risks are ten-to-100 times greater for black and Asian faces than white ones. 
Given that police already disproportionately <a href=\"https:\/\/www.theguardian.com\/uk-news\/2020\/oct\/27\/black-people-nine-times-more-likely-to-face-stop-and-search-than-white-people\">stop and search<\/a> ethnic minorities, this shortcoming in the technology could even be used to sustain such practices.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><a href=\"https:\/\/images.theconversation.com\/files\/493452\/original\/file-20221104-11-ohcw3y.jpeg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip\"><img src=\"https:\/\/images.theconversation.com\/files\/493452\/original\/file-20221104-11-ohcw3y.jpeg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip\" alt=\"Crowd in London protesting about police stop and search\"\/><\/a><figcaption>Facial recognition is not necessarily part of the solution. <a href=\"https:\/\/www.shutterstock.com\/image-photo\/london-17th-april-2021-kill-bill-1957541050\">BradleyStearn<\/a><\/figcaption><\/figure>\n\n\n\n<p>Another risk is that police could covertly install facial recognition cameras permanently. This could help the state to crack down on public protests, for example. There is already a pending <a href=\"https:\/\/www.hrw.org\/news\/2020\/07\/08\/moscows-use-facial-recognition-technology-challenged\">legal challenge against Russia<\/a> before the European Court of Human Rights over such practices, and fear of state surveillance is one reason why many want this technology banned.<\/p>\n\n\n\n<p>Nonetheless, facial recognition has its benefits. 
<a href=\"https:\/\/www.securityindustry.org\/2020\/07\/16\/facial-recognition-success-stories-showcase-positive-use-cases-of-the-technology\/\">It can help<\/a> police to find serious criminals, including terrorists, not to mention <a href=\"https:\/\/www.reuters.com\/article\/us-india-crime-children-idUSKBN2081CU\">missing children<\/a> and people at risk of harming themselves or others.<\/p>\n\n\n\n<p>Like it or not, we also live under colossal corporate surveillance capitalism already. The <a href=\"https:\/\/papltd.co.uk\/top-10-countries-and-cities-by-number-of-cctv-cameras\/\">UK and US<\/a> have among the most installed CCTV cameras in the world. London residents are filmed <a href=\"https:\/\/www.theguardian.com\/uk-news\/2021\/oct\/02\/how-cctv-played-a-vital-role-in-tracking-sarah-everard-and-her-killer\">300 times<\/a> a day on average, and police can usually use the data without a search warrant. As if that wasn\u2019t bad enough, big tech companies <a href=\"https:\/\/guardian.ng\/features\/what-does-big-tech-know-about-you\/\">know almost everything<\/a> personal about us. Worrying about live facial recognition is inconsistent with our tolerance of all this surveillance.<\/p>\n\n\n\n<h2>A better approach<\/h2>\n\n\n\n<p>Instead of an outright ban, even of covert facial recognition, I\u2019m in favour of a statutory law to clarify when this technology can be deployed. For one thing, police in the UK can currently use it to track people on their watchlists, but this can include even those charged with minor crimes. There are also no uniform criteria for deciding who can be listed.<\/p>\n\n\n\n<p>Under the EU\u2019s <a href=\"https:\/\/eur-lex.europa.eu\/resource.html?uri=cellar:e0649735-a372-11eb-9585-01aa75ed71a1.0001.02\/DOC_1&amp;format=PDF\">proposed law<\/a>, facial recognition could only be deployed against those suspected of crimes carrying a maximum sentence of at least three years. 
That would appear to be a reasonable cut-off.<\/p>\n\n\n\n<p>Secondly, a court or similar independent body should always have to authorise deployment, including assessing whether it would be proportionate to the police objective in question. In the Met, authorisation currently has to come from a police officer ranked <a href=\"https:\/\/www.met.police.uk\/SysSiteAssets\/media\/downloads\/force-content\/met\/advice\/lfr\/policy-documents\/lfr-sop.pdf\">superintendent or higher<\/a>, and <a href=\"https:\/\/www.met.police.uk\/SysSiteAssets\/media\/downloads\/force-content\/met\/advice\/lfr\/policy-documents\/lfr-sop.pdf\">they do<\/a> have to <a href=\"https:\/\/www.met.police.uk\/SysSiteAssets\/media\/downloads\/force-content\/met\/advice\/lfr\/policy-documents\/lfr-policy-document.pdf\">make a call<\/a> on proportionality \u2013 but this should not be a police decision.<\/p>\n\n\n\n<p>We also need clear, auditable ethical standards for what happens during and after the technology is deployed. Images of wrongly identified people should be deleted immediately, for instance. Unfortunately, Met policy on this is unclear at present. The Met is trying to use the technology responsibly in other respects, but this is not enough in itself.<\/p>\n\n\n\n<p>Last but not least, the <a href=\"https:\/\/news.stanford.edu\/2021\/05\/14\/researchers-call-bias-free-artificial-intelligence\/\">potential for discrimination<\/a> should be tackled by legally requiring developers to train the AI on a diverse enough range of communities to meet a minimum threshold. This sort of framework should allow society to enjoy the benefits of live facial recognition without the harms. 
Simply banning something that requires a delicate balancing of competing interests is the wrong move entirely.<\/p>\n\n\n\n<p><a href=\"https:\/\/theconversation.com\/profiles\/asress-adimi-gikay-1309509\">Asress Adimi Gikay<\/a>, Senior Lecturer in AI, Disruptive Innovation and Law, <em><a href=\"https:\/\/theconversation.com\/institutions\/brunel-university-london-1685\">Brunel University London<\/a><\/em><\/p>\n\n\n\n<p>This article is republished from <a href=\"https:\/\/theconversation.com\">The Conversation<\/a> under a Creative Commons license. Read the <a href=\"https:\/\/theconversation.com\/facial-recognition-why-we-shouldnt-ban-the-police-from-using-it-altogether-193895\">original article<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Asress Adimi Gikay, Brunel University London The UK police are being accused of breaking ethical standards by using live facial recognition technology to help fight crime. A recent report by the University of Cambridge into trials of the technology by forces in London and south Wales was particularly concerned about the \u201clack of robust redress\u201d 
[&hellip;]<\/p>\n","protected":false},"author":44,"featured_media":31775,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[295],"tags":[3404,965,12869,2893,12870],"_links":{"self":[{"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/posts\/31774"}],"collection":[{"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/users\/44"}],"replies":[{"embeddable":true,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/comments?post=31774"}],"version-history":[{"count":2,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/posts\/31774\/revisions"}],"predecessor-version":[{"id":31792,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/posts\/31774\/revisions\/31792"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/media\/31775"}],"wp:attachment":[{"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/media?parent=31774"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/categories?post=31774"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.lifeandnews.com\/articles\/wp-json\/wp\/v2\/tags?post=31774"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}