London: Facial recognition systems used in the UK are more likely to misidentify black and Asian people than white people, according to a critical new finding. The report comes amid calls for stronger safeguards on the use of facial recognition technology.
Analysts who examined the facial recognition tool previously used with the Police National Database found that white people are misidentified at a significantly lower rate than Asian and black people. In particular, the tests showed that African women were misidentified at a higher rate than African men.
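The disparity being described is typically expressed as a false-match rate measured separately for each demographic group: the share of people not on a watch list who are nonetheless flagged as a match. The sketch below is a rough illustration of how such a per-group rate could be computed; the group labels and numbers are entirely hypothetical and are not taken from the NPL study.

```python
# Illustrative only: computing a per-group false-match rate from test outcomes.
# The demographic labels and figures below are hypothetical examples.
from collections import defaultdict

def false_match_rate(results):
    """results: list of (group, was_false_match) pairs from probes not on the watch list."""
    totals = defaultdict(int)
    false_matches = defaultdict(int)
    for group, was_false_match in results:
        totals[group] += 1
        if was_false_match:
            false_matches[group] += 1
    return {group: false_matches[group] / totals[group] for group in totals}

# Hypothetical test outcomes: each tuple is (demographic group, falsely matched?)
sample = [("white", False)] * 998 + [("white", True)] * 2 \
       + [("black", False)] * 990 + [("black", True)] * 10
print(false_match_rate(sample))  # e.g. {'white': 0.002, 'black': 0.01}
```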
The UK’s Home Office has admitted that, according to the latest National Physical Laboratory tests, the technology used with the Police National Database could have carried a higher risk of misidentifying some population groups.
Police and crime commissioners said the NPL findings highlighted a built-in bias in the facial recognition technology and called for caution. The findings came hours after Police Minister Sarah Jones described the technology as “the biggest breakthrough for catching criminals since DNA matching”.
Facial recognition technology scans people’s faces and cross-references the images against watch lists of known or wanted criminals. Officers can use it on live footage from mounted cameras, comparing the faces of passers-by against those watch lists or targeting specific individuals as they walk past.
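At a high level, systems of this kind reduce each face to a numeric embedding and flag a match when its similarity to a watch-list entry crosses a threshold. The following is a simplified sketch of that matching step, assuming a hypothetical embedding model and threshold; it is not the algorithm deployed by UK police.

```python
# Conceptual sketch of watch-list matching: faces become numeric embeddings,
# and a probe is flagged when its similarity to an entry exceeds a threshold.
# The embedding size, threshold and names are hypothetical.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe_embedding, watchlist, threshold=0.6):
    """Return watch-list entries whose embeddings are similar enough to the probe."""
    hits = []
    for name, embedding in watchlist.items():
        score = cosine_similarity(probe_embedding, embedding)
        if score >= threshold:
            hits.append((name, score))
    return sorted(hits, key=lambda hit: hit[1], reverse=True)

# Hypothetical 128-dimensional embeddings standing in for a face-encoding model
rng = np.random.default_rng(0)
watchlist = {"suspect_a": rng.normal(size=128), "suspect_b": rng.normal(size=128)}
probe = rng.normal(size=128)
print(match_against_watchlist(probe, watchlist))
```

The threshold is the key operating choice: setting it lower catches more genuine matches but also raises the false-match rate, which is where demographic disparities like those reported by the NPL become consequential.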
Suspects’ images can be run through passport or immigration databases to identify them and check their backgrounds. The civil service is working with the police to set up a new ‘national identification system’ that will store images of millions of faces.
Charlie Welton, policy and advocacy officer at the campaign group Liberty, said the identification system would allow police to use facial recognition without proper safeguards. He pointed out that the test statistics show racial bias in the technology, even as thousands of searches are conducted with the algorithm every month. The authorities, he said, need to answer serious questions about how many people have been misidentified and what the consequences have been.