Why the current arguments against face recognition are flawed

Face recognition suffered its biggest blow this month, when, for the first time, an independent study in the U.K. found the technology to be 81 percent inaccurate.
Face recognition technology has come under intense scrutiny lately after several studies raised questions about its accuracy. Although privacy concerns have always accompanied biometric solutions of any kind, the controversy intensified after a report revealed that a face recognition system used by police in South Wales, in the U.K., misidentified more than 2,000 people as potential criminals, amounting to a false positive rate of 92 percent.

Much has been said about the matter since then, and other, similar reports have raised further concerns, so much so that in some places governments have already decided to ban face recognition. In May 2019, San Francisco became the first U.S. city to ban the technology, with officials voting 8-1 in favor of the decision. Somerville, Massachusetts, soon followed suit, and just last week Oakland, California, did the same.

All this even prompted major technology companies like Microsoft and Amazon to voice concerns about the potential misuse of the technology. But perhaps the biggest blow to face recognition came this month, when, for the first time, an independent study in the U.K. found the system to be 81 percent inaccurate.

Solution providers disagree

Solution providers are clearly not convinced. Some argue that, when it comes to deterring crime, the advantages of face recognition technology outweigh the disadvantages. After all, if the solution gets a thousand people wrong but still helps catch a potential criminal who could have harmed those thousand, isn’t that worthwhile?

This is not the only argument. Carl Gohringer, Founder and CEO of Allevate, a company that offers biometric solutions, says the methodology used in these studies is flawed. In a blog post, he outlined the problems with the report on the South Wales police’s usage.

“Looking at the system reported on in South Wales, in the worst-case scenario, if only 2,470 people actually walked past the cameras and if there was only one person in the watchlist, the system would indeed be operating at a 92 percent False Positive Rate (2,297 / (2,470 x 1)),” Gohringer wrote. “Abysmal, but unlikely. In reality, tens of thousands of people will have walked past the cameras and there were likely to have been hundreds of faces in the watchlist.”

Suppose that 100,000 of the 170,000 people who arrived were detected by a camera and that there was only one face in the watchlist, he continued.

“The system would have then been operating at a 2.3 percent (2,297 / (100,000 x 1)) false positive rate. There clearly will have been significantly more than one person in the watchlist, and each person may have crossed multiple cameras, initiating multiple searches. So, the system very likely will have been performing at a much better rate than this.”

Without knowing the size of the watchlist, the actual number of people detected by the cameras, and how many people who should have been matched were not, it is impossible to determine whether the system was operating well.
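To make the arithmetic concrete, here is a minimal Python sketch of Gohringer’s per-comparison formula. The function name is ours, and the 500-face watchlist in the final line is purely an illustrative assumption; the other figures come from his post.

```python
def false_positive_rate(false_alerts, people_detected, watchlist_size):
    """False alerts divided by total comparisons, i.e. people detected
    multiplied by the number of faces on the watchlist."""
    return false_alerts / (people_detected * watchlist_size)

FALSE_ALERTS = 2297  # false matches reported for the South Wales system

# Critics' worst case: only 2,470 people detected, one face on the watchlist
print(f"{false_positive_rate(FALSE_ALERTS, 2470, 1):.1%}")      # ~93%, the "abysmal" scenario
# Gohringer's more realistic assumption: 100,000 detections, one face
print(f"{false_positive_rate(FALSE_ALERTS, 100_000, 1):.1%}")   # 2.3%
# With a (hypothetical) 500-face watchlist, the rate drops further still
print(f"{false_positive_rate(FALSE_ALERTS, 100_000, 500):.4%}") # 0.0046%
```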

Similarly, speaking about the recent reports that claimed 81 percent inaccuracy, Gohringer pointed out that what matters is that, out of potentially thousands of people, the system was able to narrow the individuals needing assessment down to a handful. It was within this shortlist that the researchers found more false positives than accurate matches.

This means that without the solution, the police would not have been able to narrow the crowd down to a shortlist and then identify the people they wanted from within it. He elaborated on this further.

“Imagine there are 1000 people in front of you with six criminals among them,” he said. “You can’t assess all of them manually, so you deploy an automated system to assist you. This system picks up 10 people from the crowd for you to assess. You find that five of these 10 people are the criminals you were looking for.”

Here, the operator makes the final assessment of whether someone is a criminal. You knew that 994 of the 1,000 people were not criminals. Of the six criminals, you identified five, and one slipped through. The system’s false reject rate was 1/1000 = 0.1 percent, and its false accept rate was 5/1000 = 0.5 percent. The accuracy here is not 50 percent simply because five of the ten shortlisted turned out to be right: you needed to assess 1,000 people, and the system narrowed that down to 10. Had the system not been deployed, identifying the five criminals would have been all but impossible.
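The same bookkeeping, worked through in a short sketch using only the hypothetical figures from the example above, with both rates computed against the whole crowd, as Gohringer does:

```python
# Figures from Gohringer's hypothetical example
crowd = 1000        # people to be assessed
criminals = 6       # actual criminals among them
shortlisted = 10    # people the system flags for manual review
true_hits = 5       # flagged people who really are criminals

missed = criminals - true_hits          # 1 criminal slipped through
false_alarms = shortlisted - true_hits  # 5 innocent people flagged

# Both rates are computed against the whole crowd, as in the example
false_reject_rate = missed / crowd        # 1/1000
false_accept_rate = false_alarms / crowd  # 5/1000

print(f"False reject rate: {false_reject_rate:.1%}")  # 0.1%
print(f"False accept rate: {false_accept_rate:.1%}")  # 0.5%
# The naive "50 percent accuracy" (5 of 10 shortlisted) ignores that
# the system cut 1,000 manual assessments down to just 10.
```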