We’re using facial recognition tech in airports, but is it reliable?
At its core, facial recognition technology is an artificial intelligence (AI) algorithm trained to match several points on a human face against an image in a database.
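In practice, modern systems reduce those facial points to a numeric feature vector (an "embedding") and compare vectors for similarity. The sketch below is illustrative only: the vectors, threshold, and function names are assumptions, not any vendor's actual pipeline, and real systems use learned embeddings hundreds of dimensions long.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two face feature vectors:
    # 1.0 means identical direction, 0.0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(probe, reference, threshold=0.9):
    # Declare a match when similarity clears a tuned threshold.
    # Where the threshold sits trades false positives
    # against false negatives -- the crux of the debate below.
    return cosine_similarity(probe, reference) >= threshold

# Toy vectors standing in for measurements of facial landmarks.
enrolled = [0.61, 0.20, 0.54, 0.33]
live_scan = [0.60, 0.21, 0.53, 0.35]
print(is_match(live_scan, enrolled))  # very similar vectors -> True
```

Lowering the threshold catches more genuine matches but also raises more false alarms, which is why accuracy figures alone can mislead.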
It’s a technology that businesses are experimenting with, and although quite a few solutions are commercially available, they’re neither foolproof nor consistently accurate.
However, governments across the world seem to be getting comfortable with facial recognition, making it available to police forces and to border security units at airports.
The Mineta San Jose International Airport became the most recent checkpoint to use facial recognition for passengers arriving in the country.
According to media reports, the average wait time for arriving international passengers has decreased by roughly four minutes per passenger.
In the roughly one-month period before the airport adopted the technology in June for arriving international flights, the average wait time was 25.1 minutes. Four weeks after the launch of the system, average processing times for arriving international travelers had dropped by roughly 16 percent, to 21.2 minutes.
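Those figures are easy to verify with a quick calculation (the exact reduction works out to about 15.5 percent, which the reports round up):

```python
before = 25.1   # average wait in minutes before facial recognition
after = 21.2    # average wait four weeks after launch

saved = before - after
pct = saved / before * 100
print(f"{saved:.1f} min saved per passenger ({pct:.1f}% faster)")
# -> 3.9 min saved per passenger (15.5% faster)
```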
Similar technology is also in use at airports in Orlando and San Diego.
Is facial recognition really ready for use?
To be honest, from all the reports out there, it seems facial recognition isn’t quite ready for use. Not for something as serious as border security, anyway.
Take the case of the facial recognition software used by the Metropolitan Police, the UK’s biggest police force.
The Independent found evidence that the system returned false positives in more than 98 percent of alerts generated.
The publication also reported that the system used by the South Wales Police was almost as ineffective, returning more than 2,400 false positives in 15 deployments since June 2017.
To its credit, the South Wales Police did manage to match 234 suspects accurately during the UEFA Champions League final in Cardiff, although that was less than 10 percent of the total alerts the system issued.
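The reported numbers give a rough sense of the system’s precision, i.e. what fraction of its alerts were correct. As an illustrative approximation only (the 2,400 false positives span 15 deployments, not just the Cardiff final, so the real per-event figure differs):

```python
true_matches = 234       # accurate matches reported at the Cardiff final
false_positives = 2400   # false alerts reported across deployments (approximate)
total_alerts = true_matches + false_positives

precision = true_matches / total_alerts
print(f"precision ~ {precision:.1%}")  # share of alerts that were correct
# -> precision ~ 8.9%
```

That sub-10-percent figure matches the article’s framing: the overwhelming majority of alerts pointed at the wrong person.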
According to an MIT study earlier this year, facial recognition software is accurate 99 percent of the time for white male subjects. For darker skin tones, however, the error rate is substantial: darker-skinned women face a 35 percent error rate, according to the New York Times.
The truth is, in corporate environments where the stakes aren’t so high, AI-powered facial recognition technology works well. When it comes to law enforcement and border control, however, it might be a good idea to train the algorithms better first.
18 July 2019