Biometrics — the most secure solution for banking?
- Financial firms have been exploring the adoption of facial recognition technology to enhance customer service
- Advancements in AI such as deepfakes are threatening the adoption of biometric technologies in the payment landscape
The rise of customer-focused fintech and challenger banks in recent years has driven a surge in the need for better customer experience and convenience among all players, even the incumbents.
From instant transactions to personalized spending breakdowns, every element of the banking experience is now expected to be seamless and personal. Security is no exception, but because banking involves both access to accounts and highly sensitive personal data, convenience must be weighed against assurance.
By matching an individual's unique physical characteristics and removing the need for long alphanumeric passwords, biometrics, namely facial recognition, seems to offer the best solution.
In the US, major financial players such as Chase, HSBC, and USAA let customers use Apple’s Face ID to log into their mobile banking apps. Bank of America (BOA) was granted a patent in May last year for a facial recognition authentication system employing a variety of biometrics.
Payments giant Mastercard has been exploring the use of facial recognition since 2016, when it launched its “selfie pay” app, which enables users to approve online purchases for instant payment. The company has also reportedly been exploring voice recognition technology to approve payments.
Evidently, then, progress is already being made towards biometric security in the banking sector, but the recent debate around mass surveillance by facial recognition technology is prompting a reassessment of our understanding of the technology.
Payment firms adopting the pay-by-face concept must collect, store, and manage a trove of data, including customers’ biometric features.
Even though facial recognition aims to combat fraud in the finance landscape, it isn’t exactly foolproof. For instance, Mastercard’s app asks users to blink to verify that they are human, but the company found that this check had been spoofed in the past.
Reports of deepfakes circumventing facial recognition systems point to a Pandora’s box of fraud, scams, and misidentification that bad actors could unleash. Deepfakes are a growing threat to remote customer identification, and several high-profile cases have seen bad actors exploit the technology to bypass biometric checks.
While biometric technology is designed to combat existing threats to more traditional means of verification, such as passwords, PINs, or patterns, sophisticated deepfake technology could enable attackers to find workarounds to facial, voice, or even gait recognition, exposing new cracks in upgraded security systems.
As they race to adopt solutions that speed up the verification process and add convenience to online transactions, firms in the finance sector may need to pause and evaluate proactive countermeasures against these kinds of emerging threats.
Ken Munro, a security researcher at Pen Test Partners, told the BBC that the best security would be a little more “cumbersome.”
“Ideally I’d like to see facial recognition used in conjunction with a Pin. Both systems have flaws, but work brilliantly when you combine them.”