Tuesday, March 18, 2025

Exclusive: ‘Spotting the difference’ – Brett Beranek, Nuance Communications in “The Fintech Magazine”

Imitation might be the sincerest form of flattery – but not when it comes to customer authentication. Brett Beranek, Vice President and General Manager for Security and Biometrics at Nuance Communications, says there is only one way to stay ahead of the deepfake masters and that’s to employ the same technology as they do.

We’ve seen Carrie Fisher come back from the dead as Princess Leia, and Donald Trump apparently give speeches that are shocking even by his standards.

They’re ‘deepfakes’ – the 21st century version of Photoshopping – which see voices and faces copied with incredible accuracy then manipulated using artificial intelligence (AI) software.

Some of the fakes that circulate on social media are obviously for fun. But others are designed to deceive and distort and, given the increasing sophistication of these clones, there are clearly implications for biometric security systems.

Fingerprint scanning has been commonplace since the arrival of the iPhone 5s, while facial and voice recognition are employed for everything from opening online bank accounts to identifying individuals in crowds.

Last year’s Deloitte Mobile Consumer Survey revealed that, in the UK, 44 per cent of respondents with a smartphone used fingerprint recognition, and 11 per cent used facial recognition. And, while traditional security systems that employ passwords and tokens have a chequered history of breaches and outages, biometric systems have also been compromised.

In 2017 a BBC news reporter set up an HSBC bank account protected by voice recognition security – then his non-identical twin managed to mimic him and gain access.

Another example came last year when ‘ethical hackers’ at VPN Mentor broke into security platform BioStar 2 and opened files containing personal details, one million fingerprint records and facial recognition data that could be used to access buildings reliant on biometric security across the globe.

So, what’s the answer? Assume biometric systems will be attacked and write code that searches for the giveaway signs of a deepfake from the outset, says US software firm Nuance Communications.

Brett Beranek, Nuance’s vice president and general manager for security and biometrics, says: “We’ve assumed, right from the get-go, that a fraudster may have access to your voice, and that’s why we’ve developed technologies to ensure that we can detect if what we’re hearing is a synthetic voice.

“The fact that fraud has become a $4trillion-a-year problem, and that it’s growing by double digits every year, is a testament to the failure of traditional authentication methods. But I think fraudsters will adapt their techniques and start attacking biometrics so it’s critical that biometric vendors provide solutions to counter those types of spoofing attacks.

“A lot of people have seen some of the deepfake videos that have populated the internet in recent months, and it has really become a source of anxiety, not only for consumers but also for organisations. There is a question around trust: can we trust what we see and what we hear?”

Nuance began developing biometric security systems in the 1990s and voice recognition software was among its first products in 2001. Since then, levels of sophistication have exploded and, in recent years, AI has been employed to continually push the boundaries.

“For many years, we’ve been at the forefront of AI technology, which is the technology used to create deepfakes,” says Beranek. “We’ve found ways to detect deepfakes. When a deepfake is created there are fundamentally two components to it. One is the visual aspect, creating a visual representation of a person doing something that they have never done, and the second is the audio aspect, creating their voice saying things that they’ve never said.

“We can analyse that voice and detect anomalies that are being created by the synthetic speech engine. Our customers, such as financial institutions, have a great interest in this technology to prevent fraud attacks, but I think this is a technology that can be beneficial to society as a whole. I can foresee that news organisations, for example, would use our solutions to determine if a video that they’re seeing is legitimate, or if it’s a deepfake.”
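As a rough illustration of the approach Beranek describes – scoring audio for the tell-tale artefacts a synthetic-speech engine leaves behind – the Python sketch below uses an invented spectral-flatness heuristic and a made-up threshold. It is purely illustrative: Nuance’s actual detection relies on trained models, not a hand-written rule like this.

```python
# Purely illustrative sketch of a synthetic-speech ("liveness") check.
# The spectral-flatness feature and threshold are invented for this example;
# real detectors use trained models over learned features, not a heuristic.
import math
from dataclasses import dataclass
from typing import List

@dataclass
class LivenessResult:
    score: float        # higher = more likely a live human voice
    is_synthetic: bool

def spectral_flatness(frame: List[float]) -> float:
    """Ratio of geometric to arithmetic mean of magnitudes for one audio frame.
    Stands in for the kind of low-level feature an anomaly detector might use."""
    mags = [abs(x) + 1e-9 for x in frame]
    geo_mean = math.exp(sum(math.log(m) for m in mags) / len(mags))
    arith_mean = sum(mags) / len(mags)
    return geo_mean / arith_mean

def check_liveness(frames: List[List[float]], threshold: float = 0.5) -> LivenessResult:
    """Average the per-frame feature and flag the audio if the resulting
    score falls below a (made-up) threshold."""
    if not frames:
        return LivenessResult(score=0.0, is_synthetic=True)
    score = 1.0 - sum(spectral_flatness(f) for f in frames) / len(frames)
    return LivenessResult(score=score, is_synthetic=score < threshold)
```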

Much of Nuance’s work for the financial sector is around providing products for contact centres, so that traditional security measures such as passwords, PINs and challenge questions can be ditched in favour of more efficient and secure alternatives.

The advantages of biometric security are threefold, says Beranek – it makes life easier for the customer (fewer or no passwords to remember), it should be more secure and it’s cost-efficient because contact centre call times are shorter if a machine is verifying a customer’s identity in the background.

Light-speed security

Last year, Nuance launched its Lightning Engine, which the firm says can identify a caller in less than half a second. The engine uses fourth-generation deep neural networks (DNNs) and combines voice biometrics with natural language understanding (NLU) to build a unique voice profile when a customer enrols. A caller can then be verified in one of two ways – ‘actively’, when they speak a set phrase such as ‘my voice is my password’, or ‘passively’, in the background during their conversation with a contact centre agent. Nuance products can also identify customers through selfies and behavioural patterns, such as how they interact with their device, be it a tablet, keyboard or phone.
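To make the active/passive distinction concrete, here is a minimal Python sketch of a hypothetical enrol-then-verify flow. The embedding function, similarity thresholds and class names are all invented for illustration – this is not the Lightning Engine or Nuance’s API.

```python
# Hypothetical sketch of an enrol-then-verify voice-biometrics flow,
# loosely following the active/passive split described above.
# Names, thresholds and the "embedding model" are invented for illustration.
import math
from typing import List

def voice_embedding(audio: bytes) -> List[float]:
    """Placeholder for a DNN speaker-embedding model (assumed, not real).
    In practice a trained network maps audio to a fixed-length vector."""
    return [float(b) / 255.0 for b in audio[:16].ljust(16, b"\0")]

def cosine_similarity(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

class VoiceProfile:
    """A per-customer voiceprint created at enrolment."""
    def __init__(self, enrolment_audio: bytes):
        self.embedding = voice_embedding(enrolment_audio)

def verify_active(profile: VoiceProfile, passphrase_audio: bytes,
                  threshold: float = 0.85) -> bool:
    """Active mode: the caller speaks a fixed phrase such as
    'my voice is my password'."""
    return cosine_similarity(profile.embedding,
                             voice_embedding(passphrase_audio)) >= threshold

def verify_passive(profile: VoiceProfile, conversation_chunks: List[bytes],
                   threshold: float = 0.80) -> bool:
    """Passive mode: score free-form speech in the background while the
    caller talks to an agent, averaging over short audio chunks."""
    scores = [cosine_similarity(profile.embedding, voice_embedding(c))
              for c in conversation_chunks]
    return bool(scores) and sum(scores) / len(scores) >= threshold
```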

Beranek says the ubiquity of biometric security systems on phones has helped the public overcome their initial resistance to them.

“Two things have changed. Because biometrics have become ubiquitous with mobile devices, consumers can use them on a daily basis and most choose to do so to access their device and apps. Whether it’s putting their finger on the fingerprint reader or placing the camera in front of their face, they realise it’s a more convenient method than typing in a PIN or a password.

“Also, consumers understand it is a more secure way of authenticating. We’ve all read in the news about widespread hacks, and fraudsters gaining access to databases of compromised credentials. That creates a lot of anxiety and drives the shift to biometrics,” says Beranek.

Due to Nuance’s long history with biometrics, the firm now has a wide range of tools for organisations to deploy. Barclays, for example, first used Nuance’s voice biometrics verification for its wealth management arm in 2013, then, after positive customer feedback, began to roll it out into its retail banking business three years later. Chatbots are another Nuance product for the financial sector. Which solution is adopted should be driven by examination of the customer’s business process, says Beranek.

“More than 500 organisations have deployed Nuance biometric technology so it really is tried and tested. The biggest challenge for our customers is to think through the business processes, and consider how they need to change when moving from a legacy authentication method to biometrics. We can definitely guide organisations on that journey, but the key message is to move away from thinking about it simply from a technology perspective.

“We’ve had reports of significant reductions in fraud losses following the transition to biometric technology – one of our customers in the UK reported £330million in annual fraud savings, so not only is it more secure, it is significantly more secure than those legacy methods.”

Notwithstanding the huge benefits of convenience, Beranek says Nuance’s approach to biometrics is primarily security focussed.

“Smartphone manufacturers have done wonders for the biometric industry, but their approach has been very much a convenience play – to make it easier for you to access your device. We look to enable enterprises such as banks, telecoms and government organisations to ensure they can prevent fraud. So it’s about finding ways to mitigate fraud first, then providing a convenient way for consumers to access services,” he says.

“How to validate consumer identities is an important consideration for a fintech because usually they don’t have a network of branches where a customer can walk in and present themselves and their ID. And so the ability to validate identities online, in a digital format, using biometrics, becomes key to their business model.”


 

This article was published in The Fintech Magazine: Issue #16, Page 72-73
