
Research shows voice imitation can be used to breach biometrics

Tuesday 29 September 2015 00:48 CET | News

Researchers at a US university have found that voice-imitation attacks, built from recorded speech samples, could increasingly be used to breach both automated and human authentication systems.

Voice-morphing software will enable hackers to launch these attacks, a team at the University of Alabama at Birmingham (UAB) has found. The team warned that people inadvertently leave voice samples behind as part of daily life.

As a second case study, the team examined the implications of stolen voices for human-to-human communication. The UAB study – a collaborative project between the UAB College of Arts and Sciences Department of Computer and Information Sciences and the Center for Information Assurance and Joint Forensics Research – took audio samples and demonstrated how they could be used to compromise a victim’s security and privacy.

Once an attacker defeats voice biometrics with a fake voice, he or she gains unfettered access to the system – whether a device or a service – that employs the authentication functionality. The researchers’ attacks trumped a majority of advanced voice-verification algorithms, which rejected only 10-20% of the morphed samples. Humans tasked with verifying voice samples rejected, on average, only about half of the morphed clips.
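To make the reported numbers concrete: a speaker-verification system typically compares a similarity score for each voice sample against an acceptance threshold, and the rejection rate is the fraction of attack samples that fall below it. The sketch below is purely illustrative – the scores, threshold, and function names are hypothetical and not from the UAB study.

```python
def rejection_rate(scores, threshold):
    """Fraction of attack samples whose similarity score falls
    below the verifier's acceptance threshold (i.e. rejected)."""
    rejected = sum(1 for s in scores if s < threshold)
    return rejected / len(scores)

# Made-up similarity scores for ten morphed-voice attempts
morphed_scores = [0.82, 0.75, 0.91, 0.55, 0.88, 0.79, 0.60, 0.85, 0.93, 0.70]

rate = rejection_rate(morphed_scores, threshold=0.65)
print(rate)  # 0.2 – only 20% of attack clips rejected, 80% accepted
```

A rejection rate of 0.2 on attack samples corresponds to the upper end of the 10-20% range the researchers report: the verifier lets four out of five morphed clips through.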


Keywords: biometrics, applications, online security, web fraud, digital identity, voice, US
Categories: Fraud & Financial Crime