News

Top facial recognition algorithms found to be biased, NIST study shows

Tuesday 24 December 2019 08:45 CET | News

The National Institute of Standards and Technology (NIST) has found ‘empirical evidence’ that characteristics such as age, gender, and race impact accuracy for most algorithms.

The study revealed that algorithms currently on the market can misidentify members of some demographic groups up to 100 times more frequently than others. NIST tested 189 algorithms from 99 organisations, which together power most of the facial recognition systems in use globally.

As a result, many of the world’s most advanced facial recognition algorithms are not ready for use in critical areas such as law enforcement and national security. According to The Verge, citing The Washington Post, lawmakers called the study ‘shocking’ and urged the US government to reconsider plans to use the technology to secure its borders.

The NIST study relied on organisations voluntarily submitting their algorithms for testing. However, Amazon, which sells its Rekognition software to local police and federal investigators, was absent from the list. The company claims that its software cannot easily be analysed by NIST’s tests, and its shareholders have resisted calls to curb sales of Rekognition.

Specialists say bias in these algorithms could be reduced by training them on a more diverse set of data.



Keywords: facial recognition, biometrics, bias, ethnicity, Rekognition, Amazon, NIST
Categories: Fraud & Financial Crime
Countries: World