Microsoft Upgrades Its Face Recognition Tools to Work Better With Darker Skin Tones

By Indo-Asian News Service | Updated: 27 June 2018 18:27 IST

Addressing two weak points in currently available face recognition technologies, Microsoft has updated its facial recognition tools so that they can better identify people with darker skin tones than before.

With the new improvements, error rates for men and women with darker skin were reduced by as much as 20 times.

"For all women, the error rates were reduced by nine times. Overall, with these improvements, they were able to significantly reduce accuracy differences across the demographics," Microsoft said in a blog post written by John Roach late on Tuesday.


Currently, facial recognition tools tend to perform best on men with lighter skin and worst on women with darker skin.


"That improvement addresses recent concerns that commercially available facial recognition technologies more accurately recognised gender of people with lighter skin tones than darker skin tones, and that they performed best on males with lighter skin and worst on females with darker skin," Roach wrote.

The higher error rates on females with darker skin highlight an industry-wide challenge: Artificial Intelligence (AI) technologies are only as good as the data used to train them.


If a facial recognition system is to perform well across all people, the training dataset needs to represent a diversity of skin tones as well as factors such as hairstyle, jewellery and eyewear.
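The kind of disaggregated evaluation described here, reporting error rates separately for each demographic group rather than a single overall accuracy, can be sketched roughly as follows. The function name, data schema and group labels are illustrative assumptions, not Microsoft's actual tooling.

```python
# Minimal sketch (assumed schema, not Microsoft's internal tooling) of
# measuring a gender classifier's error rate per demographic subgroup.
from collections import defaultdict

def error_rates_by_group(samples):
    """samples: iterable of dicts with 'group', 'label', 'prediction' keys."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for s in samples:
        totals[s["group"]] += 1
        if s["prediction"] != s["label"]:
            errors[s["group"]] += 1
    return {g: errors[g] / totals[g] for g in totals}

# A low overall error rate can hide a much higher rate for one subgroup,
# e.g. darker-skinned women, which is exactly what per-group reporting exposes.
samples = [
    {"group": "lighter-skinned male", "label": "male", "prediction": "male"},
    {"group": "darker-skinned female", "label": "female", "prediction": "male"},
    {"group": "darker-skinned female", "label": "female", "prediction": "female"},
]
print(error_rates_by_group(samples))
```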

The team responsible for the development of facial recognition technology at Microsoft, which is available to customers as the Face API via Azure Cognitive Services, worked with experts on bias and fairness across Microsoft to improve a system called the gender classifier, focusing specifically on getting better results for all skin tones.
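For reference, the Face API mentioned above is exposed as a REST service on Azure. A minimal sketch of calling its detect endpoint from Python is shown below; the placeholder endpoint, subscription key and the gender attribute (offered around the time of this article, and since restricted by Microsoft) reflect a typical 2018-era setup and are assumptions, not details taken from the article.

```python
# Minimal sketch of calling the Azure Face API 'detect' endpoint and
# requesting the 'gender' attribute. ENDPOINT and KEY are placeholders.
import requests

ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com"  # placeholder
KEY = "<your-subscription-key>"                                  # placeholder

def detect_faces(image_url):
    response = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "gender"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": image_url},
    )
    response.raise_for_status()
    return response.json()  # list of detected faces with their attributes

# Example usage (hypothetical image URL):
# for face in detect_faces("https://example.com/photo.jpg"):
#     print(face["faceAttributes"]["gender"])
```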


"We had conversations about different ways to detect bias and operationalise fairness. We talked about data collection efforts to diversify the training data. We talked about different strategies to internally test our systems before we deploy them," said Hanna Wallach, Senior Researcher in Microsoft's New York research lab.

Wallach and her colleagues provided "a more nuanced understanding of bias," said Cornelia Carapcea, a Principal Programme Manager on the Cognitive Services team, and helped her team create a more robust dataset "that held us accountable across skin tones."
