Google Cuts Racy Results by 30 Percent for Searches Like 'Latina Teenager'

Besides "latina teenager," other queries showing changed results include "la chef lesbienne", "college dorm room", "latina yoga instructor", and "lesbienne bus".

Photo Credit: Reuters

Google has rolled out AI software BERT to better interpret search queries

Highlights
  • Google searches had returned sets of over-sexualized results
  • Google will use AI called MUM to better detect sensitive queries
  • The company has already cut sexualised results for "Black girls"

When US actress Natalie Morales carried out a Google search for "Latina teen" in 2019, she described in a tweet that all she encountered was pornography.

Her experience may be different now.

The Alphabet unit has cut explicit results by 30 percent over the past year in searches for "latina teenager" and others related to ethnicity, sexual preference and gender, Tulsee Doshi, head of product for Google's responsible AI team, told Reuters on Wednesday.

Doshi said Google had rolled out new artificial intelligence software, known as BERT, to better interpret when someone was seeking racy results or more general ones.

Besides "latina teenager," other queries now showing different results include "la chef lesbienne," "college dorm room," "latina yoga instructor" and "lesbienne bus," according to Google.

"It's all been a set of over-sexualized results," Doshi said, adding that those historically suggestive search results were potentially shocking to many users.

Morales did not immediately respond to a request for comment through a representative. Her 2019 tweet said she had been seeking images for a presentation, and had noticed a contrast in results for "teen" by itself, which she described as "all the normal teenager stuff," and called on Google to investigate.

The search giant has spent years addressing feedback about offensive content in its advertising tools and in results from searches for "hot" and "ceo." It also cut sexualised results for "Black girls" after a 2013 journal article by author Safiya Noble raised concerns about the harmful representations.

Google on Wednesday added that in the coming weeks it would use AI called MUM to better detect when to show support resources related to suicide, domestic violence, sexual assault and substance abuse.

MUM should recognize "Sydney suicide hot spots" as a query about jumping locations rather than travel, and should help with longer questions, including "why did he attack me when i said i dont love him" and "most common ways suicide is completed," Google said.

© Thomson Reuters 2022
