OAKLAND, California, March 30 (Reuters) – When American actress Natalie Morales did a Google search for "Latina teen" in 2019, she said in a tweet that all she encountered was pornography.

Her experience may be different now.

Alphabet Inc (GOOGL.O) cut explicit results by 30% over the past year for searches related to ethnicity, sexual orientation and gender, including "Latina teen," Tulsee Doshi, product lead for the team in charge of Google's artificial intelligence, told Reuters on Wednesday.

Doshi said Google had rolled out new artificial intelligence software, known as BERT, to better interpret when someone was seeking racy results or more general ones.

Besides "latina teenager," other queries that now show different results include "la chef lesbienne," "college dorm room," "latina yoga instructor" and "lesbienne bus," according to Google.

"It was all a set of oversexualized results," Doshi said, adding that those historically suggestive search results were potentially shocking to many users.

Morales did not immediately respond to a request for comment made through a spokesperson. Her 2019 tweet said she had been looking for images for a presentation and had noticed a contrast with the results for "teen" by itself, which she described as "all normal teenage things," and she called on Google to investigate.

The search giant has spent years addressing feedback about offensive content in its advertising tools and in results from searches for "hot" and "CEO." It also cut sexually explicit results for "Black girls" after a 2013 journal article by author Safiya Noble raised concerns about harmful representations.
Google added on Wednesday that in the coming weeks it would use an artificial intelligence model called MUM to better detect when it should show support resources related to suicide, domestic violence, sexual assault and substance abuse.

MUM should recognize "Sydney suicide hot spots" as a query about jumping locations, not travel, and help with longer questions such as "why did he attack me when i said i dont love him" and "most common ways suicide is completed," Google said.

Reporting by Paresh Dave; Editing by Karishma Singh

Our Standards: The Thomson Reuters Trust Principles.