A recent Harvard study, led by Professor Latanya Sweeney, examined Google searches and found evidence of significant racial bias in society, reflected in which automatically generated ads appeared depending on the perceived race of the name being Googled.
To be clear, the study didn't find that Google itself was racist. Instead, it used Google ads (the automatically generated sponsored results that appear whenever you search a term with the search engine) as a mirror to examine society. And through the Google looking glass, Professor Sweeney found "significant discrimination" in the results.
The study looked at Google.com's search engine and the search function of Reuters.com, which also displays Google search ads. It found that names associated with black people were 25 percent more likely to return results that included a link to search for a criminal record. For example, a search for "Kareem" was more likely to bring up related search ads reading "Arrested?" for websites that provide criminal background checks, according to BBC News.
Sweeney calculated less than a 1 percent probability that the findings were due to chance rather than to algorithmic tailoring, in which Google favors the ad text that users actually click on for a given search. "Over time, as people tend to click one version of ad text over others, the weights change," said Professor Sweeney.
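The weighting mechanism Sweeney describes can be illustrated with a small sketch. This is a toy model, not Google's actual (proprietary) ad-serving algorithm: the class, the ad texts, and the click counts below are all hypothetical. The point it shows is the feedback loop: each wording of an ad carries a weight, the ad is served in proportion to those weights, and every click raises the clicked wording's weight, so past user behavior shapes future results.

```python
class AdSelector:
    """Toy model of click-weighted ad-text selection (illustrative only)."""

    def __init__(self, templates):
        # Every wording of the ad starts with an equal weight.
        self.weights = {t: 1.0 for t in templates}

    def serving_probabilities(self):
        # Each wording is served in proportion to its current weight.
        total = sum(self.weights.values())
        return {t: w / total for t, w in self.weights.items()}

    def record_click(self, template):
        # A click makes that wording more likely to be shown next time.
        self.weights[template] += 1.0


# Hypothetical ad texts for the "Kareem" example from the article.
ads = AdSelector(["Kareem, Arrested?", "Contact Kareem"])
print(ads.serving_probabilities())  # both wordings start at 0.5

# Suppose users click the "Arrested?" wording 30 times and the neutral
# wording 10 times. The biased input comes from users, not the algorithm.
for _ in range(30):
    ads.record_click("Kareem, Arrested?")
for _ in range(10):
    ads.record_click("Contact Kareem")

print(ads.serving_probabilities())
# The clicked-more wording now dominates: 31/42 vs 11/42 of impressions.
```

The mechanism itself is neutral; as the study argues, the discriminatory output emerges when the algorithm faithfully amplifies biased click patterns.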
The research suggests steps Google could actively take to prevent a vicious cycle in which perceived racial bias generates more racial bias through automatic search results, exploring "how ad and search technology can develop to assure racial fairness," as Sweeney put it. "In the broader picture, technology can do more to thwart discriminatory effects and harmonize with societal norms."