Fearful of bias, Google blocks gender-based pronouns from its new artificial intelligence tool


Alphabet Inc's Google launched in May a Gmail feature that automatically completes sentences for users as they type. Type "I love" and Gmail might propose "you" or "it."

But users are out of luck if the object of their affection is "him" or "her."

Google's technology will not suggest gender-based pronouns because the risk is too high that its Smart Compose feature might predict someone's sex or gender identity incorrectly and offend users, product leaders told Reuters in interviews.

Gmail product manager Paul Lambert said a company scientist discovered the problem in January when he typed "I am meeting an investor next week," and Smart Compose suggested a possible follow-up question: "Do you want to meet him?" instead of "her."

Consumers have grown accustomed to embarrassing autocorrect gaffes on their smartphones. But Google has declined to take chances at a time when gender issues are reshaping politics and society, and critics are scrutinizing potential biases in artificial intelligence like never before.

"Not all screw-ups are equal," Lambert said. Gender is "a big, big thing" to get wrong.


Getting Smart Compose right could be good for business. Demonstrating that Google understands the nuances of AI better than its competitors is part of the company's strategy of building affinity with its brand and attracting customers to its cloud computing tools, advertising services and hardware.

Gmail has 1.5 billion users, and Lambert said Smart Compose assists with 11% of messages sent worldwide where the feature has launched.

Smart Compose is an example of what AI developers call natural language generation (NLG), in which computers learn to write sentences by studying patterns and relationships between words in literature, emails and web pages.

A system shown billions of human sentences becomes adept at completing common phrases, but it is limited by generalities. Men have long dominated fields such as finance and science, for example, so the technology would conclude from its data that an investor or engineer is "he" or "him." The issue trips up almost every major technology company.
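How such a system absorbs bias from its training text can be illustrated with a toy bigram model. This is a deliberately simplified sketch, not Google's system: because the tiny corpus below mentions male investors more often than female ones, the top-ranked continuation after "said" comes out male.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count, for each word, which words follow it in the corpus."""
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def suggest(model, word):
    """Propose the most frequent continuation seen in training."""
    followers = model[word.lower()]
    return followers.most_common(1)[0][0] if followers else None

# A tiny, deliberately skewed corpus: male investors dominate,
# just as male-dominated text skews real NLG training data.
corpus = [
    "the investor said he would call",
    "an investor said he liked the pitch",
    "the investor said she would call",
]
model = train_bigrams(corpus)
print(suggest(model, "said"))  # prints "he"
```

The model has no notion of gender; it simply mirrors whatever imbalance its training data contains, which is the core of the problem the article describes.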

Lambert said the Smart Compose team of about 15 engineers and designers tried several workarounds, but none proved bias-free or worthwhile. They decided the best solution was the strictest one: limit coverage. The gender-pronoun ban affects fewer than 1% of the cases in which Smart Compose proposes anything, Lambert said.
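Lambert's "limit coverage" approach can be approximated by a final filter that drops any candidate completion containing a gendered pronoun. The sketch below is an assumption about how such a guard might look, not Google's implementation:

```python
# Words that disqualify a suggestion under a conservative policy.
GENDERED_PRONOUNS = {"he", "she", "him", "her", "his", "hers",
                     "himself", "herself"}

def filter_suggestions(candidates):
    """Keep only completions that contain no gendered pronoun."""
    safe = []
    for text in candidates:
        words = {w.strip(".,!?").lower() for w in text.split()}
        if words.isdisjoint(GENDERED_PRONOUNS):
            safe.append(text)
    return safe

print(filter_suggestions([
    "Do you want to meet him?",
    "Do you want to meet them?",
    "Sounds good.",
]))  # prints ['Do you want to meet them?', 'Sounds good.']
```

The trade-off is exactly the one the article notes: the filter never predicts the wrong gender, at the price of suppressing some otherwise useful suggestions.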

"The only reliable technique we have is to be conservative," said Prabhakar Raghavan, who oversaw the engineering of Gmail and other services until a recent promotion.


Google's decision to play it safe on gender follows some high-profile embarrassments involving the company's predictive technologies.

The company apologized in 2015 when the image-recognition feature of its photo service labeled a black couple as gorillas. In 2016, Google changed its search engine's autocomplete function after it suggested the anti-Semitic query "are jews evil" when users sought information about Jews.

Google has banned expletives and racial slurs from its predictive technologies, as well as mentions of its business rivals or tragic events.

The company's new policy banning gendered pronouns also affected the list of possible responses in Google's Smart Reply. That service lets users respond instantly to text messages and emails with short phrases such as "sounds good."

Google uses tests developed by its AI ethics team to uncover new biases. A spam and abuse team pokes at the systems, trying to find "juicy" gaffes by thinking like hackers or journalists, Lambert said.

Workers outside the United States look out for local cultural issues. Smart Compose will soon work in four other languages: Spanish, Portuguese, Italian and French.

"You need a lot of human oversight," said engineering leader Raghavan, because "in each language, the net of inappropriateness has to cover something different."


Google is not the only technology company wrestling with the gendered-pronoun problem.

Agolo, a New York startup that has received investments from Thomson Reuters, uses AI to summarize business documents.

Its technology cannot reliably determine in some documents which pronoun goes with which name. So the summary pulls in several sentences to give users more context, said Mohamed AlTantawy, Agolo's chief technology officer.

He said slightly longer copy is better than missing details. "The smallest mistakes will make people lose confidence," AlTantawy said. "People want 100 percent correct."

Still, imperfections remain. Predictive keyboard tools developed by Google and Apple propose the gendered "policeman" to complete "police" and "salesman" for "sales."

Type the neutral Turkish phrase "one is a soldier" into Google Translate and it spits out "he is a soldier" in English. The same goes for the translation tools of Alibaba and Microsoft Corp. Amazon.com Inc opts for "she" for the same phrase in the translation service it offers to cloud computing customers.

Experts in artificial intelligence have called on the companies to display a disclaimer and multiple possible translations.

Microsoft's LinkedIn said it avoids gendered pronouns in its predictive messaging tool, Smart Replies, to head off potential blunders.

Alibaba and Amazon did not respond to requests for comment.

Caveats and limitations like those in Smart Compose remain the most commonly used defensive measures in complex systems, said John Hegele, integration engineer at Automated Insights, a Durham, North Carolina-based firm that generates news stories from statistics.

"The end goal is a fully machine-generated system that magically knows what to write," Hegele said. "There have been a ton of advances, but we are not there yet."

