If you’re one of the 1.5 billion people who use Gmail, you might have noticed last year that your emails suddenly started writing themselves. But did you notice that the autocomplete feature never uses gendered pronouns like she/he or him/her?
In May, Google introduced Smart Compose, which helps users finish sentences. Another feature called Smart Reply generates quick, automatic responses to emails, including phrases like “No problem!” and “Unfortunately, I can’t make it.”
In November, Reuters reported that a Google researcher had discovered the potential for bias when he typed “I am meeting an investor next week” and Smart Compose suggested, “Do you want to meet him?” instead of “her,” according to Gmail product manager Paul Lambert.
As a result, Google decided Smart Compose and Smart Reply would not suggest gendered pronouns at all, Reuters reported.
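One way to implement that policy is a simple filter over candidate completions. The sketch below is an illustration only, with hypothetical candidates; a real suggestion system ranks completions with a language model before any such filter runs.

```python
# Minimal sketch of suppressing gendered pronouns in autocomplete
# suggestions, in the spirit of the policy Reuters described.
# The candidate list is hypothetical.

GENDERED_PRONOUNS = {"he", "she", "him", "her", "his", "hers",
                     "himself", "herself"}

def filter_suggestions(candidates):
    """Drop any candidate completion that contains a gendered pronoun."""
    safe = []
    for text in candidates:
        words = {w.strip(".,!?").lower() for w in text.split()}
        if words.isdisjoint(GENDERED_PRONOUNS):
            safe.append(text)
    return safe

candidates = [
    "Do you want to meet him?",
    "Do you want to meet her?",
    "Do you want to meet them?",
]
print(filter_suggestions(candidates))  # only the gender-neutral completion survives
```

Blocking the whole pronoun class, rather than trying to guess the right pronoun, trades a little fluency for zero misgendering risk.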
. . . .
According to Nick Haynes, director of data science at Automated Insights, a company that specializes in natural language generation software, gender-pronoun correctness is a priority for tech companies because gender is a sensitive issue in today's cultural climate.

“Despite the tremendous recent growth and hype around artificial intelligence (AI) applications, many people are understandably suspicious of AI,” said Haynes. “This makes the process of building trust with users a critical part of deploying an AI system, but misgendering a person can be a glaring mistake that can quickly erode a user’s trust in an entire product or company.”
Haynes said pronouns are tricky because the English language is often ambiguous when it comes to gender. Names like Taylor and Leslie can be unisex, whereas nouns like doctor or secretary often carry gendered connotations even though they’re not explicitly gendered, he said.
“Because AI is built and trained by humans, AI systems inherit the same challenges and biases in the use of language that its human creators and users experience,” said Haynes.
. . . .
Programs like Smart Compose are created with natural language generation, a method by which computers analyze the relationships between words in text written by humans and learn to write sentences of their own.
“The (process) successfully captures analogy relations, such as ‘Man is to king as woman is to queen.’ However, the same (process) also yields ‘Man is to doctor as woman is to nurse’ and ‘Man is to computer programmer as woman is to homemaker,’” said Londa Schiebinger, professor of History of Science at Stanford University and author of a case study on gender and ethnic bias in machine learning algorithms. “Taking no action means that we may relive the 1950s indefinitely.”
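The analogy arithmetic Schiebinger describes comes from word embeddings: words are represented as vectors, and “a is to b as c is to ?” is answered by finding the word nearest to b − a + c. The toy vectors below are hand-crafted to make the effect visible, not learned; real systems (e.g. word2vec) learn hundreds of dimensions from large text corpora, and the biased analogies she cites emerge from patterns in that training text.

```python
# Toy two-dimensional "embeddings": dimension 0 loosely encodes gender,
# dimension 1 loosely encodes the rest of the word's meaning.
import math

emb = {
    "man":    (1.0, 0.0),
    "woman":  (-1.0, 0.0),
    "king":   (1.0, 1.0),
    "queen":  (-1.0, 1.0),
    "doctor": (0.9, 0.7),   # deliberately skewed toward the "male" direction,
    "nurse":  (-0.9, 0.7),  # mimicking bias absorbed from training data
}

def analogy(a, b, c):
    """Return the word nearest (by cosine similarity) to emb[b] - emb[a] + emb[c]."""
    ax, ay = emb[a]
    bx, by = emb[b]
    cx, cy = emb[c]
    tx, ty = bx - ax + cx, by - ay + cy

    def cos(v):
        return (v[0] * tx + v[1] * ty) / (math.hypot(*v) * math.hypot(tx, ty))

    return max((w for w in emb if w not in (a, b, c)), key=lambda w: cos(emb[w]))

print(analogy("man", "king", "woman"))    # queen
print(analogy("man", "doctor", "woman"))  # nurse — the biased analogy
```

Because the vectors are only as neutral as the text they were trained on, the same arithmetic that produces “queen” also produces “nurse.”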
. . . .
Agolo, a New York-based startup, uses artificial intelligence to summarize business documents. It is difficult for the company’s technology to reliably determine which pronoun goes with which name, said chief technology officer Mohamed AlTantawy. To help with accuracy, the company’s program pulls as much context from the document as possible.
“The rule here is if any task is intellectually hard for humans, it’s also hard to solve using AI,” said AlTantawy.
For example, take the sentence: “Andy and Alex met yesterday when she gave him the gift.”
“You have no way of knowing the gender of Andy or Alex,” said AlTantawy. “You would assume that Andy is a female because that name appeared first in the sentence.”
But additional context helps: “Andy and Alex met yesterday when she gave him the gift. Alex is a great mother.”
“Now this changed everything! It turns out that Alex is the female,” AlTantawy explained.
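The disambiguation step AlTantawy walks through can be sketched as a toy rule: scan each sentence that mentions a name for an explicitly gendered noun. This hand-rolled heuristic and its noun list are illustrations only; production coreference resolution is statistical and far more involved.

```python
# Toy sketch: infer a name's gender from gendered nouns ("mother",
# "father", ...) appearing in sentences that mention the name.
# Illustration only — real coreference systems are statistical.
import re

GENDERED_NOUNS = {
    "mother": "female", "father": "male",
    "sister": "female", "brother": "male",
    "wife": "female", "husband": "male",
}

def infer_gender(name, text):
    """Guess a name's gender from gendered nouns in sentences mentioning it."""
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        if name in sentence:
            lowered = sentence.lower()
            for noun, gender in GENDERED_NOUNS.items():
                if re.search(r"\b" + noun + r"\b", lowered):
                    return gender
    return "unknown"

text = ("Andy and Alex met yesterday when she gave him the gift. "
        "Alex is a great mother.")
print(infer_gender("Alex", text))  # female — the second sentence settles it
print(infer_gender("Andy", text))  # unknown from these rules alone
```

Once Alex is resolved as “she,” a fuller system could propagate that and bind “him” to Andy, which is exactly the kind of chained inference that makes the task, in AlTantawy’s words, hard for AI because it is hard for humans.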
As another layer of fact-checking, Agolo also uses a database of known facts about companies, including their headquarters, products, and the names and genders of prominent employees, AlTantawy said.
. . . .
Some autocomplete suggestions might not offend people but still have gender-related implications. In its iMessage application, Apple suggests “policemen” to complete “police” and “salesman” for “sales,” for example.
When you type the gender-neutral Korean sentence “Geubun-eun gyosu ibnida” into Google Translate, it gives you “He is a professor” in English. So do Microsoft’s translator app and Alibaba’s Language Service.
In December, Google published a press release that said the company was addressing gender bias by providing feminine and masculine translations for some gender-neutral words.
“Now you’ll get both a feminine and masculine translation for a single word — like ‘surgeon’ — when translating from English into French, Italian, Portuguese or Spanish. You’ll also get both translations when translating phrases and sentences from Turkish to English. For example, if you type ‘O bir doktor’ in Turkish, you’ll now get ‘She is a doctor’ and ‘He is a doctor’ as the gender-specific translations,” the statement reads.
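The idea in the press release — surface every gender-specific rendering instead of silently picking one — can be sketched with a toy lookup. The phrase table below is hypothetical; Google’s actual system is neural machine translation, not a dictionary.

```python
# Minimal sketch of "show both translations" for gender-neutral source
# phrases. The tiny phrase table is hypothetical, for illustration only.

NEUTRAL_PHRASES = {
    # Turkish "O" is gender-neutral, so a faithful English rendering is ambiguous.
    "O bir doktor": ("She is a doctor", "He is a doctor"),
    "O bir profesör": ("She is a professor", "He is a professor"),
}

def translate(phrase):
    """Return all gender-specific translations rather than silently picking one."""
    if phrase in NEUTRAL_PHRASES:
        feminine, masculine = NEUTRAL_PHRASES[phrase]
        return [f"{feminine} (feminine)", f"{masculine} (masculine)"]
    return [phrase]  # fall through for phrases the toy table doesn't cover

print(translate("O bir doktor"))
```

Returning both options pushes the gender decision back to the user, who has context the translator lacks.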