He Said, She Said: Why Guessing Gender Pronouns Is a Challenge for Tech Companies like Google

From The Deseret News:

If you’re one of the 1.5 billion people who use Gmail, you might have noticed last year that your emails suddenly started writing themselves. But did you notice that the autocomplete feature never uses gendered pronouns like she/he or him/her?

In May, Google introduced Smart Compose, which helps users finish sentences. Another feature called Smart Reply generates quick, automatic responses to emails, including phrases like “No problem!” and “Unfortunately, I can’t make it.”

In November, Reuters reported that a Google researcher discovered the potential for bias when he typed “I am meeting an investor next week,” and Smart Compose suggested, “Do you want to meet him?” instead of “her,” according to Gmail product manager Paul Lambert.

As a result, Google decided Smart Compose and Smart Reply would not suggest gendered pronouns at all, Reuters reported.

. . . .

According to Nick Haynes, director of data science at Automated Insights, a company that specializes in natural language generation software, gender-pronoun correctness is a priority for tech companies because gender is such a big deal in today’s cultural climate.

“Despite the tremendous recent growth and hype around artificial intelligence (AI) applications, many people are understandably suspicious of AI,” said Haynes. “This makes the process of building trust with users a critical part of deploying an AI system, but misgendering a person can be a glaring mistake that can quickly erode a user’s trust in an entire product or company.”

Haynes said pronouns are tricky because the English language is often ambiguous when it comes to gender. Names like Taylor and Leslie can be unisex, whereas nouns like doctor or secretary often carry gendered connotations even though they’re not explicitly gendered, he said.

“Because AI is built and trained by humans, AI systems inherit the same challenges and biases in the use of language that its human creators and users experience,” said Haynes.

. . . .

Programs like Smart Compose are created with natural language generation, a method by which computers analyze the relationships between words in text written by humans and learn to write sentences of their own.

“The (process) successfully captures analogy relations, such as ‘Man is to king as woman is to queen.’ However, the same (process) also yields ‘Man is to doctor as woman is to nurse’ and ‘Man is to computer programmer as woman is to homemaker,’” said Londa Schiebinger, professor of History of Science at Stanford University and author of a case study on gender and ethnic bias in machine learning algorithms. “Taking no action means that we may relive the 1950s indefinitely.”
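
The analogy relations Schiebinger describes come from word embeddings, in which software represents each word as a vector of numbers and an analogy reduces to simple vector arithmetic. The sketch below is only an illustration: it assumes the open-source gensim library and its downloadable GloVe vectors, neither of which is mentioned in the article, and Google’s own models are not public. Exact results vary with the embeddings used.

```python
# A minimal sketch of analogy arithmetic over word embeddings.
# Assumes the gensim library and its downloadable GloVe vectors
# (glove-wiki-gigaword-100, roughly 130 MB); the article does not
# say which embeddings or models Google's products actually use.
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-100")  # pretrained word vectors

def analogy(a, b, c, topn=1):
    """Complete 'a is to b as c is to ?' via vector arithmetic (b - a + c)."""
    return [word for word, _ in
            vectors.most_similar(positive=[b, c], negative=[a], topn=topn)]

print(analogy("man", "king", "woman"))        # typically ['queen']
print(analogy("man", "doctor", "woman"))      # often a stereotyped answer such as 'nurse'
print(analogy("man", "programmer", "woman"))  # may also surface stereotyped occupations
```

Nothing in the arithmetic distinguishes the “queen” analogy from the “nurse” one; the vectors simply reproduce whatever associations were present in the text they were trained on.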

. . . .

Agolo, a New York-based startup, uses artificial intelligence to summarize business documents. It is difficult for the company’s technology to reliably determine which pronoun goes with which name, said chief technology officer Mohamed AlTantawy. To improve accuracy, the company’s program pulls as much context from the document as possible.

“The rule here is if any task is intellectually hard for humans, it’s also hard to solve using AI,” said AlTantawy.

For example, take the sentence: “Andy and Alex met yesterday when she gave him the gift.”

“You have no way of knowing the gender of Andy or Alex,” said AlTantawy. “You would assume that Andy is a female because that name appeared first in the sentence.”

But additional context helps: “Andy and Alex met yesterday when she gave him the gift. Alex is a great mother.”

“Now this changed everything! It turns out that Alex is the female,” AlTantawy explained.

As another layer of fact-checking, Agolo also uses a database of known facts about companies, including their headquarters, products, and the names and genders of prominent employees, AlTantawy said.
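
The article doesn’t describe how Agolo’s software actually works, but the idea of pulling context from the surrounding document can be illustrated with a toy heuristic: look for gendered cue words (like “mother” or “father”) in sentences that mention a name, and give up when the cues conflict. This is only a sketch of the general approach AlTantawy describes, not Agolo’s system, and the cue lists are invented for the example.

```python
# A toy illustration of context-based pronoun disambiguation.
# This is NOT Agolo's system; it only shows how an extra sentence
# ("Alex is a great mother.") can resolve an ambiguous name.
# The cue-word sets below are invented for this example.
import re

FEMALE_CUES = {"she", "her", "mother", "sister", "aunt"}
MALE_CUES = {"he", "him", "father", "brother", "uncle"}

def guess_gender(name, text):
    """Guess a name's gender from cue words in sentences that mention it."""
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        if name not in sentence:
            continue
        words = {w.strip(".,!?").lower() for w in sentence.split()}
        female, male = words & FEMALE_CUES, words & MALE_CUES
        if female and not male:
            return "female"
        if male and not female:
            return "male"
    return "unknown"  # cues are absent or conflicting in every mention

text = ("Andy and Alex met yesterday when she gave him the gift. "
        "Alex is a great mother.")

print(guess_gender("Alex", text))  # 'female' -- resolved by the second sentence
print(guess_gender("Andy", text))  # 'unknown' -- the first sentence has cues for both
```

A knowledge base like the one AlTantawy describes would act as a further check when the document itself never resolves the ambiguity.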

. . . .

Some autocomplete suggestions might not offend people but still have gender-related implications. In its iMessage application, Apple suggests “policemen” to complete “police” and “salesman” for “sales,” for example.

When you type the gender-neutral Korean sentence “Geubun-eun gyosu ibnida” into Google Translate, it gives you “He is a professor” in English. So do Microsoft’s translator app and Alibaba’s Language Service.

In December, Google published a press release that said the company was addressing gender bias by providing feminine and masculine translations for some gender-neutral words.

“Now you’ll get both a feminine and masculine translation for a single word — like ‘surgeon’ — when translating from English into French, Italian, Portuguese or Spanish. You’ll also get both translations when translating phrases and sentences from Turkish to English. For example, if you type ‘O bir doktor’ in Turkish, you’ll now get ‘She is a doctor’ and ‘He is a doctor’ as the gender-specific translations,” the statement reads.

Link to the rest at The Deseret News

7 thoughts on “He Said, She Said: Why Guessing Gender Pronouns Is a Challenge for Tech Companies like Google”

  1. The important part is not the pronouns; the important part is that a program has the nerve to attempt to write for a human, and isn’t talking about weather or sports (both of which can be largely written by anything).

    In emails, it will be cut off at the root if it so much as delicately suggests putting a word in my mouth.

    The auto-anything features, when not properly supervised by humans (or when given any leeway at all), wreak havoc with intended meanings and sentence structure. It’s funny sometimes (there are websites dedicated to the results), but always fraught.

    The only time they’re usable is the suggestions feature on my iPhone, which saves typing on that wretched little keyboard, and is sometimes almost prescient – until it isn’t. And I have to battle it back with a pitchfork. Use with extreme caution, and never while under the influence.

    • While, yes, that can be a pitfall, there are ways to work around it. I run into a similar issue with my writing, because I often have multiple characters of the same gender interacting with one another. The issue, as stated in the article, is that it’s not always possible to find or include the additional context that would make it possible to properly gender the recipient or the person being referred to, if the presenter/writer doesn’t already know.

      Overall, though, I think this is a good start, because being misgendered is, for a lot of trans people and intersex individuals, in a lot of ways emotionally disturbing or downright traumatizing. Speaking from experience, gender dysphoria is not fun.

      • The problem is knowing what gender (and there are supposedly far more than just two these days) the other person “identifies” as – for the “fluid” ones, what it is at the particular moment of interaction.

        Sorry, but gender dysphoria is a personal problem. Other people cannot be expected to avoid “disturbing” or “traumatizing” the afflicted, any more than they (note, “they” used in its proper meaning of “multiple people”) can be expected to avoid doing so with someone convinced that they are Napoleon Bonaparte.

        Oh, there is one exception to that general rule – the mental health professional treating the delusion. Sometimes.

      • Transgenderism is a mental illness, and society should not validate the beliefs of those people who claim to be trans.
        In no other case is it expected that people should treat the delusions of the mentally ill as reality.
        If anyone can identify as whatever they want, then what’s stopping me from identifying myself as a chicken? After all, every day thousands of my people are ritually slaughtered, fried in oil, and served between slices of bread; this oppression must end.
