From Fast Company:
Voice assistants like Apple’s Siri and Amazon’s Alexa are women rather than men. You can change this in the settings and choose a male speaker, of course, but the fact that the technology industry has chosen a woman, by default, to be our always-on-demand personal assistant speaks volumes about our assumptions as a society: women are expected to carry the psychic burden of schedules, birthdays, and phone numbers; they are the more caregiving sex; they should nurture and serve. Besides, who wants to ask a man for directions? He’ll never pull over at a gas station if he’s lost!
But what many people, myself included, have missed in the gender criticism of personal assistants is that the choice of voice was binary to begin with, even though so much of the world identifies outside that schema. This oversight is exactly what Q is trying to fix. Q claims to be the world’s first genderless voice for AI systems. It was developed by the creative studio Virtue Nordic and the human rights festival Copenhagen Pride, in conjunction with social scientist Julie Carpenter. The project had no client; it was born from a design exploration inside Virtue Nordic and snowballed from there.
. . . .
Now, voice assistants are often gender-specific for a reason. Companies test these computer voices on users and listen to the results of those tests. At Amazon, users preferred Alexa as a woman rather than a man, and that relatively small sample was extrapolated to represent Alexa for everyone. Research has shown, too, that men and women alike report female voices as more “welcoming” and “understanding” than male voices, and it’s easy to see why any company would want those qualities in its always-listening voice assistant. But these companies and researchers tested only male and female voices, and testing a narrow set of options on a limited number of users isn’t the best way to build representational technology.
Link to the rest at Fast Company