Voice Assistants: Congrats It's A Girl
I have just had a baby girl. I mean it is probably worth noting my wife played some part in her gestation and delivery, but as a modern progressive couple I’ll assume a minimum of 50 percent of the credit.
Her arrival has made me consider what the world holds in store for this little female version of me. As I bark at Siri, holding my daughter in the dark, for a “how-to” video on baby swaddling, I suddenly feel unsettled.
Why Are Voice Assistants Female?
As it becomes second nature to bark orders at the ‘person in our pocket’, does it matter that this person seems to be a she? Over 710 million people regularly use an AI assistant and this is set to more than double by 2021. Should I be celebrating the female position as an oracle, or furious that my daughter has been assigned the role of assistant in the world’s future?
It could be argued that assigning gender to an AI assistant is a moot point. When asked, Siri, Alexa, Cortana and Google Home all respond with claims of gender neutrality.
This claim might be easier to accept if they didn’t all have a female voice as default and, with one exception, names whose origins are hardly genderless – Cortana is named after a semi-nude female character from the video game Halo, and Alexa was chosen over Alex as a homage to the Great Library of Alexandria. Sexy librarian anyone?
It is perhaps little wonder that no one seems to have considered whether the use of a female voice could have any downsides. The share of women in computing roles has declined steadily since its peak in 1991, and women now represent less than 25 percent of that workforce. Evidently, the Adams far outweigh the Eves in the creation and development of this technology.
Microsoft researcher Margaret Mitchell refers to this imbalance as the ‘sea of dudes’. Is it possible that they are guilty of trying to realise their Weird Science, Her and Ex Machina fantasies or is there another explanation for the female AI assistants?
Numerous studies, including one by Karl MacDorman, associate professor at Indiana University, assert that both men and women report an overwhelming preference for a female voice. With such strong evidence suggesting a mutual preference, what risks are there in reflecting this in our AI assistants? Well, as this technology integrates itself into our pockets and our homes, will children everywhere have a new female role model? A servant who does exactly as she is told.
“Please” and “thank you” are unnecessary, when “Alexa, Stop!” will do just fine, thank you very much. As my little one grows up interacting with such technology, is there a chance that a female voice is the one she associates with subservience?
Why not use male and female voices? Clifford Nass, in his seminal work ‘Wired for Speech’, suggests voice preference is in fact more context-dependent than MacDorman’s findings would have us believe. He found participants preferred a male voice when learning and a female voice when receiving life advice. It is hard to ignore the inherent stereotyping in these findings, but let’s focus on the idea that the task requested should play a key part in which voice is most effective.
Training Our "Bots" To Be Better
In fact, an AI assistant’s awareness of which voice is most appropriate might be closer than you think. Amazon has recently announced research into voice recognition to detect a person’s mood and even the kind of words they use. Will we soon be in a place where Alexa selects her voice to suit our mood and the message being delivered? Maybe we can rely on big data, as opposed to the sea of dudes, to decide the gender of our AI assistants. I mean, what could possibly go wrong?
It turns out quite a lot. Microsoft’s Twitter bot Tay was designed to learn from fellow Twitter users and create its own Tweets based on its interactions. Users, instead of spreading love and acceptance, subverted their interactions, teaching Tay to produce content so racist and abusive that it was taken offline after only 16 hours.
This article was originally published on Mumbrella.