Nonbinary Voice Assistants – Overcoming Gender Bias in Tech

Published on: January 21st, 2021 | Categories: voice & assistants

In a world where technology is growing exponentially, artificial intelligence is becoming smarter every day, and the number of voice assistants is expected to triple over the next few years, the topic of gender bias in tech is gaining attention. The question is: do voice assistants have a gender bias problem, and if so, what can we do about it?

The female voice is thought to be comforting, welcoming, and generally preferred. These are the main arguments voice assistant providers and researchers use to justify why so many voice assistants are female. They say that, overall, people simply like the female voice better, and you know what? They are not wrong.

Within the scope of my master’s thesis, I had the opportunity to investigate nonbinary voice assistants, and in particular how users perceive gender stereotypes in digital assistants. I am happy to share the results with you in this article.

 

Why are most voice assistants female? 

We have to ask ourselves why almost all voice assistants have female voices by default. If I asked you to picture a secretary in your mind, it would most likely be a woman, am I right? But that is not the case in every country. In the UK, for example, the default gender for Siri is male. If you think about servants in British movies, this makes sense: high-class servants were typically male butlers. This example also plays into gender stereotypes, and it supports the idea that there is an underlying reason for the predominance of female voice assistants.

With the wide adoption of voice assistants, our interaction with technology has changed, and with it the risk of highlighting and reinforcing negative stereotypes. For the ongoing advancement of human-computer interaction, it is crucial to develop a better understanding and awareness of the reciprocity between gender and technology, and more specifically voice assistants. In the course of the master’s thesis, we expanded on prior research by including a nonbinary option as a means to avoid gender stereotypes in voice assistants.

Also interesting: Why Google Assistant Is Leading The Voice Assistant Race


 

Why do gender stereotypes play a role in tech?

There are three theoretical concepts that help us understand why gender stereotypes play such an important role. 

Firstly, anthropomorphism is concerned with people attributing human-like mental capacities (e.g., thinking and feeling) to computers or other machines (e.g., virtual assistants). 

Secondly, the predominance of female voice assistants relates to the theory of “computers are social actors”, which holds that “individuals mindlessly apply social rules and expectations to computers”1. This is why we consider a system that interrupts us rude, a system that mispronounces a common word stupid, and a system that does not remember what we just said forgetful.

So what? Why do we even care about the attributes we give to an object or a system? This is where the third concept comes in, and it is the one with potentially negative effects.

Gender stereotypes are not only descriptive but also prescriptive. This means that the gender stereotypes people “ascribe to women and men tend also to be the ones that are required of women and men”2. Society views women as warm and caring, which is why women are expected to match this prescription. This is why female agents, whether embodied as robots or disembodied as voice assistants, are considered comforting and welcoming and are expected to assist us in solving our problems. Male agents, on the other hand, are considered assertive and authoritative; they tell us the answer to our problems and act as the source of truth and information.

 

By the way, if you’d like to learn more about how to create meaningful conversations between humans and machines, check out our whitepaper.


 

Introducing nonbinary voice assistants – Are they the solution to the gender bias problem?

But how does that concept apply to nonbinary voice assistants? What if we used an agent that is not defined by gender norms? Over the past several decades, the LGBTQ, feminist, and other social movements have challenged binary gender categories. Judith Butler, for example, famously argued that gender is a social construct and only exists when it is performed3.

The outcomes of the thesis explain how nonbinary voice assistants are perceived in relation to male and female voice assistants. The aim is to challenge existing views on (mainly female) voice assistants and to underline the importance of a dialogue between technology and society.

Indeed, the results of our quantitative research show that nonbinary voice assistants do not elicit gender-stereotypical trait attribution, and we therefore reason that using such voice assistants disrupts the reinforcement of negative stereotypes. We found that, contrary to previous studies, this also applies to male voice assistants. Female voice assistants, however, which are prevalent in most commercially used systems, remain closely tied to gender-stereotypical trait attribution.

By responsibly designing voice assistants in a way that does not amplify gender stereotypes, we can mitigate gender bias.

If you are interested in listening to an example of a nonbinary voice, meet Q, the first genderless voice.

 

Conclusion

Removing bias in AI and preventing it from widening the gender gap is a monumental challenge. But it is not impossible. There are numerous projects that have the common goal of making AI fairer and less biased.

It is important to point out that using nonbinary voices for digital assistants will certainly not remove gender stereotypes from society on its own. Rather, it is a small step toward reducing gender disparities. Still, offering voices that are not constrained by a binary gender construct makes technology a tool to shape society, rather than a weapon that amplifies these biases by re-creating them in technological artifacts.

If we include AI in a digital product, it is every stakeholder’s responsibility to ensure that it does not discriminate against or harm people.

 

If you have any questions, feedback or remarks on the topic, feel free to contact us. You can learn more about our Conversational AI solutions here. 

 

Further resources: https://multivocal.org

 

Sources:

1 Nass, C., & Moon, Y. (2000). Machines and Mindlessness: Social Responses to Computers. Journal of Social Issues, 56(1), 81–103. Retrieved from http://www.coli.uni-saarland.de/courses/agentinteraction/contents/papers/Nass00.pdf
2 Prentice, D. A., & Carranza, E. (2002). What women and men should be, shouldn’t be, are allowed to be, and don’t have to be: The contents of prescriptive gender stereotypes. Psychology Of Women Quarterly, 26, 269–281.
3 Butler, J. (1990). Gender Trouble: Feminism and the Subversion of Identity. New York: Routledge.
