Much has been said about gender bias in artificial intelligence (AI) and machine learning, technology that mirrors the values and perceptions of its creators. According to the World Economic Forum’s latest Global Gender Gap Report, only 22% of AI professionals globally are women, compared with 78% who are men.
It’s not surprising that many consumers of digital assistant products expect a distinctly male or female voice, because the only choices available are binary. To some, this reinforces a stereotype of women as naturally more supportive, helpful, and well suited to an assistant’s role, like Siri and Alexa, and of men as suited to more authoritative roles, such as military and security applications.
Bias and built-in stereotypes, intentional and not, are only part of the representation picture, however. There is a portion of humanity that is not represented at all. Most technology companies that have developed digital assistants tested only men’s and women’s voices, leaving out transgender voices, and those of nonbinary people who identify as neither male nor female.
Creating more-inclusive AI
Julie Carpenter, Ph.D., is a research scientist and associate research and development principal at Accenture, where her work centers on the study of human-AI interactions. Dr. Carpenter is one of the guiding advisers for Project Q, the world’s first gender-neutral digital assistant, developed by Virtue Nordic in collaboration with human rights festival Copenhagen Pride.
“If you think of artificial intelligence as a communication medium, the people designing it are communicating their goals for the product and how they anticipate its use, versus how the users are using it, receiving those messages and acting on them,” Dr. Carpenter said. “And when you omit representation of groups of people, it’s a form of oppression. You’re essentially sending a message that those groups don’t exist.
“It’s not only demeaning to those groups, but it can contribute to stereotyping in the wider world, because those groups are not being represented at all,” she continued. “It’s saying this group of people doesn’t matter — or they’re not equal to the groups we’ve identified and included.”
Q aims to fill that representation gap. It’s a gender-neutral voice — neither male nor female. When Q’s creators, Emil Asmussen and Ryan Sherman at Virtue Nordic, reached out to Dr. Carpenter because of her work in gender and ethics in technology, she volunteered her time. She guided their research methods and the formation of participant groups, connecting them with the people they needed — among them Denmark’s technology ambassador, Casper Klynge, who endorsed Q in a promotional video.
Dr. Carpenter also offered general research guidance, reviewing computational linguist Anna Jørgensen’s research and advising on how it could be incorporated into Q’s overall development process. “The representation of that nonbinary group is extremely important,” Dr. Carpenter said. “But once you’ve included them, you want to make sure that it’s authentic and not a misrepresentation that falls into a stereotype that would misinform and become detrimental.”
Finding Q’s voice
Q came together quickly, in less than a year. The team began by mixing multiple voices of people who identified across the gender spectrum as nonbinary, transgender, male, female, etc. “Q is a human-based voice, and a lot of AI voices aren’t necessarily,” Dr. Carpenter explained. “But we weren’t getting a final mix that was workable. So we ended up focusing on one person’s voice and then modulating it.”
Research has identified a “sweet spot” between 145 and 175 Hz, where the human ear has difficulty distinguishing between male and female voices. Once the team modulated Q to a center range of 153 Hz, a group of 4,500 people across Europe screened the voice, and it was this version they most often perceived as gender neutral.
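As an illustration only (this is not the team’s documented pipeline), the modulation the article describes can be expressed as a semitone shift that moves a voice’s fundamental frequency toward the 145–175 Hz band. The frequencies below come from the article; the function names are hypothetical:

```python
import math

# Band where listeners have difficulty assigning a gender to a voice
# (frequencies per the article).
NEUTRAL_LOW_HZ = 145.0
NEUTRAL_HIGH_HZ = 175.0
TARGET_HZ = 153.0  # center range Q was modulated to

def semitone_shift(source_f0_hz: float, target_f0_hz: float = TARGET_HZ) -> float:
    """Semitones needed to move a voice's fundamental frequency to the target.

    Pitch perception is logarithmic: one octave (a doubling of frequency)
    spans 12 semitones, so the shift is 12 * log2(target / source).
    """
    return 12.0 * math.log2(target_f0_hz / source_f0_hz)

def shifted_f0(source_f0_hz: float, semitones: float) -> float:
    """Fundamental frequency after applying a semitone shift."""
    return source_f0_hz * 2.0 ** (semitones / 12.0)

# Example: a lower-pitched voice around 110 Hz needs roughly +5.7 semitones
# to land at the 153 Hz target.
shift = semitone_shift(110.0)
print(round(shift, 2))                                          # ≈ 5.71
print(NEUTRAL_LOW_HZ <= shifted_f0(110.0, shift) <= NEUTRAL_HIGH_HZ)  # True
```

In practice a pitch-shifting tool (for example, an audio library’s pitch-shift function) would apply such an offset to recorded speech; how Q’s team performed the modulation is not specified in the article.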
Q’s voice is not flat, but has pitch and modulation that varies within its 30 Hz range. It also has a gentle Dutch inflection. “We knew that accent was something we would have to be careful with,” Dr. Carpenter said, adding that Q “1.0” is an initial proof of concept. “We plan to keep tweaking it and possibly come up with other iterations, in other languages, if we can find funding,” she said.
For some people, not being able to “tick a gender box” when listening to Q can lead to all sorts of questions. That’s exactly one of the things Q is designed to do — start conversations about inclusivity and social issues. “We put others in social categories as we believe we’ve identified them through different cues, one of them being voice,” Dr. Carpenter said. “If you instinctively feel a sense of confusion that you can’t categorize Q, that may be normal for you as something to be distracted by. But it may not be that way for somebody else. It’s really just a projection of your own biases or values, if you aren’t used to thinking in a gender-neutral way.”
Holding up the mirror
“At the same time, it’s 2019, and we recognize that transgender and beyond binary gender roles do exist, and including them in processes like Q can really expand all of our minds,” Dr. Carpenter said. “Those of us who are outside of that group can begin to ask ourselves those questions. And isn’t that one of the interesting parts about technology and culture — how it mirrors how we think? If you’re omitting a certain group of people, you’re mirroring the beliefs and values of the person who coded the voice assistant and left out that group. I would like to see Q take off and mirror a more inclusive way of looking at everyone.”
Though Q’s creators have been approached by major companies that want to buy it outright for use with their technologies, the team intends to keep Q available as open source. Now that its sound has been established, the next task will be building out its AI framework. Ultimately, Virtue Nordic’s goal for Q is to build it into applications beyond digital assistant technology, making it a new voice for AI in mass transit stations, video and virtual reality games, theaters, and more.
The team hopes that Q will spur tech companies to create more inclusive AI. As Ambassador Klynge said in Q’s promotional video, “It’s basically about keeping the technology companies responsible. We believe they need to take a societal responsibility which is proportional to the kind of influence they’re exercising.”
“Hey, Siri and Alexa! Make Room for Q” was written by Seabright McCabe, SWE Contributor. This article appears in the 2019 Conference issue of SWE Magazine.