A recent United Nations study has found that Artificial Intelligence (AI) powered voice assistants are reinforcing harmful gender stereotypes.
The research explores the effect of bias in AI product development and the long-term negative implications of conditioning children and society at large. It explains that such conditioning leads people to treat digital voice assistants as ever-ready helpers who exist only to serve their owners unconditionally.
The study was carried out by the United Nations Educational, Scientific and Cultural Organization (UNESCO). It argues that tech companies are preconditioning users to hold harmful perceptions of women by giving voice assistants traditionally female names like Alexa and by making female-sounding voices the default.
It explained that tech companies have not built proper safeguards against gendered and abusive language. Most voice assistants, like Apple’s Siri, deflect aggression or respond with a sly joke. For example, if a user gives Siri a sexually explicit command, it responds with the phrase that became the title of the report: “I’d blush if I could.”
The report states,
“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation. Because the speech of most voice assistants is female, it sends a signal that women are … docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility.”
The assistant is entirely subservient, holding no agency beyond the commands it receives. This poses a major problem. Experts cite the undertone of AI as a tool for subtle abuse against women as one of the tech industry’s failings. Companies are often criticised for building consumer AI platforms in the image of traditional ideas of subservient intelligence.
Voice assistants in a subservient future
Voice assistants are expected to become the primary mode of interaction with machines in the near future, as technology heads towards ambient computing, in which we interact verbally with software and hardware. This is why the way we interact with these sophisticated devices can have a lasting cultural and sociological impact on how we relate to and interact with other human beings at work and in everyday life.
Meanwhile, a 2018 report stated that Amazon chose a female-sounding voice for its assistant after market research suggested it would be perceived as more sympathetic.
Microsoft’s assistant, Cortana, cannot be switched to a male voice. The company says the voice is based on the female AI character of the same name in its video game franchise, Halo, and it has no plans to change it.
Apple’s Siri derives from a Scandinavian female name meaning “beautiful victory.” However, Apple did introduce a male Siri voice in 2013, with English, French and Arabic as its default languages.
These choices strongly suggest that AI voice assistants were deliberately made female.
UNESCO says the way to tackle this issue is to develop assistant voices that are as close to gender-neutral as possible, which would reduce gender-based insults. The report adds that tech companies should stop conditioning users to treat voice assistants as lesser beings, as this would curb harmful stereotypes, and that they should avoid making assistants mimic human beings. Some researchers and linguists are, however, already working on a genderless digital voice called Q.
The technology appears to reinforce the gender bias that women are useful only as secretaries, helpers at the beck and call of men. The fact is that most tech workers and the people who influence these projects are white and male. According to the report, women make up just 12% of AI researchers.