Voice assistants such as Amazon Alexa, Apple Siri, Microsoft Cortana and Google Home are helping to reinforce and spread sexism, according to a United Nations study.
The UNESCO study, curiously titled “I'd blush if I could” (the meek response Apple's Siri once gave when called a prostitute), asserts that software of this kind perpetuates the stereotype of the servile, submissive woman who remains obliging even when mistreated.
“The fact that most voice assistants have female voices and are perceived as feminine sends the false message that women are docile helpers, available at all hours at the touch of a button or with a casual ‘hey’.”
"The assistant has no assertive power over the commands given. She honors commitments and satisfies requests regardless of the tone or hostility with which they are made."
Considered particularly worrying is the tendency of voice assistants to “deflect, overlook or give conciliatory answers” when insulted, which reinforces the belief that women can be subjected to abuse and harassment without consequence.
“Companies like Apple and Amazon, staffed by predominantly male engineering teams, have built AI systems that turn voice assistants into female entities that respond to abuse with passive and submissive flirtation,” the report continues.
The UNESCO study recommends that digital assistants be programmed to discourage sexist insults, that companies stop giving them female voices by default, and that, in any case, the artificial intelligence built into these devices offer varied representations of women.
The choice of a female default voice is the result of market research conducted by the technology giants, but the study disputes those conclusions, arguing that most people prefer a voice of the opposite sex, not necessarily a female one.