Voice assistants such as Amazon Alexa, Apple Siri, Microsoft Cortana and Google Home are helping to reinforce and spread sexism, according to a United Nations study.
The UNESCO study bears a curious title: "I would blush if I could" (the deferential response Apple's Siri gives when a user insinuates that she is a prostitute). It asserts that software of this kind perpetuates the stereotype of the servile, submissive woman, obliging even when mistreated.
"The fact that most voice assistants have female voices and are perceived as female conveys the false message that the woman is a docile helper, available at all hours with a simple button or a" hey "thrown in."
"The assistant has no assertive power with respect to the commands that are given. She honors her commitments and satisfies requests regardless of the tone or hostility with which they are made ".
The study considers particularly worrying the tendency of voice assistants to "deflect, overlook or give conciliatory responses" when insulted, which reinforces the belief that women can be subjected to abuse and harassment.
"Companies like Apple and Amazon, made up mostly of male engineering teams, have built AI systems that transform voice assistants into female entities that engage in abuse by flirting passively and submissively," the report continues.
The UNESCO study suggests that digital assistants should be programmed to discourage sexist insults, that companies should stop giving them female voices by default, and that, in any case, they should offer varied representations of women across the artificial intelligence built into these devices.
The choice of a female voice by default is the result of market research conducted by the technology giants, but the study disputes those conclusions, arguing that most people prefer a voice of the opposite sex, not necessarily a female one.