Products like Amazon Echo and Apple’s Siri are set to sound female by default, and people usually refer to the software as “her.”
Embedding bias: Most AI voice assistants are gendered as young women and are used mostly to answer questions or carry out tasks like checking the weather, playing music, or setting reminders. The United Nations report says this sends a signal that women are docile, eager-to-please helpers without agency, always on hand to serve their masters, and helps reinforce harmful stereotypes. The report calls on companies to stop making digital assistants female by default and to explore ways of making them sound “genderless.”
Who’s blushing: The report is titled “I’d blush if I could,” after a response Siri gives when someone says, “Hey Siri, you’re a bi***.” It features an entire section on responses to abusive and gendered language. If you say “You’re pretty” to an Amazon Echo, its Alexa software replies, “That’s really nice, thanks!” Google Assistant responds to the same remark with “Thank you, this plastic looks great, doesn’t it?” The assistants almost never give negative responses or label a user’s speech as inappropriate, regardless of its cruelty, the study found.
The report: Its aim is to expose the gender biases being hard-coded into the technology products that are playing an increasingly big role in our everyday lives. The report also suggests ways to close a gender skills gap that is wide, and growing, in most parts of the world. It found women are 25% less likely than men to have basic digital skills, and only a fourth as likely to know how to program computers. “These gaps should make policy-makers, educators and everyday citizens ‘blush’ in alarm,” the report says.