2019-05-22 04:50
AI-powered voice assistants from Google, Amazon, Apple, and others could be perpetuating harmful gender biases, according to a recent UN report. The report, titled “I’d blush if I could” — Siri’s response to provocative queries or flirtatious statements — says the female helpers are often depicted as “obliging and eager to please,” which reinforces the idea that women are “subservient.” Worse, it states, is the way in which they give “deflecting, lacklustre, or apologetic responses” to abuse or criticism. Because the speech of most voice assistants is female, it sends a signal that women are… docile helpers, available at the touch…
This story continues at The Next Web.