
12/03/2024
Apple’s Siri, Microsoft’s Cortana and Amazon’s Alexa have come to perpetuate damaging gender stereotypes, and this digital gender bias has real-life implications.
"The place of women and the perception of their role has been, in part, deputised to machines, and we are fooling ourselves if we believe that automation doesn’t need just as much societal supervision as humans," says Carla Wilshire OAM, author of 'Time to Reboot: Feminism in the Algorithm Age'.
Siri and friends reprise a familiar cast of subservient roles: hostess, maid, secretary, muse, consort.