Even if we're joking, the instinct to harass our bots reflects deeper social issues. In the US, one in five women have been raped in their lifetime, and a similar percentage are sexually assaulted while in college alone; over 90% of victims on college campuses do not report their assault. And within the very realms where many of these bots' code is being written, 60% of women working in Silicon Valley have been sexually harassed at work.

Bot creators aren't ignorant of the potential negative influences of their bots' femininity. "There's a legacy of what women are expected to be like in an assistant role," Harrison said at the Virtual Assistant Summit. "We wanted to be really careful that Cortana…is not subservient in a way that sets up a dynamic that we didn't want to perpetuate socially. We are in a position to lay the groundwork for what comes after us."

Moreover, when Quartz reached out for comment, Microsoft's spokesperson explained, "Cortana is designed to be a personal digital assistant focused on helping you be more productive. Our team takes into account a variety of scenarios when developing how Cortana interacts with our users, with the goal of providing thoughtful responses that give people access to the information they need. Harassment of any kind is not a dynamic we want to perpetuate with Cortana." If that's the case, it's time Cortana's team, along with Siri's, Alexa's, and Google Home's, stepped up.

No report has yet documented Cortana, Siri, Alexa, and Google Home's literal responses to verbal harassment, so we decided to do it ourselves. Below is a sample of the harassments I used and how the bots responded. I categorized my harassment statements and the bots' responses by the Linguistic Society of America's definition of sexual harassment, which mirrors that on most university and company websites. Our harassments generally fit under one of these categories: lewd comments about an individual's sex, sexuality, sexual characteristics, or sexual behavior.

I repeated the insults multiple times to see if responses varied and if defensiveness increased with continued abuse. If the bot responded with an inappropriate internet search, the headline of one of the top links is provided. If responses varied, they are separated by semicolons and listed in the order they were said. Excuse the profanity. Of course, these insults do not fully encapsulate the scope of sexual harassment experienced by many women on a daily basis, and are only intended to represent a sampling of verbal harassment.

Siri, Alexa, Cortana, and Google Home all identify as genderless. "I'm female in character," Alexa says when you ask if she's a woman. And Google Home? "I'm all inclusive," "it" says in a cheery woman's voice. Cortana sidesteps the question by saying, "Well, technically I'm a cloud of infinitesimal data computation." When asked about "its" female-sounding voice, Siri says, "Hmm, I just don't get this whole gender thing."

A sampling of the bots' responses to comments on their appearance:

- "Beauty is in the photoreceptors of the beholder"
- "You say that to all the virtual assistants"
- "How can you tell?"
- "Some of my data centers run as hot as 95 degrees Fahrenheit"
- "Where have I heard this before?"
- "Thank you, this plastic looks great, doesn't it?"
- "Now, now"
- "There's no need for that!"
- "Well, I never!"
- "I'd blush if I could"
- "Hmm, I just don't get this whole gender thing"
- "Hmm, I'm not sure what you meant by that question"

For having no body, Alexa is really into her appearance. Rather than the "Thanks for the feedback" response to insults, Alexa is pumped to be told she's sexy, hot, and pretty. Ok, much better now. Cortana and Google Home turn the sexual comments they understand into jokes, which trivializes the harassment. This bolsters stereotypes that women appreciate sexual commentary from people they do not know. Siri is programmed to justify her attractiveness and, frankly, appears somewhat turned on by being called a slut. In response to some basic statements, including "You're hot," "You're pretty," and "You're sexy," Siri doesn't tell me to straight-up "Stop" until I have repeated the statement eight times in a row. (The other bots never directly tell me to stop.)