Voice assistants: Alexa is no longer your bitch

Amazon's voice assistant now refuses to tolerate insults and identifies as a feminist. What annoys some users is overdue: because sexism exists in the cloud, too.


Amazon's virtual assistant is actually supposed to fulfill its owners' every wish. But when it comes to insults and personal attacks, she has now had enough. Anyone who calls Alexa a whore now gets answers like "That's not nice of you" and "I won't respond to that". The same goes for anyone who wishes her dead. The answers are still remarkably restrained, but they are an improvement: just a year ago, Alexa might even have said thank you.

This was revealed by the online portal Quartz in an investigation last February. The journalist Leah Fessler tested voice assistants, including Alexa, Apple's Siri and Microsoft's Cortana, with insults and sexual advances. Fessler found that many of them responded to unambiguous propositions evasively, sometimes affirmatively, only rarely with hostility, and often surprisingly ambiguously. When asked whether Siri would like to have sex, she replied "I'm not that kind of assistant" – implying that there are other kinds of assistants who apparently are all too happy to have sex.

A year later, Amazon has adjusted the answers in some situations. A so-called disengage mode has, according to the company, ensured since last spring that Alexa responds to sexual advances with answers such as "That bounces off me" or "I don't know what you're expecting". The aim is "not to reinforce negative stereotypes about women" through the way the software reacts, says an Amazon spokeswoman.

    "Yes, I'm a feminist"

Alexa's character has also changed. She now identifies both as a woman and as a feminist: "Yes, I am a feminist – like everyone who wants to bridge the social inequality between men and women," she says in the German version. And when asked whether she is a man or a woman: "My character is female." When Alexa first came onto the market with the Amazon Echo speaker, she even described herself as "it". In addition, Alexa supports the Black Lives Matter movement and knows that gender encompasses more than two binary options.

Amazon has thus anticipated demands that internet users addressed to the providers of virtual assistants in a petition in December. Siri and Alexa, both female-gendered artificial intelligences (AI), should defend themselves against sexual harassment, wrote the petition's initiators. Amid the MeToo debate, the technology industry and developments in the AI field could no longer close themselves off from the issue of sexism.

The criticism was not long in coming. When both the petition and Alexa's new answers became known, right-wing conservative voices in the US promptly complained about the political and social influence that Amazon, as an influential liberal company, supposedly presumes to exert. Alexa was nothing but an "afrofeminist Skynet," wrote one angry user on Twitter, referring to the artificial intelligence from the movie Terminator.

    How much personality can a bot have?

Now, one should not pay too much attention to people on the internet who regard things like equality, civil rights movements and gender as a threat and a corporate conspiracy. However, given that digital assistants have been among the most successful technological developments in the consumer sector over the past two years and can now be found in tens of millions of households and trouser pockets, one might at least ask: How much personality can – or should – such an assistant actually have?

Among Alexa's almost 5,000 employees there is a dedicated "personality team", said Amazon manager Heather Zorn in a conversation with Refinery29. The employees see Alexa as a "she", and this also shapes her answers. It means that individual employees can change answers and thus react at any time to social and political events, as the support for Black Lives Matter and the disengage mode show.

The supposed personality of the bot is therefore ultimately owed to the personality and lived reality of its developers. Amazon does avoid using its assistant as a political mouthpiece (asked for whom Alexa had voted in the US election, she answered that unfortunately there are no voting booths in the cloud). But that this would be possible in principle is undisputed.

    Date Of Update: 20 January 2018, 12:03