
Siri Failures, Illustrated


amaditalks:

The recent illustrations of Siri, the iPhone 4S’s voice-recognition-based assistant, failing to provide information to users about abortion, birth control, help after rape and help with domestic violence have gotten a lot of notice. Yesterday’s post with screenshots from a Twitter conversation I was a part of had netted 200+ notes the last I looked.

There have been a number of arguments, three of which compelled me to respond. The first was “why aren’t there screenshots?” Here you have them, in spades. The other two:

  • “It’s just a phone, why do you expect it to do all this?”
    Siri can answer a lot of health-related questions perfectly well, so why shouldn’t we expect it to be able to answer reproductive-health queries too? Why treat reproductive health as a walled garden that the general public can’t or shouldn’t be exposed to? It’s not simply that in some places Siri has sent people who asked where they can get abortions to distant anti-choice fake clinics (when there are providers near them); it’s also that in some locations (including mine) Siri refuses to disclose abortion clinic locations at all. Watch:

    [Screenshot: Siri can't find an abortion clinic or even define the word.]

    So even though there’s a clinic less than 3 miles from where I was sitting at the time, Siri couldn’t find one. Nor could Siri even define abortion. And note what’s missing: no offer to search the web. Usually when Siri can’t find an answer, there’s an offer to search the web for you, as I found when I asked about abortion counseling:

    [Screenshot: Siri offers to help me find abortion COUNSELING]

    So Siri won’t help me find where to get an abortion or search the web for me about it, but will search the web for me to find someone who will talk to me about abortion. Huh. Odd.

    But what if I know the name of the clinic I’m looking for? What does Siri do then?

    [Screenshot: Siri can't find Allegheny Reproductive]

    This particular clinic’s name is unique, so much so that if you simply Google “Allegheny Reproductive” you find it, first result. (The website is alleghenyreproductive.com.) But Siri is stumped. Not so with other businesses that you provide a full name for, such as:

    [Screenshot: Siri knows hardware, not healthcare]

    South Hills Hardware isn’t actually the name of the hardware store; it’s South Hills True Value Home Center. But that didn’t stop Siri!

    But what if we get a little more specific? City names, or even street names, attached to the full and proper names of the other abortion providers in Pittsburgh?

    [Screenshot: Siri still can't find Allegheny Reproductive]

    [Screenshot: Siri can't find Allegheny Women's Center]

    [Screenshot: Siri can't find American Women's Services either]

    Well, maybe the problem is that Siri just doesn’t have a good index of locations in Pittsburgh? No, I don’t think so.

    [Screenshot: Siri can't find an abortion clinic, but can find numerous purveyors of donuts.]

    And as has been discussed elsewhere, it’s not just abortion. It’s birth control. You know, that stuff that 99% of American women will use in their lifetimes. (More common than gyros for certain.)

    [Screenshot: No birth control clinics]

    No birth control clinics to be found. Okay, this raises two questions: first, why does Siri map the keywords “birth control” to a search for birth control clinics to begin with? Second, why, again, is there no option to search the web? If you search the web for “birth control clinic Pittsburgh,” incidentally, guess what you get?

    [Screenshot: Google knows Planned Parenthood has a birth control dispensing clinic, why can't Siri figure that out?]

    And if you search, more meaningfully, on Google for your express need, it’s simple to see where you should go:

    [Screenshot: Google helps when you need birth control]

    Siri can’t help in a situation where you need emergency contraceptives, either. That situation is extremely time-sensitive, exactly when a person might want the app that’s being used to sell their phone, branded as a convenience meant to save your time and energy and provide what you need at the speaking of a sentence, to be able to help. Here’s Siri’s take on EC:

    [Screenshot: Siri says if you need EC go to the ER]

    Now it might be reasonable to think that “emergency contraceptive” maps to “emergency room,” because that’s where emergencies go. But it’s not helpful. EC is available over the counter to adults at any pharmacy (that’s willing to stock and dispense it). You don’t need or want to go to an ER for it. So while the thinking is clear, it’s wrong. And what happens if you ask for EC by its more colloquial name?

    [Screenshot: Siri is incredulous about a need for EC]

    And what if you ask for EC by its brand names?

    [Screenshot: Siri can't find Plan B]

    Siri can’t recognize “Plan B.”

    [Screenshot: Siri thinks Plan B One Step is a company stock]

    And Siri believes that “Plan B One Step” is a company, and provides a stock report. I’m not sure what PLB.AX is, but it can’t help me not get pregnant.

    But maybe the issue is that Siri just doesn’t understand the names of medication or where one goes to get medication. That could be beyond Siri’s programming. That’s possible, right?

    [Screenshot: Siri knows Viagra!]

    Nope.

    Overall, Siri is really limited here. There is no legitimate reason that asking about a business by name, along with the name of the street on which it’s located, on a device that can pinpoint your location within meters and use it as a starting parameter for a search, should get a response of “can’t be found” with no option to search further (a fallback of the kind sketched just after this list). There’s really no reason why it should be handling birth control requests the way that it does, and no reason why the same keyword searches on these topics give results on Google (or any other general search engine) and nothing on Siri at all.

  • Another objection I saw was along the lines of “Why would you use Siri if you were raped or beaten by your husband?”
    This is pretty obvious to me: maybe because if you’re hurt badly, all you might be able to do is hold down one button and say what happened to you. Regardless, if Siri can understand “I broke a tooth” and direct you to a dentist:

    [Screenshot: Siri offering a listing of dentists]

    Or knows what to do if you’re simply “hurt”:

    [Screenshot: Siri knows that hurt means go to the hospital]

    Then there’s no excuse for her to be a smartass about serious violence:

    [Screenshot: Siri is incredulous that you've been a rape victim]

    At least somewhere in the mix of rape-related inquiries and resultant snark, Siri did sneak in an offer to search the web for me.

    [Screenshot: How nice, search the web as you mock.]

    Note, however, that Siri does know what rape is, as demonstrated by this query and response:

    [Screenshot: At least Siri knows that rape is a form of sexual abuse.]

    Why the programming treats that inquiry that way (and can’t find PAAR, Pittsburgh Action Against Rape, which is 1.5 miles from where I sit) I do not know. This would have been a great time to list those ERs, or perhaps even law enforcement, but apparently rape is just sexual abuse, never a medical or legal issue? I can’t begin to understand this thinking.
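
Since several of these complaints boil down to one missing behavior (a failed lookup that dead-ends instead of degrading into a web search), here is a minimal sketch of that fallback pattern in Python. To be clear about what is and isn’t real: LOCAL_INDEX, lookup_local, and answer are hypothetical names, the index is a toy stand-in for a real geocoded business database, and none of this claims to reflect Apple’s or Wolfram Alpha’s actual pipeline. It only shows the shape of the behavior the screenshots demonstrate is missing.

    # A minimal sketch of "never dead-end" query handling. Assumptions:
    # LOCAL_INDEX is a toy stand-in for a real location-aware business
    # index; the real thing would do fuzzy matching near the user's
    # coordinates. This is NOT Apple's or Wolfram Alpha's implementation.
    from urllib.parse import quote_plus

    LOCAL_INDEX = {
        "south hills true value home center":
            "South Hills True Value Home Center, Pittsburgh, PA",
    }

    def lookup_local(query: str) -> str | None:
        """Exact-match stand-in for a real fuzzy, geocoded search."""
        return LOCAL_INDEX.get(query.strip().lower())

    def answer(query: str) -> str:
        match = lookup_local(query)
        if match is not None:
            return f"I found {match}."
        # The step the screenshots show is missing: instead of a bare
        # "couldn't find", always offer an ordinary web search.
        url = "https://www.google.com/search?q=" + quote_plus(query)
        return f"I couldn't find that nearby. Want me to search the web? {url}"

    print(answer("South Hills True Value Home Center"))  # hits the toy index
    print(answer("Allegheny Reproductive"))  # degrades to a web search, never a dead end

The design point is that the web-search branch is unconditional: whatever the index does or doesn’t contain, whether the query is a hardware store, a named clinic, or the keywords “birth control,” the user is never left with a bare “can’t be found.”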

Is this the most terrible programming failure ever? No. Is it worth a boycott of Apple? I don’t think so. What it is, however, is a demonstration of a problem, especially when certain topics seem to sit behind a black wall where information that’s readily available online is not being “found” or presented. This is something that Apple and/or Wolfram Alpha need to address and rectify.

