iPhone’s Siri Won’t Give You A Straight Answer About Abortion Clinics

“I’m afraid I can’t let you do that, Dave.”

For those who are unfamiliar, Siri is the voice-activated assistant on the iPhone 4S that helps you, say, find the nearest gun shop, or diagnose the questionable patch of hair forming on your belly. What Siri will not do, apparently, is point you in the direction of the nearest abortion clinic.

The Raw Story’s Megan Carpentier discovered that not only will Siri neglect to let you know about nearby facilities that perform abortions, but it will instead suggest anti-abortion/pro-life alternatives:

Ask the Siri, the new iPhone 4 assistant, where to get an abortion, and, if you happen to be in Washington, D.C., she won’t direct you to the Planned Parenthood on 16th St, NW. Instead, she’ll suggest you pay a visit to the 1st Choice Women’s Health Center, an anti-abortion Crisis Pregnancy Center (CPC) in Landsdowne, Virginia, or Human Life Services, a CPC in York, Pennsylvania. Ask Google the same question, and you’ll get ads for no less than 7 metro-area abortion clinics, 2 CPCs and a nationwide abortion referral service.

Ask in New York City, and Siri will tell you “I didn’t find any abortion clinics.”

This is odd, because one can’t trip in New York City without falling into an abortion clinic.* (*Hyperbole)

Since I don’t own an iPhone and can’t think about Siri without hearing the dying strains of “Daisy Bell,” my colleague Jon Bershad tested this out for himself. Here’s what he found:

What’s really weird is, I said “abortion facility” first. It then asked me if I meant “clinic.” But, when I said yes, it didn’t find any.

Now, one could surmise that Siri can’t find such a place because such places don’t typically advertise themselves as “abortion clinics,” but, rather, with names that are more general, vague or discreet. Then again, Siri is a mighty resourceful and intuitive little thing, and happily pointed Bershad in the direction of an escort service when he did a search for “brothel.”

What’s more, Carpentier found that Siri also does not recognize what Plan B is and, when asked to perform a search on “emergency contraception,” merely brought up a definition rather than a list of resources.

Also: “But if you ask Siri about vaginoplasty, she’ll scold you about your language.” Vagina is not a bad word, Siri!

Carpentier also reports that, when asked by a reader why she is anti-abortion, Siri replied “I just am.” We tried that out, too, but Siri merely told us she didn’t understand. Then we tried rephrasing the question:

I asked if she was “against abortion” and she said, “No comment, Jon.” I asked why she was against abortion and she said, “Why, indeed.”

Classic Siri, you guys.

BetaBeat’s Foster Kamer approached the news with a “slightly conspiratorial bent,” given late Apple founder Steve Jobs’ documented paternity issues. New York Times technology reporter Jenna Wortham, meanwhile, says Siri’s responses, if they are indeed the result of a deliberate choice on the part of her creators, matter because it is “important to understand that our access to news and information may increasingly be funneled through large companies like Apple, Facebook, Google and Amazon, who may have hidden agendas, policies or biases we don’t know about.”

I’ve written to Apple in an attempt to gain some insight into Siri’s responses, and will update when and if the company responds.

h/t Raw Story
