By Stephen Schenck | November 30, 2011 7:56 PM
When you ask Siri for help with something, though it may not have the intuition of a human when crafting its responses, at least you can be sure you’re getting well-rounded, unbiased information, right? We’d certainly like to think so, but the ACLU has pointed out some cases where it seems that Siri might be withholding potentially controversial results from you.
From what we’ve heard so far, we suspect this is a fluke in Siri’s proficiency rather than a calculated attempt to skew results, but we’re keeping an open mind in the hopes that Apple responds to the ACLU’s concerns with a reasonable explanation.
The ACLU noted that Siri has been giving poor or incomplete responses to questions about abortion and birth control. When asked to find nearby abortion clinics, the app reportedly responded with pro-life crisis pregnancy centers, but not the abortion providers actually requested. Likewise, Siri drew a blank when asked where to buy birth control.
This may just be a symptom of how Siri pulls up data; maybe it would have no problem finding a store selling a particular form of birth control if the user specified one, but we can see why the ACLU would at the very least like these “glitches” fixed. What do you think: is this perfectly innocent, or have some devs been letting their politics leak into the apps they work on?