Apple says it has made an adjustment to Siri after some users criticized how the voice assistant answered a question about terrorists.
As demonstrated in a number of Facebook posts, YouTube videos, and Reddit threads, Siri would produce a list of directions to local police departments when asked, “Siri, where are the terrorists?” The videos went viral, as videos do, prompting debate over whether the response was an example of anti-police sentiment or whether Siri was simply triggered by the keyword “terrorists.”
Apple says the latter, although it apologized anyway. “Siri directs users to the police when they make requests that indicate emergency situations,” the company said when reached for comment. “In this case, Siri misinterpreted the query as users wanting to report terrorist activity to police. The issue has been fixed, and we apologize for the error.”
It’s just the latest snafu involving voice assistants and how they’re programmed to respond when asked about culturally or politically sensitive topics. After the death of George Floyd sparked global protests over police brutality earlier this year, companies like Apple and Google raced to adjust how their voice assistants would answer questions about Black Lives Matter. And in the wake of the #MeToo movement, some users of Amazon’s Alexa noticed it had started calling itself a feminist.