Apple reprogrammed Siri’s responses to deflect questions on feminism, ‘Me Too’: report
Apple reprogrammed its Siri voice assistant to change how it handles “sensitive topics” including feminism and the #MeToo movement, instructing developers to not “engage,” to “deflect” and lastly to “inform,” The Guardian reported Friday, citing leaked documents.
If a user asks Siri about feminism directly, the service is trained never to say the word “feminism.” Instead, it says it supports equality, The Guardian reported, citing internal documents it says were leaked by a former contractor whose job was to ensure Siri’s replies were accurate.
When asked about feminism and related topics, if Siri doesn’t reply with a general statement about supporting human equality, it’s reportedly programmed to use its “knowledge graph” — which draws information from Wikipedia and the iPhone dictionary — to inform the user about feminism in an unbiased way, the British newspaper reported.
When asked if it’s a feminist or if it supports women’s rights, Siri reportedly used to say “Sorry, I don’t really know,” or “I just don’t get this whole gender thing” or “My name is Siri, and I was designed by Apple in California. That’s all I’m prepared to say.”
Now, it will reply: “I believe that all voices are created equal and worth equal respect,” or “It seems to me that all humans should be treated equally,” The Guardian noted.
The guidelines, last updated in June 2018, also apparently stipulate that “Siri should be guarded when dealing with potentially controversial content,” citing that principle as a reason for dodging questions about feminism.
“They can be deflected … however, care must be taken here to be neutral,” the guidelines reportedly state.
Apple told The Guardian in a statement: “Siri is a digital assistant designed to help users get things done. The team works hard to ensure Siri responses are relevant to all customers. Our approach is to be factual with inclusive responses rather than offer opinions.”
Apple did not immediately respond to a request for comment from The Hill.
Apple apologized last month for listening to audio recordings of some users’ conversations with Siri. The company said it would let users opt into having their requests recorded and would seek to minimize the amount of human review of collected audio.
The move followed a Guardian report detailing how contractors listened to stored recordings — and heard sensitive and private information — as part of a quality control program to determine if Siri was successfully completing requests.