Microsoft is updating its terms of use to specifically ban the posting of “terrorist content” on its services.
The company is also considering whether to prominently display “positive messaging” in its Bing search results when users look for terror-related materials. However, it will not hide terror-related results on its search engine unless their removal is required by local law.
“Terrorism is one of the truly urgent issues of our time,” the company announced Friday. “We are committed to doing our part to help address the use of technology to promote it or to recruit to its causes.”
The announcement follows similar steps by other tech companies, including Twitter. Earlier this year, the Obama administration began publicly pressuring tech companies to help combat the spread of terror-related content online.
Much of the concern about the spread of terror content and recruiting has focused on major social media platforms. While Microsoft does not own any of those, the company said it wanted to be transparent about its policies.
“Although Microsoft does not run any of the leading social networks or video-sharing sites, from time to time, terrorist content may be posted to or shared on our Microsoft-hosted consumer services,” the company said. “In light of this, we want to be transparent about our approach to combatting terrorist content.”
Microsoft said it already bars hate speech and violence on its services, but now it is specifically banning terrorist content.
It defined terrorist content as material posted in support of any organization on a sanctions list maintained by the United Nations that depicts “graphic violence, encourages violent action, endorses a terrorist organization or its acts, or encourages people to join such groups.”
Other tech companies have faced criticism that their terms of service banning terror-related content are overly vague.
Microsoft said it will remove content from its consumer services only when it is flagged by others. It will apply a stricter standard to Bing, removing search results only when required to do so under local law.
The company, however, is working with nongovernmental organizations to explore displaying “positive messaging and alternative narratives” for people who make terror searches. It is also donating money to fund researchers exploring how to combat terror-related accounts that quickly pop up in slightly different forms after they are removed.
“We have a responsibility to run our various Internet services so that they are a tool to empower people, not to contribute, however indirectly, to terrible acts,” the company said.
“We also have a responsibility to run our services in a way that respects timeless values such as privacy, freedom of expression and the right to access information.”