Technology sector should not be shielding sex traffickers online
Sex trafficking is a growing problem that has moved from the street to the smartphone.
The National Center for Missing and Exploited Children (NCMEC) reports an 846 percent increase in reports of suspected child sex trafficking from 2010 to 2015, which it found to be “directly correlated to the increased use of the Internet to sell children for sex.”
Victims of illegal sex trafficking have found it difficult to seek justice in the courts because of the blanket liability protection that Section 230 of the Communications Decency Act (CDA) provides to technology companies that host sites facilitating sex trafficking, even when those sites have been found to actively and knowingly facilitate the criminal sex trafficking of vulnerable women and young girls and to cover up evidence of the crimes.
In August, a bipartisan group of U.S. senators introduced narrowly crafted legislation to make it possible for victims of sex trafficking to hold online enablers accountable. The Stop Enabling Sex Traffickers Act (SESTA) was introduced by Senators Rob Portman (R-Ohio), Richard Blumenthal (D-Conn.), John McCain (R-Ariz.), Claire McCaskill (D-Mo.), John Cornyn (R-Texas) and Heidi Heitkamp (D-N.D.), and is supported by human rights activists, courts, district attorneys, and attorneys general from across the U.S. for its potential to protect trafficking victims.
Major technology companies are, unfortunately, lobbying to defeat this sensible and modest bill designed to make it harder for sex traffickers to misuse the Internet and abuse women and children for profit.
Under the guise of protecting a “free and open Internet,” the technology sector opposes SESTA. When invited to collaborate on the legislation, however, these companies stonewalled Senate outreach and offered “no constructive feedback.”
A reasonable argument could be made that breaking the liability shield of CDA 230 could interfere with Internet freedom. This shield, however, is not absolute: technology companies are, for example, required to comply with notice-and-takedown requests to remove copyrighted material. Surely we can agree that if these companies are required to protect the financial interests of the movie and music industries, they should also take responsibility for protecting some of the most vulnerable among us.
Putting aside the bill’s narrowly crafted, straightforward nature, this opposition puts Google, Facebook, Twitter and others on the same side as Backpage, a website that has been involved in 73 percent of all child trafficking reports submitted to NCMEC by the general public.
This is not the first time that the technology industry has resisted necessary and modest change. In 2003, the rise of the Internet brought with it the unprecedented global proliferation of illegal child pornography. Then-Attorney General John Ashcroft called the major technology companies of the time together to discuss the problem, leading these companies to create the Technology Coalition, with a stated mission of “eradicating online child sexual exploitation.”
For the next five years, however, the Technology Coalition did not develop or deploy any effective solution to disrupt online child pornography — and the problem festered and worsened.
Because of my expertise in the field of digital forensics, Microsoft teamed up with me in 2008 to develop technology to help curb the online child pornography epidemic. Within a year we developed PhotoDNA, a technology that is now in worldwide use and that each year accurately and efficiently removes tens of millions of images of child exploitation from online platforms. It took us only a year to develop and deploy PhotoDNA, yet it took another five years before the biggest companies, including Google, were willing to adopt it on their own platforms.
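PhotoDNA’s algorithm itself is proprietary, but the general approach it represents, computing a compact “fingerprint” of an image and comparing it against a database of fingerprints of previously identified illegal images, can be sketched in a few lines. The sketch below is an illustration only, not the deployed system: a toy average hash stands in for PhotoDNA’s actual fingerprint, and the function names and threshold are hypothetical.

```python
# Minimal sketch of known-image matching (assumption: NOT PhotoDNA's actual algorithm).
from PIL import Image  # pip install pillow

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink to size x size grayscale, threshold each pixel at the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches_known_database(path: str, known_hashes: set[int], threshold: int = 5) -> bool:
    """Flag an upload if its fingerprint is within `threshold` bits of any known hash."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# Hypothetical usage: in practice, the database of known hashes is maintained by a
# clearinghouse such as NCMEC, and a flagged upload would be removed and reported.
# if matches_known_database("upload.jpg", known_hashes):
#     remove_and_report_upload()  # hypothetical platform action
```

The tolerance on the bit-difference is what makes this kind of matching robust: an image that has been resized or re-encoded still lands within a few bits of its original fingerprint, so it can be caught without the platform ever needing to store or re-inspect the illegal image itself.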
This pattern of denial and inaction continues into recent events. Before the Charlottesville tragedy, technology companies ignored The Daily Stormer and Vanguard America, well-known racist and anti-Semitic websites that regularly espouse hatred and violence in clear violation of these companies’ terms of service.
It was only after the tragic events of Charlottesville and the ensuing public outcry that these websites were taken down. And yet, these same companies continue to host websites affiliated with terror groups including the Taliban, al-Qaeda, and ISIS. The lack of responsibility of tech and social media platforms does not stop there.
Recently, Facebook admitted that it had allowed Russian groups to create hundreds of fake user accounts for the purpose of purchasing political ads. Add to this the untold number of fake users that regularly posted content on both Facebook and Twitter in what has been described as “an unprecedented foreign intervention in American democracy.”
Technology companies were on the wrong side of the online child exploitation issue in the early 2000s, and they are wrong again today in hindering the fight to stop illegal sex trafficking.
The CDA provision that provides blanket legal protection for the things people post online was never intended to shield sex traffickers or other illegal and dangerous online activity. Technology companies believe they should continue to have the power of self-regulation while ignoring the horrific acts that take place on their platforms. These companies have a responsibility to rid their platforms of the worst actors that they deliberately tolerate.
These companies must take a more active role in taking down material that promotes illegal and despicable behavior instead of sitting idly by and profiting from it.
Hany Farid is the Albert Bradley 1915 Third Century Professor and Chair of Computer Science at Dartmouth College and senior advisor to the Counter Extremism Project (CEP).