
Should internet platforms censor too much or too little?

Facebook, Twitter, Google and other platforms face constant criticism about content they remove from, and allow to remain on, their sites. As a result, they struggle with whether they should err on the side of too much censorship or too little.

The First Amendment makes clear that the government cannot censor speech except under certain circumstances — shouting “fire” in a crowded theater, for example. The government, therefore, must err on the side of too little censorship, allowing too much offensive speech rather than censoring views that some people, or even most people, find offensive.

Private actors such as newspapers face no such legal constraints. They generally are free to allow or disallow whatever content they want. Thus, claims of First Amendment violations in response to not being allowed to publish content on privately owned platforms are meaningless.

Large internet platforms are a relatively new phenomenon that does not fit into our historical social framework. As private entities, they are protected by the First Amendment but not required by it to publish anything. In principle, they can allow or disallow any content they want, as long as they don’t violate other laws, such as those on child pornography. In practice, though, these platforms are so ubiquitous that large numbers of people — including politicians around the world — take an acute interest in how these platforms structure their rules of engagement.

When Texas Republicans Rep. Joe Barton and Sen. Ted Cruz asked Facebook CEO Mark Zuckerberg this year about “censoring conservative bloggers,” they implied that Facebook’s policies erred on the side of removing posts that should stay. Zuckerberg called this issue “an enforcement error,” acknowledging that perhaps in that case Facebook’s algorithm should be designed to err in the other direction, leaving up posts that some might wish to see removed.

Conversely, calls for platforms to remove right-wing conspiracy theorist Alex Jones were complaints in the other direction — that the platforms were erring on the side of leaving up posts that should be removed.

Pressure in both directions is understandable. Harassment on social media is real and vicious, often with a vileness rarely seen on such a wide scale elsewhere. Most people are likely to favor some kind of censorship of the worst of these posts. At the same time, many people believe that platforms should not take political positions and should never censor political views. But there is a lot of gray area between these two relatively easy positions.

And yet it is precisely this balancing act that we ask platforms to perform. In response, they develop increasingly complex rules for what is acceptable. They employ a growing variety of approaches, including artificial intelligence and human review, to determine whether posts meet those guidelines. All these rules and reviews — including those made by algorithms — are at some point based on judgment. And that means some decisions will be controversial.

Internet platforms never will get the balance right because there is no right balance. There are too many differing preferences among people in the United States, let alone the entire world, to always make the right choices. Still, they must decide whether their algorithms should err on the side of taking down posts that should stay, or of keeping up posts that should go.
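To make that choice concrete, consider a minimal sketch of an automated moderation pipeline. Everything here is invented for illustration — the posts, the violation scores and the threshold values are hypothetical, not any platform’s actual system — but it shows how a single threshold setting determines which way an algorithm errs.

```python
# Illustrative sketch only: a hypothetical moderation pipeline showing how
# one threshold decides which way an algorithm errs. The posts and scores
# below are invented for demonstration purposes.

# Hypothetical classifier scores: estimated probability a post violates policy.
posts = {
    "clear harassment":        0.95,
    "heated political debate": 0.55,  # gray area
    "sarcastic joke":          0.40,  # gray area
    "benign vacation photo":   0.05,
}

def moderate(posts, threshold):
    """Remove every post whose violation score meets or exceeds the threshold."""
    return {text: ("REMOVE" if score >= threshold else "KEEP")
            for text, score in posts.items()}

# A low threshold errs toward over-removal (some legitimate speech comes down);
# a high threshold errs toward under-removal (some abuse stays up).
for threshold in (0.3, 0.7):
    print(f"threshold = {threshold}")
    for text, decision in moderate(posts, threshold).items():
        print(f"  {decision:6} {text}")
```

Under either threshold, the clear cases are handled the same way; it is the gray-area posts whose fate flips, which is exactly where the controversy lives.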

Even with no correct answer, some general principles might help platforms move toward a socially acceptable approach. First, simply acknowledging the choice between erring on the side of censoring too much or too little content may help bring some coherence to a seemingly incoherent process. Second, markets — via platform use and stock market values — may help aggregate information on preferences, although identifying the effects of censorship decisions will be difficult, given other issues that users and investors may care more about.

Finally, transparency about the criteria used to make censorship decisions, and about how those criteria are applied, is important. Platforms are taking important steps in this direction. Google’s transparency reports, for example, provide some insight into how it addresses the issue. Facebook is following suit. An additional useful piece of information would be a measure of complaints that resulted in no action, and the reasons why. With information on both what is taken down and what is not, the public can better understand the balance that platforms are trying to maintain and how they weigh erring on the side of taking down too much or too little.

Scott J. Wallsten is president and senior fellow of the Technology Policy Institute, and a senior fellow at the Georgetown Center for Business and Public Policy. Follow him on Twitter @scottwallsten.



 
