
Government efforts to ‘fix’ social media bias overlook the destruction of our discourse

Enough with the bad CDA 230 takes and revisions. Just stop. Each week we seem to see another attempt to revise, do away with, or chip away at Section 230 of the Communications Decency Act, the law that protects online platforms, such as Facebook and YouTube, from liability for how people use their services.

Nearly every attack is aimed at the wrong problem – limiting political bias and censorship on the part of big-tech firms.

Sen. Josh Hawley (R-Mo.) attacked the law on Twitter on Wednesday, blaming it for protecting social media firms that block certain ideas. He contended, “the law must be changed to challenge #BigTech censorship.” 

Last week, the president joined the chorus, tweeting, “REPEAL SECTION 230!!!” to his 87 million Twitter followers. He was likely responding to Facebook’s decision to remove a post he made that compared COVID-19 to seasonal flu. Twitter tagged the same message as misleading.

The Department of Justice joined the list of bad CDA 230 revisions last month when it released its proposed changes to the law at the president’s direction. Once again, it focused on the extent to which big-tech firms can and cannot take down content.

It’s as if the dam is leaking and the president and lawmakers keep suggesting we need more water. CDA 230 has its flaws, but censorship and bias are only surface issues. 

Justice Clarence Thomas joined the detractors Tuesday, but his critique came closer to the mark. His ideas, which came in a statement accompanying the Supreme Court’s decision not to hear a case rather than in a precedent-setting ruling, questioned how courts have interpreted the law and highlighted that, as interpreted, CDA 230 discourages online firms from monitoring their spaces.

His concerns get us closer, though not all the way, to identifying the problem: If we are to rethink the flow of information online, the focus must be on protecting democracy. That’s where the dam is leaking.

Social media are having a toxic effect on the flow of democratic discourse. Big-tech firm algorithms are limiting our exposure to ideas and each other. Bots and troll farms are flooding our marketplace of ideas with false and misleading content. We are becoming more extreme and less capable of speaking to one another.

None of the CDA 230 revisions would address these problems. It’s important to remember that CDA 230 is not the same as the First Amendment. The law makes it financially feasible for big-tech firms, and anyone else who wants to create a platform online, to do so.

These safeguards allow big-tech firms significant leeway to decide which content is left up or removed. While the DOJ and lawmakers seem focused on “arbitrary content moderation decisions,” which politicians ironically protest using the very social media platforms they claim are censoring them, the larger concerns are polarization, false information, and tribalism.

It’s the architectures of these systems, and their financial rather than political motivations, we should be concerned about. The systems are built in a way that damages democracy as a byproduct.

The DOJ’s revisions, which are only suggestions at this point, would not address these problems.

Requiring online forums to remove some comments, or to leave up information they would otherwise take down, would create an incredible legal morass.

These online forums are privately owned spaces. The owner of a private space generally decides what is permissible and what is not. If I put a Biden sign up in my yard, the government cannot require me to take it down or put up a Trump sign. Requiring these platforms to leave messages online that they would otherwise take down is akin to the government forcing them to speak.

The First Amendment does not allow the government to force individuals or corporations to share political ideas they do not want to share. It also does not allow the government to censor communicators. 

Legislation of this kind would be the wrong tool for solving the problems big-tech firms are causing democracy.

The revisions also open the door to new, vexing questions that courts, big-tech firms, and users would wrestle with for as long as the rules exist. The DOJ revisions would require courts to read terms-of-service agreements as if they were laws. What constitutes a violation of Twitter’s terms of service? The company and the government will likely disagree, and no one will know what their rights are.

These same problems plague the Online Freedom and Viewpoint Diversity Act, which was proposed by Sens. Roger Wicker (R-Miss.), Lindsey Graham (R-S.C.) and Marsha Blackburn (R-Tenn.) in September.

“For too long, social media platforms have hidden behind Section 230 protections to censor content that deviates from their beliefs,” Wicker stated in a news release. 

Hawley proposed the Ending Support for Internet Censorship Act in June 2019. The law would require firms to apply to the FTC for CDA 230 protection. The firms would have to prove to the FTC, a government agency whose commissioners are appointed by the president, that they are not biased in their content moderation.

The most reasonable bill has been the Platform Accountability and Consumer Transparency Act, proposed by Sens. Brian Schatz (D-Hawaii) and John Thune (R-S.D.) in June. The bill would require firms to create an “acceptable use” policy available to users, provide a tool for users to file complaints, and generate quarterly reports on flagged and taken-down content.

It’s probably good that none of these bills is likely to become law, because any “take-down” regulations won’t fix the basic big-tech architecture. The platforms are built to profit from our interaction. When we interact, they gather more data about us, which they can sell and use to target advertisements at us.

We must find a way to incentivize these corporations to redesign their spaces in ways that encourage meaningful conversation and discourage echo chambers and fragmentation. 

Let’s take a break from bad CDA 230 revisions and complaints and shift our conversation — our energy — to revising how we think of and write policy regarding big-tech firms and how we approach regulation of these spaces. We must find a way to protect democratic discourse. That’s where the dam is leaking.

Jared Schroeder is an associate professor of journalism at Southern Methodist University, where he specializes in First Amendment law. He is the author of “The Press Clause and Digital Technology’s Fourth Wave.”

