
Don’t blame Big Tech for misinformation online


This may come as a shock to some legislators, but holding social media companies accountable for the spread of misinformation will do nothing to prevent it. It will, however, harm well-intentioned users in the process.

Today, social media CEOs testify before the House Committee on Energy and Commerce about the dissemination of misinformation and extremism online. In a press release published last month, Democratic committee chairs claimed that “big tech has failed to acknowledge the role they’ve played in fomenting and elevating blatantly false information,” which has in turn impacted public health and safety. Social media does have a fake news problem, but blame games and scapegoating won’t solve it.

The committee hopes to change incentives for companies and build upon the momentum of content moderation bills proposed over the past year. Rep. Jim Jordan (R-Ohio) and Sens. Josh Hawley (R-Mo.), Lindsey Graham (R-S.C.) and Amy Klobuchar (D-Minn.) are among the legislators who have introduced bills targeting Section 230 of the Communications Decency Act, the law that empowers social media companies to moderate content on their own websites. Some scholars fear that mandating a stronger enforcement regime would result in censorship.

The spread of false information is a lot like my high school’s rumor mill. During the school day, students didn’t have access to their cell phones, but that did not stop salacious rumors from quickly traveling through the hallways. It was inevitable: teenagers confined in a small building will talk with each other. No one would fault the school itself for being a space where students gather and engage in conversation. But administrators nevertheless tried to respond and mitigate the situation, much like social media firms today.

Whether it’s my small school in suburbia or a billion-dollar company, one truth is universal: The options for addressing the spread of false information are never going to be perfect.

The committee chairs assert that social media companies knowingly turn a blind eye to misinformation and even claim that “industry self-regulation has failed.” But the evidence doesn’t support this accusation. Since January 2020, Facebook, Google and Twitter have been monitoring and responding to COVID-19 misinformation and strengthening their moderation policies for election content. In a March 2021 declassified report, the National Intelligence Council concluded that social media companies proactively assisted government officials in countering online foreign threats during the presidential election.

That doesn’t mean these companies are doing everything right. Admittedly, social media firms should manage some aspects of the fake news dilemma by tweaking their algorithms, banning offending accounts and flagging inaccurate information, which they already do. But, at the end of the day, they cannot stop users from being deceived or being deceitful. Furthermore, requiring them to try would effectively give social media companies the final say on what is or is not the truth. Neither side of the political aisle should want that.

Scapegoating isn’t the solution. Punishing tech companies for their users’ content will do little to deter bad actors and much more to harm everyday users. Those who use social media to coordinate violence will continue to communicate effectively with one another through other means. They don’t need social media to operate in their small silos. Foreign nations will similarly find other ways to attack our country and its sovereignty, as they have for centuries.

Although radicals are and always will be online, users hoping to improve society are the ones who actually need social media to be as unshackled as it is. Racial justice, anti-harassment reform and democracy are just a few issues that have a groundswell of support because of hashtags and online conversations. These movements rely upon websites like Facebook and Twitter to engage with the public inexpensively and quickly about their cause. Lawmakers attempting to reduce misinformation and polarization need to give greater consideration to the digital discourse in its entirety. Legislation probably won’t reduce harm by bad actors, but it will almost certainly suppress the marginalized voices and perspectives finally being heard.

Social media companies have a role in combating fake news online, but not based on the ambiguous and unrealistic standards that members of Congress are trying to impose. Let firms continue to address misinformation, and let us users, the ones who need it most, elevate online conversations with peaceful ideas.

Rachel Chiu is a Young Voices contributor who writes about technology public policy. Her writing has been published in USA Today, Techdirt, The American Conservative, Real Clear Policy, and elsewhere. Follow her on Twitter @rachelhchiu.
