
Don’t sue or ban social media companies. Force them to change.

On Oct. 8, 14 states sued TikTok for intentionally addicting young users and harming their mental health. The company already faces a ban in the U.S. if it’s not sold by its Chinese owners to a U.S. firm by January.

But lawsuits and bans against TikTok are moving a critical conversation about social media regulation in the wrong direction. The anti-TikTok discourse has failed to acknowledge how American-owned social media apps like Instagram, X and Facebook are being used to spread misinformation, hate speech and anxiety among teens.

Although it is great that lawmakers are finally awakening to the dangers of social media, we need them to require all social media companies to adopt design solutions that will be beneficial, feasible and enforceable across multiple platforms.

Importantly, the new lawsuit recognizes how TikTok has been reshaping the daily lives of teenagers. As mounting evidence shows, use of social media apps, including Instagram and X, is highly correlated with self-harm, anxiety, depression and social isolation. Such concerns are backed by a plethora of empirical studies, including data collected, concealed and ignored by the tech companies themselves.

That American teenagers, especially girls, are faced with an epidemic of body dysmorphia, self-hatred and attention deficit is one of the most pressing problems of our tech-dependent lives. Yet solutions like the TikTok ban or the recently announced “Teen Accounts” on Instagram, despite offering more privacy and safety controls, are insufficient to address how algorithmic recommendation systems are strategically designed to exploit cognitive vulnerabilities and maximize screen time.

Disturbing and violent TikTok trends like the viral “Blackout Challenge,” which led to the deaths of at least 15 children who tried to engage in self-asphyxiation, are a horrifying and heartbreaking illustration of where the algorithmically supported rabbit hole might lead.

Solutions suggested so far are likely to be ineffective. Adding Teen Accounts to Instagram — which, among other changes, will stop notifications to minors from 10 p.m. to 7 a.m. to promote sleep — is a step in the right direction, but it might be difficult to enforce. As my Yale undergraduate students repeatedly tell me, they are adept at finding workarounds to avoid age restrictions on mobile apps, from changing their date of birth to using a fake profile or borrowing an ID from an older sibling.

On the other hand, banning TikTok probably won’t help either. As any parent who has tried to throw away their child’s favorite stuffed animal painfully learns, total prohibitions can easily backfire. Banning the most popular social media platform is likely to strengthen dependence on Instagram, X and Snapchat, which are just as addictive and potentially toxic.

Instead, we can look outside the social media sphere for potential solutions. For instance, regulatory agencies should mandate certain design features and prohibit others. 

We’re starting to see early steps in this direction: In July, the U.S. Federal Trade Commission announced the results of a study into the use of “dark patterns” — manipulative design techniques — that can put users’ privacy at risk or push them to buy products they otherwise wouldn’t have. 

One such pattern is “nagging,” which involves repeatedly asking the consumer to perform an action. In TikTok’s case, this might look like prompting users to upload their contacts to the app every time they open it, even after they refuse to do so.

To counter the harms of dark patterns, the FTC, much like California lawmakers with their recently passed AI regulations, signaled that interface design decisions should be subject to regulation. 

In the case of social media apps, such regulatory measures might include requiring and enforcing daily screen time limits; ensuring that content moderation — especially for viral videos or trends — is consistent, rigorous and aimed at protecting young users; and, much like a new California bill targeting AI-generated images and videos, adding watermarks to clearly distinguish ads and paid content.

To be sure, an app that automatically locks users out after two hours and only lets them log in again after 12 might provoke forceful resistance from both tech companies and teenagers. That intense lobbying by tech companies successfully buried a more sweeping AI bill in California attests to the uphill battle any politician or lawmaker might face. But imposing such restrictions would be more effective than banning an app outright.

Young Americans understand that social media can easily turn toxic, and some have already paid with their lives. A design-based approach to harm reduction can hold every social media app — regardless of ownership — responsible for the well-being of its users. 

To save lives, we need to start with regulating the interface. 

Neta Alexander teaches algorithmic and media literacy at Yale University. Her book, Interface Frictions, is forthcoming in 2025 from Duke University Press. She is a Public Voices Fellow of Yale and The OpEd Project.
