Big Tech should pay for damaging mental health
A Wall Street Journal investigation recently uncovered internal research showing Instagram is harmful to the mental health of roughly a third of teen girls, with Black girls impacted most severely. This is a grave threat to the health of our youth, and Congress and the Biden administration must aggressively pursue a regulatory and research agenda to address it.
The lack of action to date may be because policymakers believed there was still some ambiguity in the research — ambiguity caused in part by the refusal of social media giants to open their data to outside researchers. Though some amount of ambiguity exists for almost any major health threat, social media companies use their lack of transparency to claim ignorance, as Mark Zuckerberg did in March when asked about mental health by Rep. Cathy McMorris Rodgers (R-Wash.).
Here is where there is no ambiguity: Depression rates among teenagers began increasing at exactly the moment social media became prevalent. In 2009, Facebook became the most popular social platform in the world. In the decade that followed, rates of major depression among teens doubled. Our screening data at Mental Health America show a similar trend.
As Congress and the administration increasingly recognize the problem, the question then becomes how to fix it. How would we begin to address, or measure, the impact of social media on children’s mental health? Here is just one approach to consider as Sen. Richard Blumenthal (D-Conn.) and Sen. Marsha Blackburn (R-Tenn.) arrange a hearing with social media executives for Sept. 30.
The federal government should create financial incentives for social media companies like Facebook to protect and promote children’s mental health, but not by offering more money to these companies. Instead, the federal government should withhold a portion of the companies’ revenue — a fair amount that could be considered “ill-gotten” because it was earned by targeting children. Companies would then get the chance to earn this revenue back by demonstrating how they are positively impacting (or at least not harming) children.
To earn it back, companies would need to share data with a government agency charged with oversight. The agency would work with young people, researchers and other stakeholders to create fair and transparent methods for assessing the impacts of the social media platform on children’s mental health, based on the data submitted (which could include requests for reasonable additional data collection). Measuring this would be a new area of regulatory science, similar to how the Food and Drug Administration and the Environmental Protection Agency created methods for determining the impact of different factors on health.
If the agency finds no harm to children, the company gets its withheld revenue back. If the agency finds a net positive impact on children, the company gets back its own withheld revenue plus a share of the money that other companies failed to earn back.
To protect innovation, only companies of a certain size would be subject to these policies. Large companies would thus have a financial incentive to help children rather than harm them, without receiving any additional money from the federal government. To reduce costs, the agency itself could be funded through user fees from companies submitting data.
While it might sound revolutionary, this approach would not be new — all of these policy strategies are already used in the health care industry. Earn-back incentives, for example, are common as part of the push to reduce overall spending on health care while also improving outcomes.
The FDA, to take another example, is funded in part through user fees. Congress regularly enacts a comprehensive, bipartisan statute describing those fees and the activities they fund. Companies seeking FDA approval frequently have to submit data from various assessments and processes to demonstrate safety.
Ultimately, this kind of approach would apply the best thinking in health policy to tackle the growing crisis in children’s mental health, fueled by social media. At the same time, it would avoid mandates and keep costs to the government low. It would also give social media companies a strong incentive to develop new ideas for making a positive impact.
This is just one approach, but it deserves significant consideration. We can’t keep scrolling past this issue. In part through this unethical targeting of children, Facebook earned over $29 billion in net income in 2020 — an increase of 58 percent from 2019. Congress has an opportunity to create financial carrots and sticks that motivate social media companies to improve youth well-being. Otherwise, Facebook and others will wait for this news cycle to pass and stop conducting internal research on this altogether.
Nathaniel Counts is the senior vice president of Behavioral Health Innovation at Mental Health America and clinical assistant professor in the Department of Psychiatry and Behavioral Sciences at the Albert Einstein College of Medicine.