A heated hearing with the CEOs of major social media companies last week reinvigorated the debate over the
Kids Online Safety Act (KOSA). The measure would regulate the type of content companies can show minors online and grant the Federal Trade Commission and state attorneys general power to enforce the rules.
While internet safety advocates tout the bill as a way to limit kids’ exposure to harmful content, such as posts promoting self-harm or eating disorders, other groups have warned the legislation could restrict young people’s access to information about gender identity, sexuality and reproductive health.
The bill, co-sponsored by Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), would require companies to limit minors’ access to, or allow them to opt out of, certain features, like video autoplay and algorithmic recommendations.
The bill would also legally obligate tech platforms to prevent the promotion of content about certain topics, such as suicide, eating disorders and self-harm.
Groups that oppose the legislation worry that the bill’s duty of care provision, broadly defined as “taking reasonable measures” to mitigate harm, leaves too much open to interpretation.
“We’re concerned that the bill allows for
50 different interpretations across 50 different states — some of which have already defined or articulated the harm that kids face online to be caused by just the mere exposure to concepts related to whether or not gender is a binary, or exposure to information related to reproductive health care,” said Aliya Bhatia, policy analyst with the Center for Democracy and Technology’s (CDT) Free Expression Project.
Some of CDT’s largest financial supporters are tech companies, including Meta and TikTok.
The bill’s supporters have pushed back against claims that KOSA will restrict minors from accessing certain information — especially after amendments to the version that advanced in July narrowed the definition of duty of care.
The measure would regulate only certain content that platforms promote, not what minors search for, said Haley Hinkle, policy counsel at the children’s online safety advocacy group Fairplay.
“KOSA is not about the existence or removal of any individual piece of content, it’s about the way a platform’s design and operation impact specific, defined harms. KOSA’s duty of care explicitly protects minors’ ability to search for content, including harm prevention or mitigation resources,” Hinkle said in a statement.
Read more in a full report at TheHill.com.