TikTok, Snapchat executives to make Capitol Hill debuts
TikTok and Snapchat executives will make their debut on Capitol Hill Tuesday, testifying for the first time before a Senate panel about safety precautions for young users.
Members of the Senate Commerce consumer protection subcommittee are expected to grill executives from the social media companies, along with a representative for YouTube, about their platforms’ effects on kids and teens. The hearing comes as some lawmakers look to use the building momentum from leaks about Instagram and Facebook to push for new regulations on tech companies.
TikTok, Snapchat and YouTube, which has previously dispatched executives to Capitol Hill, will likely face questions similar to those the panel posed to Facebook’s global head of safety, Antigone Davis, at a hearing earlier this month.
Josh Golin, executive director of Fairplay, a nonprofit that fights commercial marketing to children and excessive screen time, said lawmakers should put questions about the platforms’ amplification of harmful content front and center during the hearing.
The senators should grill the companies — especially TikTok and YouTube, which are driven by sophisticated algorithms that recommend video content to users — about personalizing content in ways that lead to “excessive time” on platforms and exposure to potentially harmful content, Golin said.
Although there’s no trove of leaked documents available for lawmakers to sink their teeth into, as they had with Facebook, TikTok’s head of public policy for the Americas, Michael Beckerman, may face questions about allegations that the company is violating the Children’s Online Privacy Protection Act (COPPA).
TikTok settled with the Federal Trade Commission in 2019 to resolve allegations that it violated COPPA. The company agreed to pay a $5.7 million fine, obtain parental permission before collecting personal information and delete information about users identified as under 13.
But last year Fairplay, then known as the Campaign for a Commercial-Free Childhood, was among 20 nonprofit groups that filed a complaint alleging TikTok violated the settlement agreement by not following through on those promises.
“I really hope that there is some pushing on that at this hearing because the problems with this space are not just the lack of new regulations, it’s also an enforcement problem and here we have a company that’s already been caught red-handed and hasn’t done nearly enough to address what they were caught red-handed for,” Golin said.
TikTok did not make a spokesperson available for comment. Asked previously about the allegations, a company spokesperson told The Hill that TikTok “takes the issue of safety seriously for all our users, and we continue to further strengthen our safeguards and introduce new measures to protect young people on the app.”
In addition to concerns about potentially harmful content, the companies have faced pressure over highly commercial and potentially manipulative advertising targeting kids.
YouTube’s vice president of public policy, Leslie Miller, is expected to push back on those complaints.
“We set a higher bar for what channels can make money on YouTube,” Miller is slated to say during her opening remarks.
“Channels that primarily target young audiences need to deliver high-quality content and comply with kids-specific monetization policies.”
But children’s safety advocates say efforts put in place by tech companies to safeguard children are often misguided or fruitless. For example, YouTube’s requirement for paid promotional content to be disclosed doesn’t help children who are not yet able to read — one of the most vulnerable groups for influencer marketing.
The company’s policies prohibit any paid promotional content on its YouTube Kids platform.
The American Academy of Pediatrics says that kids under the age of 6 have a hard time identifying ads, such as popular unboxing videos on YouTube, in which creators show off merchandise that is often supplied by companies.
YouTube also treats personal information from anyone watching children’s content on the platform as coming from a child, regardless of the user’s age, Miller will tell the panel. That means the company limits data collection and use, and disables some product features, on any content classified as “made for kids.”
The company’s track record with following through on those policies, though, has come into question before. Rep. Raja Krishnamoorthi (D-Ill.) sent the company a letter in April accusing the platform of allowing content labeled as “made for kids” to contain product placements.
The KIDS Act, a proposal Sens. Ed Markey (D-Mass.) and Richard Blumenthal (D-Conn.) reintroduced this year, would ban websites from recommending content with influencer marketing, such as unboxing videos, to children and young teens. The bill would also ban “auto-play” settings that advocates say can lead to prolonged viewing sessions online.
The proposal, though, has failed to amass bipartisan support despite Republicans, including subcommittee ranking member Marsha Blackburn (R-Tenn.), raising concerns similar to Democrats’ during recent hearings.
But advocates said they’re cautiously optimistic about proposals advancing given the heightened attention to the issue after Facebook whistleblower Frances Haugen leaked internal company documents and appeared before Congress to accuse the platform of putting profits over children’s well-being.
“I think this is a watershed moment, thanks to the egregious conduct of Facebook and Instagram,” said Jim Steyer, CEO and founder of Common Sense Media.
“I think change is on the horizon, and it generally starts with young people, because the lawmakers have to look in the mirror and ask themselves why they haven’t done anything about this for so long,” Steyer added.
Snapchat in particular may face questions about the use of anonymous features that advocates have criticized as amplifying cyberbullying.
Yolo, an anonymous question-and-answer app, was previously integrated with Snapchat. Snapchat suspended the integration after Yolo was sued by the mother of a 16-year-old who died by suicide after receiving disturbing and sexually explicit content “meant to humiliate him,” according to the complaint.
Snapchat declined to comment publicly ahead of the hearing on whether Yolo will be permanently barred from integrating with the app.
“In light of the serious allegations raised by the lawsuit, and out of an abundance of caution for the safety of the Snapchat community, we are suspending both YOLO and LMK’s Snap Kit integrations while we investigate these claims,” a company spokesperson said in a statement.
During the hearing, though, the company will offer a more in-depth description of the content moderation policies used for its Spotlight feature, which is similar to TikTok in allowing users to create and share short videos with the broader platform community.
Snapchat’s policy triggers a human review of any Spotlight post that receives at least 25 views. The review is meant to ensure a post isn’t gaining attention because of gruesome or derogatory material that the initial artificial intelligence review may have missed, according to the company.
The hearing is part of the subcommittee’s series focused on safety measures for kids, and the leaders of the panel have signaled that they will keep cracking down on the companies. For example, Blumenthal last week called for Facebook CEO Mark Zuckerberg or Instagram head Adam Mosseri to testify as lawmakers push the platform to permanently cancel its plans for an Instagram for kids under 13.
“I think that 2022 is going to be a major year for legislation and regulation regarding technology and social media, not just regarding kids and families, but I think the whole technological landscape,” Steyer said.