Klobuchar, Warner press tech leaders to take action against election-related disinformation
Democratic Sens. Amy Klobuchar (Minn.) and Mark Warner (Va.) sent a letter to major tech leaders Tuesday, urging them to take “decisive action” against disinformation related to the 2024 general election.
In a letter to the leaders of Meta, X, Discord, Twitch and Alphabet Inc., Klobuchar and Warner expressed “persisting concerns” over how election-related disinformation can go viral and reach millions of viewers. The lawmakers recommended “bolstering content moderation resources” to prevent deceptive content aimed at misleading voters or sowing violence.
“As [artificial intelligence] technology gets more sophisticated, voters will have an increasingly hard time knowing if what is being presented to them on your platforms about candidates or how to cast a ballot is real or fake,” the lawmakers wrote.
The letter adds to the growing scrutiny over what responsibilities social media platforms have, if any, in preventing the spread of disinformation as the election quickly approaches.
Earlier Tuesday, Meta announced it banned Russian state media from its social media platforms in the wake of the outlets’ “foreign interference activity.”
“After careful consideration, we expanded our ongoing enforcement against Russian state media outlets: Rossiya Segodnya, RT and other related entities are now banned from our apps globally for foreign interference activity,” Meta said in a statement shared with The Hill.
Warner and Klobuchar, in their letter, pointed to the Justice Department’s (DOJ) recent action against Russian government-backed efforts to interfere in the election. The DOJ announced earlier this month it seized more than two dozen web domains it said Russia was using for covert campaigns.
The DOJ also recently handed down an indictment accusing two RT employees of leading a covert influence campaign by partnering with the conservative media company Tenet Media to hire various right-wing influencers.
“Recent reports have raised significant questions about the extent to which online platforms are prepared to combat the threats presented by election-related misinformation, disinformation, and foreign influence efforts,” the lawmakers wrote. “Particularly in the context of safeguarding elections, it is vital that your companies maintain trust and integrity teams devoted to a number of functions related to addressing malicious activity, including content moderators, incident responders, legal compliance personnel, digital forensic specialists, and investigators.”
The lawmakers laid out a series of questions for the tech leaders, including what actions the companies have already taken or plan to take to counter election misinformation, along with how they are addressing deceptive artificial intelligence (AI)-generated content related to candidates and elections.
They also asked what policies the companies have in place for entities that impersonate legitimate media organizations.
The lawmakers requested that the companies respond by Oct. 1.
The Hill reached out to the companies for further comment.
Meta, in a release last November, laid out how its platforms are preparing for the 2024 election, touting more than $20 billion in investments in safety and security since 2016.
“While much of our approach has remained consistent for some time, we’re continually adapting to ensure we are on top of new challenges, including the use of AI. We’ve also built the largest independent fact-checking network of any platform, with nearly 100 partners around the world to review and rate viral misinformation in more than 60 languages,” the platform said at the time.