
YouTube more likely to recommend election-fraud content to those skeptical of the 2020 election: study

“Our findings uncover the detrimental consequences of recommendation algorithms and cast doubt on the view that online information environments are solely determined by user choice.”

Story at a glance


  • The proliferation of online misinformation pertaining to elections has been top of mind for some social media companies in the wake of the 2016 presidential election.

  • In an effort to determine what type of YouTube content is fed to those skeptical of the 2020 election outcome, researchers conducted a study of more than 300 Americans. 

  • Findings showed those more skeptical of the election results were fed more election-fraud content, potentially perpetuating false beliefs. 

Tailoring algorithms to user interests is a common practice among social media companies. But new research underscores the harms this practice can yield, especially when it comes to public perception of election legitimacy. 

A study conducted by researchers at New York University found YouTube was more likely to recommend election-fraud videos to those skeptical of the 2020 election results than to those who were less skeptical. Researchers say this practice may perpetuate existing misconceptions.

Those who were most skeptical of the presidential election outcome were shown up to three times as many election-fraud videos in their feeds as those who were least skeptical of the outcome. This accounted for about eight additional recommendations out of roughly 400 in total. 

“Narratives of fraud and conspiracy theories proliferated over the fall of 2020, finding fertile ground across online social networks, although little is known about the extent and drivers of this spread,” authors wrote.




To better understand the association, researchers recruited more than 300 Americans with YouTube accounts in November and December 2020 and surveyed them about their level of concern regarding the legitimacy of the presidential election.  

A browser extension was installed to record the number and content of recommended videos displayed to users. Participants were instructed to click on a randomly assigned “seed” video and then to follow one recommendation after each video, according to a randomly assigned rule; for example, some participants were assigned to always click on the second video displayed in the recommendations. This way, researchers could isolate the algorithm’s influence on users in real time.
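The assigned rule amounts to a simple traversal: start from the seed video and always click the recommendation at a fixed position in the list. The sketch below illustrates that idea in Python. It is purely illustrative; the `mock_recommendations` function and all other names are placeholders and do not represent the study's browser extension or YouTube's actual recommendation system.

```python
import random

def mock_recommendations(video_id: str, n: int = 20) -> list[str]:
    """Placeholder for the list of recommendations shown next to a video."""
    return [f"{video_id}-rec{i}" for i in range(1, n + 1)]

def traverse(seed_video: str, rule_index: int, steps: int = 20) -> list[str]:
    """Follow recommendations by always clicking the video at a fixed
    position (rule_index, 0-based) in each recommendation list, and
    record the path of videos visited."""
    path = [seed_video]
    current = seed_video
    for _ in range(steps):
        recs = mock_recommendations(current)
        current = recs[rule_index]  # e.g. rule_index=1 means "always click the second video"
        path.append(current)
    return path

if __name__ == "__main__":
    assigned_rule = random.randrange(5)  # each participant gets one randomly assigned rule
    print(traverse("seed123", assigned_rule, steps=5))
```

Because the clicking rule is fixed and random rather than chosen by the participant, any difference in the election-fraud content recommended along these paths can be attributed to the algorithm rather than to user choice.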

Despite the overall low prevalence of election-fraud videos displayed in the study, “Our findings uncover the detrimental consequences of recommendation algorithms and cast doubt on the view that online information environments are solely determined by user choice,” said co-author James Bisbee, an assistant professor at Vanderbilt University, in a statement.

Displaying election-fraud content to those most concerned about the election’s legitimacy is one way misinformation, disinformation, and conspiracy theories can reach those most likely to believe the false claims, researchers explained.  

“Many believe that automated recommendation algorithms have little influence on online ‘echo chambers’ in which users only see content that reaffirms their preexisting views,” Bisbee continued. 

However, results of the current study highlight “the need for further investigation into how opaque recommendation algorithms operate on an issue-by-issue basis.”

With campaigning for the 2022 midterm elections in full swing, many social media companies are working to mitigate the spread of misinformation that plagued previous election cycles. 

YouTube recently announced new policies aimed at addressing election-related misinformation, including adding context to midterm-related videos in both English and Spanish.

