YouTube’s algorithm pushed rigged-election videos to Trump supporters
According to a new study, YouTube’s recommendation algorithm pushed more videos about election fraud to people who were already skeptical about the legitimacy of the 2020 election. Although videos about election fraud were relatively rare overall, the most skeptical YouTube users saw roughly three times as many of them as the least skeptical users.
“The more receptive you are to these types of narratives about the election…the more content about that story you would be suggested,” says study author James Bisbee, a political scientist now at Vanderbilt University.
Since his defeat, former President Donald Trump has promoted the false claim that the 2020 election was stolen, even calling for a new election as recently as this week. Although allegations of voter fraud have been widely debunked, repeating those debunked claims remains a profitable strategy for conservative media figures, whether in podcasts, films, or online videos.
Bisbee and his research team happened to be running a study on how often harmful content in general is recommended to users. “We were overlapping with the United States presidential election and the following propagation of misinformation about the outcome,” he explains. So they took advantage of the timing to investigate how the algorithm recommended content related to election fraud.
The research team surveyed more than 300 people about the 2020 election, including how concerned they were about fraudulent ballots and foreign government interference. Participants were surveyed between October 29th and December 8th, and those surveyed after election day were also asked whether the election’s outcome was legitimate. The team also tracked participants’ YouTube activity: each person was assigned a starting video and a rule for moving through the site, such as always clicking the second recommended video.
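That assignment, a starting video plus a fixed click rule, amounts to a simple traversal of the recommendation graph. The sketch below illustrates the idea; `get_recommendations` is a hypothetical stand-in for however the researchers read YouTube’s recommendation sidebar, not their actual tooling.

```python
def get_recommendations(video_id):
    """Hypothetical stand-in: return an ordered list of recommended video IDs.

    Here we fabricate deterministic IDs purely for illustration; in the real
    study, recommendations came from participants' actual YouTube sessions.
    """
    return [f"{video_id}-rec{i}" for i in range(1, 6)]

def follow_trail(start_video, rank, steps):
    """Follow a fixed path through recommendations: from each video,
    always click the recommendation at position `rank` (1-indexed)."""
    trail = [start_video]
    current = start_video
    for _ in range(steps):
        recs = get_recommendations(current)
        current = recs[rank - 1]  # e.g. rank=2 means "second recommended video"
        trail.append(current)
    return trail

# A participant told to always click the second recommendation, three clicks deep:
print(follow_trail("seed", rank=2, steps=3))
# → ['seed', 'seed-rec2', 'seed-rec2-rec2', 'seed-rec2-rec2-rec2']
```

Holding the click rule fixed across participants is what lets the researchers attribute differences in the resulting trails to the algorithm’s personalization rather than to individual browsing choices.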
The team reviewed all of the videos recommended to participants and identified those related to election fraud. They also classified each video’s stance on election fraud: whether it was neutral on the claims or endorsed election misinformation.
According to the findings, the participants who were most skeptical of the election were recommended an average of eight more videos about election fraud than the least skeptical participants: skeptics saw an average of 12 such videos, while non-skeptics saw an average of four. The kinds of videos differed as well; those recommended to skeptics were more likely to endorse election-fraud claims.
In other words, Bisbee’s team observed a small but significant push from the algorithm toward misinformation for the people most likely to believe it. The nudge may be specific to election-fraud content; the study cannot establish whether the same holds for other kinds of misinformation. It does suggest, however, that there is still much to learn about the role recommendation algorithms play.