
YouTube's algorithm was responsible for showing rigged-election videos to Trump supporters

According to a new study, YouTube’s recommendation algorithm pushed more videos about election fraud to people who were already skeptical about the legitimacy of the 2020 election. Although such videos made up a small share of recommendations overall, the most skeptical YouTube users were shown roughly three times as many of them as the least skeptical users.

“The more receptive you are to these types of narratives about the election…the more content about that story you would be suggested,” says study author James Bisbee, who is now a political scientist at Vanderbilt University.

In the aftermath of his defeat, former President Donald Trump has promoted the false claim that the 2020 election was stolen, even calling for a new election as recently as this week. While allegations of widespread voter fraud have been repeatedly debunked, repeating those claims remains a profitable strategy for conservative media figures, whether in podcasts, films, or online videos.

Bisbee and his research team were already running a study on how often harmful content in general is recommended to users when the election took place. “We were overlapping with the United States presidential election and the following propagation of misinformation about the outcome,” he explains. So they took advantage of the timing to investigate how the algorithm recommended content related to election fraud.

The research team surveyed over 300 people about the 2020 election, including how concerned they were about fraudulent ballots and foreign government interference. Participants were polled between October 29th and December 8th, and those surveyed after election day were also asked whether the election’s outcome was legitimate. The team also tracked participants’ YouTube experience: each person was assigned a starting video and a path to follow across the site, such as clicking the second recommended video each time, producing a chain of recommendations (a sketch of this kind of trace appears below).
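The article doesn’t describe the researchers’ actual tooling, but the “follow the Nth recommendation” procedure can be sketched roughly as follows. This is a hypothetical illustration: the function names, the get_recommendations helper, and the parameters are assumptions, not the study’s real code or YouTube’s API.

```python
# Hypothetical sketch of a "recommendation trace": starting from a seed video,
# repeatedly follow the same-ranked recommendation (e.g. always the 2nd one)
# and record every video visited. get_recommendations() is a stand-in for
# however the study actually collected YouTube's suggestions; it is not a real API.
from typing import Callable, List


def trace_recommendations(
    seed_video_id: str,
    get_recommendations: Callable[[str], List[str]],  # assumed helper: video id -> ranked recommendation ids
    follow_rank: int = 2,   # e.g. always "click" the 2nd recommended video
    depth: int = 20,        # how many hops to follow along the path
) -> List[str]:
    """Return the ordered list of video ids visited along one recommendation path."""
    path = [seed_video_id]
    current = seed_video_id
    for _ in range(depth):
        recs = get_recommendations(current)
        if len(recs) < follow_rank:
            break  # not enough recommendations to continue the walk
        current = recs[follow_rank - 1]
        path.append(current)
    return path
```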

The team combed through all of the videos recommended to participants and identified those related to election fraud. They also classified each video’s stance, noting whether it was neutral about claims of election fraud or endorsed election misinformation.

According to the findings, participants who were most skeptical of the election were recommended an average of eight more election-fraud videos than those who were least skeptical: skeptics saw an average of 12 such videos, while non-skeptics saw an average of four. The kinds of videos differed as well; those shown to skeptics were more likely to endorse election-fraud claims.
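The comparison reported here amounts to grouping participants by skepticism and averaging how many fraud-related videos each group was shown. A minimal sketch of that aggregation, using hypothetical column names and illustrative counts rather than the study’s data, might look like this:

```python
# Minimal aggregation sketch: the column names ("skepticism_group",
# "fraud_videos_recommended") and the counts are illustrative assumptions.
import pandas as pd

participants = pd.DataFrame({
    "skepticism_group": ["most_skeptical", "most_skeptical", "least_skeptical", "least_skeptical"],
    "fraud_videos_recommended": [13, 11, 5, 3],  # made-up counts for illustration
})

# Average number of election-fraud videos recommended per group
group_means = participants.groupby("skepticism_group")["fraud_videos_recommended"].mean()
print(group_means)
# The article reports averages of roughly 12 for the most skeptical group
# and 4 for the least skeptical, a gap of about eight videos.
```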

In short, Bisbee’s team observed a small but significant push from the algorithm toward misinformation for the people most likely to believe it. The nudge may be specific to election-fraud content; the study cannot establish whether the same holds for other kinds of misinformation. It does, however, suggest that there is still much to learn about the role recommendation algorithms play.
