Facebook and Instagram algorithms shaped what content Americans saw in the 2020 US election, but did not measurably change their political attitudes

Scientists – including our scientific coordinator Drew Dimmery – have created a collaboration format that forms a reliable basis for future research in this field

The first results of a large collaboration between university scientists and Meta researchers, including Drew Dimmery from the University of Vienna's Research Network Data Science @ Uni Vienna, were released today. The project studied how algorithms on Facebook and Instagram affected Americans' behavior and attitudes during the three months around the 2020 US election, and the findings appear in a bundle of new peer-reviewed studies in Science and Nature. The scientists found that the Facebook and Instagram algorithms clearly shaped the content Americans saw during the 2020 US election period, including the amount of political news and untrustworthy content they encountered. Altering these algorithms, however, did not change users' political attitudes or off-platform behavior. These findings add crucial insight to the ongoing study of how technology affects the way its users interact with the world, including how social media platforms impact political outcomes such as elections.

These new findings are part of the most comprehensive research project to date examining the role of social media in American democracy. The structure of the collaboration was novel and was central to the reliability of the findings. The academic team proposed and selected specific research questions and study designs under an explicit agreement that Meta (owner of Facebook and Instagram) could reject such designs only for legal, privacy, or logistical (i.e., infeasibility) reasons. Meta could not restrict or censor findings, and the academic lead authors had final say over writing and research decisions. The study began in 2020.

With this unprecedented approach to industry collaboration, the main findings of the international team are:

  1. Algorithms are extremely influential in terms of what people see and in shaping their on-platform experiences.
  2. Three experiments run during the 2020 election period suggest that although algorithm adjustments significantly changed what people saw and how much they engaged on the platforms, the three-month experimental modifications did not measurably affect their attitudes or off-platform behaviors.

Drew Dimmery from the University of Vienna’s Research Network Data Science @ Uni Vienna participated in this collaboration as part of the Meta team, examining how the impact of algorithmic changes varied from person to person. “This is a critical part of answering questions about the effects of algorithmic changes on Facebook and Instagram, and the results echo the top-line findings of the studies: while there is some heterogeneity in how behavior changes on platform, this is not the case for attitudes and off-platform behavior”, explains Dimmery.

While these studies focus on the effects of social media in the U.S., their relevance extends to other countries, too. “Beyond the immediate results of the studies, the collaboration as a whole provides a blueprint for how to approach the crucial question of how technology impacts society”, says Dimmery. “This mode of close collaboration between internal researchers at large platforms and external experts, who have the final say over research and writing decisions, is key to answering societally important questions about how particular parts of a platform may affect behavior. As the EU’s Digital Services Act comes into effect, it’s vital to ensure that regulation is guided by rigorous science.”

The effects of altering the algorithm in different ways

The three experiments examined the effects of different algorithm adjustments. One removed reshared content on Facebook. This substantially decreased the amount of political news and content from untrustworthy sources people saw in their feeds, decreased overall clicks and reactions, and reduced clicks on posts from partisan news sources. Removing reshared content on Facebook produced a decrease in news knowledge but did not significantly affect political polarization or other individual-level political attitudes.

Another experiment compared the effects of chronological and personalized social media feeds. The chronologically ordered feed decreased the time users spent on Instagram and Facebook. At the same time, the chronological feed increased the amount of content from moderate friends and from sources with ideologically mixed audiences on Facebook, as well as the amount of political, untrustworthy and uncivil content. Despite these substantial changes in users’ on-platform experience, the chronological feed did not significantly alter levels of issue polarization, political knowledge or other key attitudes during the three-month study period.

A third study examined the impact of deprioritizing content from like-minded sources on Facebook. The researchers found that this had no measurable effect on attitudinal measures such as affective polarization, ideological extremity, candidate evaluations, and belief in false claims during the 2020 U.S. presidential election.

“Social scientists have been limited in the study of social media’s impact on U.S. democracy,” said Talia Jomini Stroud and Joshua A. Tucker, the academic leads of the studies. “We now know just how influential the algorithm is, but we also know that changing the algorithm for even a few months isn't likely to change people’s political attitudes. What we don't know is why. It could be because the length of time for which the algorithms were changed wasn’t long enough, or because these platforms have been around for decades already, or because while Facebook and Instagram are influential sources of information, they are not people’s only sources.”


Collaboration details

Academics from Dartmouth, Northeastern University, Princeton, Stanford, Syracuse University, University of California, Davis, University of Pennsylvania, University of Virginia and William & Mary are the lead authors of these initial studies. The lead researchers from the Meta team were Pablo Barberá for all four papers and Meiqing Zhang for the paper on ideological segregation. Meta project leads are Annie Franco, Chad Kiewiet de Jonge, and Winter Mason.


Original publications: 

  • Andrew Guess, Neil Malhotra, Jennifer Pan, Pablo Barberá, Hunt Allcott, Taylor Brown, Adriana Crespo-Tenorio, Drew Dimmery, Deen Freelon, Matthew Gentzkow, Sandra González-Bailón, Edward Kennedy, Young Mie Kim, David Lazer, Devra Moehler, Brendan Nyhan, Carlos Velasco Rivera, Jaime Settle, Daniel Thomas, Emily Thorson, Rebekah Tromble, Arjun Wilkins, Magdalena Wojcieszak, Beixian Xiong, Chad Kiewiet de Jonge, Annie Franco, Winter Mason, Natalie Jomini Stroud, and Joshua A. Tucker. "How do social media feed algorithms affect attitudes and behavior in an election campaign?" In press, Science. https://doi.org/10.1126/science.abp9364. Embargo until 27 July 2023, 20:00 CET.
  • Andrew Guess, Neil Malhotra, Jennifer Pan, Pablo Barberá, Hunt Allcott, Taylor Brown, Adriana Crespo-Tenorio, Drew Dimmery, Deen Freelon, Matthew Gentzkow, Sandra González-Bailón, Edward Kennedy, Young Mie Kim, David Lazer, Devra Moehler, Brendan Nyhan, Carlos Velasco Rivera, Jaime Settle, Daniel Thomas, Emily Thorson, Rebekah Tromble, Arjun Wilkins, Magdalena Wojcieszak, Beixian Xiong, Chad Kiewiet de Jonge, Annie Franco, Winter Mason, Natalie Jomini Stroud, and Joshua A. Tucker. "Reshares on social media amplify political news but do not detectably affect beliefs or opinions." In press, Science. https://doi.org/10.1126/science.add8424. Embargo until 27 July 2023, 20:00 CET.
  • Brendan Nyhan, Jaime Settle, Emily Thorson, Magdalena Wojcieszak, Pablo Barberá, Annie Y. Chen, Hunt Allcott, Taylor Brown, Adriana Crespo-Tenorio, Drew Dimmery, Deen Freelon, Matthew Gentzkow, Sandra González-Bailón, Andrew Guess, Edward Kennedy, Young Mie Kim, David Lazer, Neil Malhotra, Devra Moehler, Jennifer Pan, Carlos Velasco Rivera, Daniel Robert Thomas, Rebekah Tromble, Arjun Wilkins, Beixian Xiong, Chad Kiewiet de Jonge, Annie Franco, Winter Mason, Natalie Jomini Stroud, and Joshua A. Tucker. "Like-minded sources on Facebook are prevalent but not polarizing." In press, Nature. https://doi.org/10.1038/s41586-023-06297-w. Embargo until 27 July 2023, 20:00 CET.
