Algorithms Play Key Role in Shaping User Experience, But Have Little Effect on Political Beliefs, New Study Finds

Regulators and activists have long been concerned that social media platforms’ algorithms contribute to political division and the spread of conspiracy theories. In response to these concerns, Meta opened its internal data to university researchers to study the impact of Facebook and Instagram on the 2020 presidential election. The first results reveal that the algorithms used by Meta’s platforms play a significant role in steering users toward partisan information that aligns with their existing beliefs. However, the research also suggests that interventions designed to dampen viral, highly engaging content have minimal impact on users’ political beliefs.

The research project, led by the Center for Social Media and Politics at New York University, comprised four studies recently published in the journals Science and Nature. Conducted in collaboration with Meta’s own analysts, the studies examined how social media influences political polarization and people’s understanding of news, government, and democracy. Researchers altered the Facebook and Instagram feeds of thousands of users ahead of the 2020 election to test whether exposure to different information could change their political beliefs, knowledge, or level of polarization. Ultimately, the altered feeds had little effect on any of these measures.

Further studies are under way, including an examination of data collected after the attack on the U.S. Capitol on January 6, 2021. The research comes amid an ongoing debate over tech companies’ responsibility for combating harmful content on their platforms. Regulators have proposed rules that would require social media companies to be more transparent about their algorithms and to take greater responsibility for the content those algorithms promote.

The findings of these studies are expected to strengthen social media companies’ argument that algorithms alone are not the cause of political division and upheaval. Meta’s President of Global Affairs, Nick Clegg, cited the experimental findings as evidence that the company’s platforms do not cause harmful polarization and have minimal effects on political outcomes. However, critics argue that these results do not absolve tech companies of their role in amplifying division, political upheaval, and the spread of misinformation. Some advocates maintain that platforms should do more to combat viral misinformation and should be held accountable when they fail to do so.

Among the experiments, researchers assessed the impact of switching users’ feeds to display content chronologically rather than ranked by algorithm. The chronological timeline proved less engaging and exposed users to more political stories and untrustworthy content, yet it had little effect on levels of polarization or knowledge of political issues. This aligns with Meta’s internal research, which suggests that the algorithm-driven feed surfaces higher-quality content.

However, critics argue that the timing of these experiments may have influenced the results. For instance, by the time the researchers evaluated the impact of a chronological feed in the fall of 2020, many users had already joined large groups that flooded their feeds with potentially problematic content. In addition, Meta had rolled out election-protection measures in the months leading up to the vote, which may also have shaped what users saw.

Other experiments limited the visibility of viral content in users’ feeds, which exposed users to less political news. The change reduced engagement and lowered levels of political knowledge, but it did not affect users’ political polarization or attitudes.

Overall, the studies released thus far paint a complex picture of social media’s impact on political discourse. They reveal that users with different political affiliations interact with and consume news from vastly different sources. They also highlight that right-leaning content, which is more often rated false by third-party fact-checkers, is more prevalent on Facebook. Moreover, attempts to expose users to opposing viewpoints did not significantly change their political attitudes or their belief in false claims.

Researchers urge caution in interpreting the results, acknowledging that a different study period or location might have yielded different outcomes. The research was also conducted at a time when political engagement and exposure to information from a wide range of sources were already high.