A slate of research papers published Thursday suggests that the algorithms driving Facebook and Instagram are not entirely to blame for those platforms’ political polarization, as some previously believed.
Taken together, the studies suggest that Facebook users seek out content that aligns with their views, and that “echo chambers” allow different political groups to rely on, interact with, and consume divergent sources of information and misinformation.
The researchers behind the four papers, published in Science and Nature, had unprecedented access to Facebook and Instagram data from around the 2020 election, and they modified different parts of the sites’ algorithms to test their impact on users’ political beliefs and polarization.
While the algorithms have drawn criticism for their impact on politics, including in attempts to regulate social media sites, the research found that the algorithms themselves do not have a major effect on polarization.
“We find that algorithms are extremely influential in people’s on-platform experiences and there is significant ideological segregation in political news exposure,” University of Texas Professor Talia Jomini Stroud, a lead researcher on the studies, told The Associated Press. “We also find that popular proposals to change social media algorithms did not sway political attitudes.”
Social media algorithms generally recommend content to users based on what the algorithm predicts they want to see and will interact with. That has fueled concerns about disinformation cycles, in which users are fed ever more mis- and disinformation that reinforces their political beliefs.
Conflicts over regulating Facebook’s algorithms and how the company pays content creators, including news outlets, have already led Meta to pull news from its platforms in Canada, and have sparked a similar threat in California.
But the research found that when algorithmic ranking was turned off and users were shown content in chronological order instead, polarization did not decrease. Similarly, when researchers disabled resharing on users’ feeds, polarization stayed about the same while the spread of misinformation decreased significantly.
One of the studies reduced the amount of content users saw from like-minded, politically aligned sources, but that too had little impact on polarization or political opinion. In each study, the changes led users to spend less time on the platforms overall.
Northeastern University Professor David Lazer, who worked on all four papers, told The Associated Press that the algorithm serves users what they already want to see, “making it easier for people to do what they’re inclined to do.”
Meta lauded the studies in a company memo, saying they prove that the sites’ algorithms are not malicious.
“Despite the common assertions that social media is ‘destroying democracy,’ the evidence in these and many other studies shows something very different,” the company said.
Critics of Meta, the social media giant that owns Facebook and Instagram, called the studies “limited,” noting that researchers had access only to the data Meta chose to share.
“Meta execs are seizing on limited research as evidence that they shouldn’t share blame for increasing political polarization and violence,” Nora Benavidez, senior counsel to the nonprofit Free Press, said in a statement. “Studies that Meta endorses, which look piecemeal at narrow time periods, shouldn’t serve as excuses for allowing lies to spread.”
The studies also pull back the curtain on how users of different political beliefs behave on the platforms. Conservatives are the most likely to read and share misinformation, for example, and have the widest range of sources catering to their beliefs.
About 97 percent of the sites that spread misinformation were more popular among conservatives than liberals, the research found.
Lazer called the limits Meta placed on the data reasonable, noting they were mostly to protect user privacy, and said more findings are on the way.
“There is no study like this,” he said. “There’s been a lot of rhetoric about this, but in many ways the research has been quite limited.”
Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.