A study found that Facebook Pages and Groups create ideological echo chambers

New research published Thursday provides an unprecedented look at political behavior on Facebook and Instagram, two major online platforms where people express and discuss their political views. Four papers in Science and Nature, produced by an interdisciplinary team of outside researchers working with Meta's internal research groups, examined behavior on both platforms around the 2020 U.S. election.

The papers grew out of the 2020 Facebook and Instagram Election Study (FIES), a rare collaboration between Meta and the scientific research community. The project was led by University of Texas Professor Talia Jomini Stroud of the Center for Media Engagement and NYU Professor Joshua A. Tucker, co-director of the Center for Social Media and Politics.

Complex findings

Researchers examined ideological echo chambers on Facebook to determine how much political content users saw from sources aligned with their own views. According to the researchers, Facebook is ideologically segregated.

The data revealed two intriguing findings. First, Facebook Groups and Pages showed more "ideological segregation" than posts from users' friends. The researchers wrote that Pages and Groups do more to segregate and polarize audiences than content shared by individual users.

That may seem obvious, but Groups and Pages have been instrumental in spreading misinformation and uniting like-minded users around dangerous shared interests, such as QAnon, far-right groups like the Proud Boys, and life-threatening health conspiracies. Misinformation and extremism experts have long worried about the role these two products play in political polarization and conspiracy movements.

“Our results reveal the impact of Facebook Pages and Groups on the online information environment,” the researchers wrote. Pages and Groups benefit from the easy reuse of content from established political news producers and provide a curation mechanism for ideologically consistent content from many sources.

Second, the study found a large political content gap on Facebook. Conservative users are exposed to more online political misinformation than left-leaning users: Meta's third-party fact-checking system found a "far larger" share of conservative Facebook news content to be false.

“Pages and Groups misinformation has audiences that are more homogeneous and completely concentrated on the right,” the researchers wrote.

In another experiment conducted with Meta's help, Facebook and Instagram users had their algorithmic feeds replaced with a reverse chronological feed, long the rallying cry of those fed up with social media's endless scrolling and addictive designs. The change didn't affect users' political views, offline engagement, or political knowledge.

The experiment did produce one major change, however. "We found that users in the Chronological Feed group spent dramatically less time on Facebook and Instagram," the authors wrote, illustrating how Meta's algorithmic ranking boosts engagement and encourages addictive behavior.

These findings represent only a portion of the results, with more papers to come. Meta has portrayed the new studies as a win, reducing complex findings to a publicity opportunity. Regardless of Meta's interpretation and the unusual arrangement between the researchers and the company, this data is crucial for future social media research.
