Remember when you only posted brunch pics? Yeah, neither do we.
Over the past few years, many have noticed a startling shift in how people engage on social media. Friends who were once casual participants in online conversations now seem to have adopted more polarized, sometimes extreme, views. If this sounds familiar, you're not alone. The COVID-19 pandemic reshaped many aspects of life, including the way we consume information, interact online, and express our opinions.
Social media platforms saw record-high engagement during the lockdown periods, with billions of people spending extended hours scrolling through feeds that became echo chambers of like-minded views or arenas for controversial debates. But why did so many people who were previously moderate or disengaged become radicalized during this period?
Isolation and Overconsumption of Information
The pandemic created an environment unusually ripe for reshaping opinions. Isolation from traditional social settings, coupled with increased screen time, meant that for many, social media became the primary window to the world. Platforms like Twitter, Facebook, and YouTube were not just outlets for connecting with others but also sources of constant news, often with strong biases.
Research shows that social isolation, combined with heightened exposure to divisive content, can lead to shifts in personal beliefs, making individuals more susceptible to extreme viewpoints. Algorithms designed to keep users engaged exacerbated this, as they continually fed users content that matched their interests or incited strong emotional reactions. As a result, users found themselves deep in social media bubbles, where opposing views were often demonized and complex issues were reduced to binary choices.
The Role of Algorithms in Shaping Beliefs
The algorithms driving social media platforms are designed to maximize engagement, often by promoting sensationalist content. Studies have shown that emotionally charged posts, whether they incite fear, anger, or outrage, tend to generate more interactions than neutral or positive ones. Over time, these algorithms adapt to an individual's preferences, creating a feedback loop where users are continuously exposed to content that reinforces their existing beliefs.
For example, a person who engaged with one piece of conspiracy theory content might quickly find their feed filled with similar material, pushing them deeper into a specific ideological rabbit hole. This phenomenon has been labeled "algorithmic radicalization," and it has been a major concern for researchers and policymakers alike.
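To make that feedback loop concrete, here is a minimal, purely illustrative simulation in Python. It is not any platform's actual recommender: the content categories, engagement probabilities, and learning rate are all invented assumptions, chosen only to show how "engagement updates the profile, and the profile steers recommendations" can concentrate a feed.

```python
import random

# Illustrative toy model only -- NOT any platform's real recommender.
# Categories, base rates, and the learning rate are invented assumptions.

CATEGORIES = ["neutral", "partisan", "conspiracy"]

def engagement_probability(category, affinity):
    # Assumption: emotionally charged content starts with a higher base
    # engagement rate, and engagement also scales with learned affinity.
    base = {"neutral": 0.10, "partisan": 0.20, "conspiracy": 0.25}[category]
    return min(1.0, base + 0.6 * affinity[category])

def recommend(affinity):
    # Serve whichever category has the highest expected engagement.
    return max(CATEGORIES, key=lambda c: engagement_probability(c, affinity))

def simulate(steps=50, learning_rate=0.1, seed=42):
    random.seed(seed)
    affinity = {c: 1 / len(CATEGORIES) for c in CATEGORIES}  # start undecided
    for _ in range(steps):
        shown = recommend(affinity)
        if random.random() < engagement_probability(shown, affinity):
            # Engagement nudges the profile toward what was just shown,
            # closing the loop: more exposure -> more affinity -> more exposure.
            affinity[shown] += learning_rate * (1 - affinity[shown])
    return affinity

print(simulate())  # affinity typically piles up on one charged category
```

Run with different seeds, the affinity scores almost always concentrate on whichever category starts with the strongest engagement pull: a toy version of the rabbit hole described above.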
The Numbers Tell a Story
A study conducted in 2021 found that social media usage surged by 61% during the pandemic, with a corresponding increase in the consumption of news and political content. Of particular note, platforms that leaned heavily on user-generated content, such as Facebook and YouTube, saw the largest spikes in radical and extreme viewpoints being shared. A 2018 MIT study found that false news stories were about 70% more likely to be retweeted than true ones, adding fuel to the fire of misinformation.
The shift in online behavior isn’t exclusive to one political or ideological group. Both sides of the spectrum saw individuals becoming more entrenched in their beliefs, often as a response to the constant barrage of conflicting information, sensationalism, and misinformation circulating online. The result? A more polarized, divided online space.
Picture a chart of algorithmic radicalization from 2018 to 2023 under a hypothetical scenario: radicalized content consumption spikes during the pandemic (2020-2021), peaks around 2022, and remains elevated afterward. That shape captures how social isolation and increased screen time during the pandemic could have contributed to the trend, as algorithms continued to push increasingly polarizing content.
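Here is one way to render that hypothetical curve, assuming matplotlib is available; every number below is invented to match the shape just described and reflects no real dataset.

```python
import matplotlib.pyplot as plt

# Hypothetical illustration only: values are invented to sketch the trend
# described in the text (pandemic spike in 2020-2021, peak around 2022,
# still elevated in 2023). They are not measurements from any real dataset.
years = [2018, 2019, 2020, 2021, 2022, 2023]
radicalized_index = [12, 14, 24, 30, 33, 29]  # arbitrary units

plt.plot(years, radicalized_index, marker="o")
plt.title("Hypothetical rise in radicalized content consumption")
plt.xlabel("Year")
plt.ylabel("Radicalized-content index (arbitrary units)")
plt.show()
```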
Real-world examples of algorithmic radicalization can be observed across various social media platforms. Below are some notable instances where algorithms have contributed to the spread of extreme or polarized content:
YouTube’s Algorithm and Political Extremism
Example: Numerous studies have shown that YouTube’s recommendation system has driven users toward increasingly extreme political content. The platform’s algorithm is designed to maximize watch time, often leading users from mainstream content to more extreme videos on both ends of the political spectrum.
Impact: A 2019 study found that YouTube's algorithm disproportionately recommended far-right videos to users who started with neutral or conservative content. This phenomenon of “rabbit holes” led some users into deep conspiracy theories, such as QAnon, or extremist ideologies.
Facebook’s Algorithm and Polarization
Example: Facebook’s "Groups" feature has been heavily criticized for radicalizing users by pushing them toward closed, like-minded communities. The platform's recommendation algorithms frequently suggest groups that align with users’ interests, which can lead people deeper into echo chambers where more extreme views are reinforced.
Impact: Facebook’s internal research, as reported in 2021, showed that the algorithm was responsible for fueling political polarization by recommending content and groups that were increasingly partisan and incendiary. This was particularly evident during the 2020 U.S. elections.
Reddit and the Rise of Extremist Communities
Example: Reddit, while smaller than YouTube or Facebook, has also seen algorithmic radicalization within certain subreddits. Communities like "The_Donald" and "ChapoTrapHouse" became breeding grounds for radical political ideologies. Amplified by Reddit's engagement-based ranking, these subreddits attracted more users seeking extreme content, which ultimately led to their bans.
Impact: Reddit's vote-driven ranking, which boosts the posts with the most engagement, played a role in amplifying radical content. Users often found themselves in hyper-polarized discussions that further hardened extremist views; a simplified sketch of that kind of ranking follows below.
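For the curious, here is a simplified sketch of an engagement-weighted "hot" score, loosely modeled on the ranking formula Reddit open-sourced years ago. Treat the constants and reference epoch as historical, illustrative details; production ranking has long since evolved beyond this.

```python
from datetime import datetime, timedelta, timezone
from math import log10

# Simplified sketch of an engagement-weighted "hot" score. Loosely based on
# Reddit's formerly open-sourced ranking; constants are illustrative.
EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot(ups: int, downs: int, posted: datetime) -> float:
    score = ups - downs
    order = log10(max(abs(score), 1))   # log scale: early votes count most
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (posted - EPOCH).total_seconds()
    # The time term rewards recency, so a burst of fresh engagement can
    # outrank older, better-vetted posts -- the amplification dynamic above.
    return round(sign * order + seconds / 45000, 7)

now = datetime.now(timezone.utc)
print(hot(500, 20, now - timedelta(days=1)))  # day-old post, heavy engagement
print(hot(60, 5, now))                        # fresh post, modest engagement
```

Under these constants the fresh post outranks the day-old one despite far fewer votes, since a single day of recency is worth nearly two orders of magnitude in net votes; that recency-plus-engagement tilt is what lets fast-moving, emotionally charged threads dominate.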
The Challenge: Reflecting on Your Own Social Media Journey
If you were active on social media before the pandemic, now might be a good time to take a look back at your posts from that period. How has your tone, content, or engagement changed since then? Have you become more vocal about certain issues, or have you shifted your stance on topics you once felt indifferent about?
Self-reflection is key. While the pandemic might have accelerated some changes, the environment we find ourselves in today—one of heightened tension, divided opinions, and polarized platforms—is a product of both internal and external forces. Recognizing how social media has influenced our own beliefs is the first step toward understanding the larger social shifts at play.