Lots of people love social media because their feeds are a safe place to share pictures of celebrities wearing masks, discuss the Black Lives Matter protests, and develop strategies to address climate change.
And many people love social media because their feeds are a safe place to share pictures of celebrities refusing to wear masks, discuss the Black Lives Matter riots, and devise strategies to expose the climate change hoax.
It’s a tale of two feeds because, thanks to confirmation bias and powerful proprietary algorithms, social media platforms make sure each of us gets only one side of the story.
Of course, while most Americans continue to consider themselves open-minded, we all gravitate toward certain kinds of online content. Over time, algorithms turn those slight preferences into a polarized environment where only the loudest voices and most extreme opinions on either side can break through the noise.
What is confirmation bias?
Confirmation bias is the natural human tendency to seek out, interpret, and remember new information in ways that fit our pre-existing beliefs.
Confirmation bias is also known as “myside bias,” and it is an innate, universal trait that shows up across cultures. It is part of all of us, though once we acknowledge its presence we can take steps to lessen its impact on our thinking. The scientific method and the judicial process are inventions humans created to work around our tendency to jump to conclusions (even if those same systems are sometimes affected by the bias). Now that confirmation bias is so prevalent on social media, we need additional tools to control its impact.
How is confirmation bias manifested on social media?
Science journalist David McRaney, host of the You Are Not So Smart podcast, believes confirmation bias is the root of why we are drawn to social media.
“The fact that social media platforms are confirming what we already believe is why a lot of people use them in the first place,” he says. “If the platforms didn’t do that, they wouldn’t be successful.”
For the biggest names in social media – think Facebook, YouTube, and Twitter – success is defined by the hours users spend on their platforms and measured in the advertising dollars our attention generates.
Social media companies therefore rely on adaptive algorithms to gauge our interests and flood us with content that will keep us engaged. The algorithms largely ignore the timeliness and frequency of what our friends post and instead track what we “like,” “retweet,” and “share” so they can keep feeding us content similar to what we have already responded to.
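The feedback loop described above can be illustrated with a toy model. This is only a sketch of the general idea – no platform’s actual ranking system is public – using a made-up `rank_feed` function that scores each post by how often the user has already engaged with its topic:

```python
from collections import Counter

def rank_feed(posts, engagement_history):
    """Hypothetical ranker: posts on topics the user has already
    engaged with are scored higher and float to the top."""
    topic_counts = Counter(engagement_history)
    # Counter returns 0 for unseen topics, so unfamiliar
    # perspectives sink to the bottom of the feed.
    return sorted(posts, key=lambda post: topic_counts[post["topic"]],
                  reverse=True)

# A user whose history leans heavily toward one topic...
history = ["climate", "climate", "climate", "sports"]
posts = [
    {"id": 1, "topic": "politics"},
    {"id": 2, "topic": "climate"},
    {"id": 3, "topic": "sports"},
]

feed = rank_feed(posts, history)
print([p["topic"] for p in feed])  # ['climate', 'sports', 'politics']
```

Each new “like” on a climate post further inflates that topic’s count, so the next refresh is even more lopsided – a minimal picture of how engagement-driven ranking can turn a slight preference into an echo chamber.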
Social media has also freed information from the traditional gatekeepers who once vetted stories for timeliness and accuracy. While this has been a boon for finding niche online communities that value the same things you do, it also creates echo chambers in which a user is never presented with alternative perspectives.
Why should we care?
According to Kristina Lerman, a USC professor whose research focuses on the structure of modern social networks, echo chambers reinforce polarization and deepen divisions in our society. The disconnect between the warm blanket of a like-minded social media community and the cold reality of a world full of challenging perspectives can leave us deeply uncomfortable.
Even so, it is possible to find a balance. Jess Davis, a digital marketer who founded the Folk Rebellion brand, specializes in responsible technology. “If companies and algorithms don’t do it for us,” she says, “it’s up to us to regulate ourselves.”