Sep 02 2022

Algorithms Still Reinforce Echo Chambers

Why do societies collapse? This is an interesting question, and as you might imagine the answer is complex. There are multiple internal and external reasons, but a core feature seems to be that a combination of factors was simultaneously at work – a crisis that the society failed to deal with adequately because of dysfunctional institutions and political infrastructure. All societies face challenges, but successful ones solve them, or at least make significant adjustments. There are also multiple ways to define “collapse”, which does not have to involve complete extinction. We can also add political or institutional collapse, where, for example, a thriving democracy collapses into a dictatorship.

There are many people concerned that America is facing a real threat that could collapse our democracy. The question is – do we have the institutional vigor to make the appropriate adjustments to survive these challenges? Sometimes, by the time you recognize a serious threat, it’s too late. At other times, the true causes of the threat are not recognized (at least not by a majority), and therefore the solutions are also missed. So the question is, to the extent that American democracy is under threat, what are the true underlying causes?

This is obviously a complex question that I am not going to be able to adequately address in one blog post. I would like to suggest, however, that social media algorithms are at least one factor contributing to the destabilization of democracy. It would be ironic if one of the greatest democracies in world history were brought down in part by YouTube algorithms. But this is not implausible.

My core premise, one that I think is not controversial, is that democracy requires an informed electorate in order to be healthy. Since all citizens are collectively making decisions, either directly or through our elected representatives, a critical mass of those citizens needs to be armed with accurate information relevant to the decisions they are collectively making in order for democracy to function. (Please, no one feel compelled to point out that “we are not a democracy but a republic.” We are both, a democratic republic, and many decisions are made by direct referendum.)

There are many ways to have a pseudodemocracy – to look like a democracy but only superficially. People could vote, but only be given a choice of a single candidate or party. Or, the mechanisms of voting could be rigged and the outcomes preordained. But there is a more subtle way to have a pseudodemocracy – if those in charge have sufficient control over the dissemination of information, then voters are robbed of their ability to make an informed political choice. In this way, disinformation alone can kill democracy. This is why a free and independent press is critical, and why there needs to be political competition.

That’s the question we are facing today – do social media algorithms, combined with extremely biased mainstream news outlets, create isolated information ecosystems that are able to trap significant portions of the population in an alternate reality? It certainly seems that way. But of course, everyone thinks they are getting the real news and that everyone else is getting propaganda. That is a huge part of the problem – you cannot see the echo chamber from the inside. It is like confirmation bias (because it is confirmation bias): we are confronted with an overwhelming amount of information that confirms one perspective. This creates a powerful perception that this perspective must be correct.

This can happen purely for organic reasons, without anyone consciously making it happen. That’s confirmation bias – we tend to filter the vast amount of information in the world to reinforce what we already believe, or what we want to believe, creating the illusion that all the evidence supports our view. Unless we specifically seek out contradictory information, or have a working knowledge of how confirmation bias works, this is the default mode of human information processing. But the situation is far worse than just confirmation bias, because there are external forces that are consciously curating information to also reinforce a particular political point of view. If you watch mostly Fox News or MSNBC, then you are viewing reality through a biased lens.

This has always been the case, with confirmation bias and propaganda siloing people into different political identities. Now, however, there is a third force at work – social media algorithms. It’s old news that these algorithms act as external curators of information, suggesting content to users that reinforces their existing biases. These are more similar to confirmation bias than to propaganda – they are designed mainly to maximize clicks. Maximizing clicks is a function of engaging the user, and the best way to engage the user is to feed them increasingly extreme and emotional content. This creates a feedback loop that acts like a rabbit hole, sucking people into a vortex of increasingly fringe ideas. It can turn someone with even a slight curiosity about, say, flat earthers into a true believer. Algorithms weaponize confirmation bias. They also lead to curated echo chambers that further reinforce those fringe beliefs.
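
To make that feedback loop concrete, here is a minimal toy simulation – my own illustrative sketch, not any platform’s actual code. It assumes content can be scored on an “extremity” scale and that predicted engagement peaks for items slightly more extreme than whatever the user last watched; every name and number in it is a hypothetical assumption made for the sake of the example.

```python
# Toy simulation of an engagement-maximizing recommender (illustrative only;
# the scoring rule and all parameters are hypothetical, not any real platform's).
import random

random.seed(0)

# A catalog of videos, each tagged with an "extremity" score from 0 (mainstream)
# to 1 (fringe), scattered uniformly for the sake of the example.
catalog = [{"id": i, "extremity": random.random()} for i in range(1000)]

def predicted_engagement(user_position, item):
    """Assume the user engages most with content slightly MORE extreme than
    what they last consumed (a simple informal model of the 'rabbit hole')."""
    target = min(1.0, user_position + 0.05)
    return 1.0 - abs(item["extremity"] - target)

def recommend(user_position):
    """Greedy recommender: pick the item with the highest predicted engagement."""
    return max(catalog, key=lambda item: predicted_engagement(user_position, item))

# Start from a mildly curious user and let the loop run.
position = 0.1  # initial "extremity" of the user's interests
for step in range(20):
    video = recommend(position)
    position = video["extremity"]  # watching the video shifts the user's baseline
    if step % 5 == 0:
        print(f"step {step:2d}: watching item {video['id']} with extremity {position:.2f}")
```

Under these assumptions the user’s position ratchets upward with every recommendation, because the click-maximizing choice is always a bit more extreme than the last one, and nothing in the loop ever pulls them back toward the mainstream.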

While the big social media companies have promised to improve their algorithms to address this issue, there is evidence that the algorithms still feed users information that reinforces their existing beliefs, including conspiracy theories. In a recent study, for example, YouTube recommended three times as many videos casting doubt on the 2020 election to people who were already most skeptical of the results as it did to those who were least skeptical.

All three factors, therefore, work together – algorithms, confirmation bias, and deliberate propaganda. The effect can be overwhelmingly powerful, and we need to understand it in order to have any hope of relating to our fellow citizens. It can be difficult. I know, for example, people who I think are otherwise savvy and intelligent but who believe utter, demonstrable nonsense. It’s hard to wrap your head around it, and it’s tempting to just write them off as hopelessly deluded. But that doesn’t get us anywhere.

Rather, what we are seeing is the result of literally centuries of developing the art of hacking the brain. Strategies have evolved over the course of human history to exploit neurological and psychological vulnerabilities in order to control what other people believe. Social media has just accelerated the process, but it did not create it. The question is – has this phenomenon reached a critical threshold where it is an actual existential threat to democracy? Many people think the answer is yes. I am not as pessimistic, but the next decade will be very telling. In fact, the next election will be very telling.