Apr 19 2018

The Real Problem with Echochambers

It has rapidly become conventional wisdom that the widespread use of social media has resulted in an increase in the “echochamber effect.” This results from people consuming only media that is already in line with their existing beliefs and ideology. This is nothing new; psychologists have long documented that people are much more likely to access information that reinforces their existing beliefs and biases, and much less likely to engage with information that directly challenges their beliefs.

One of the hopes for the internet was that it would help people break out of their self-imposed echochambers of thought by making a greater diversity of information, opinions, and perspectives a mouse click away. That dream was thwarted, however, by the real world. Social media giants, like YouTube and Facebook, trying to maximize their own traffic, developed algorithms that placed in front of people the information they were most likely to click, meaning the kind of information they were already consuming. Watch one video about dog shows, and you will find a helpful list of popular dog show videos on the right of your browser. The next thing you know, your mild interest in dog shows has become a fanatical obsession. OK, maybe not, but that is the concern: self-reinforcing algorithms will tend to have a radicalizing effect.
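To make that self-reinforcing loop concrete, here is a minimal sketch in Python. It is purely illustrative, with invented topic names and a toy scoring rule; it is not how any actual platform’s recommender works, only the general feedback dynamic described above.

# Toy recommender: rank topics purely by how often the user has already clicked them.
# Everything here (topic names, scoring rule) is invented for illustration.
from collections import Counter
import random

TOPICS = ["dog shows", "cooking", "politics", "astronomy"]

def recommend(history, n=3):
    """Return the n topics the user has clicked most often (ties broken randomly)."""
    clicks = Counter(history)
    return sorted(TOPICS, key=lambda t: (-clicks[t], random.random()))[:n]

# A user starts with one dog-show click and then always clicks the top recommendation;
# the simulated feed locks onto dog shows almost immediately.
history = ["dog shows"]
for _ in range(5):
    history.append(recommend(history)[0])
print(history)

The point of the toy example is that nothing in the loop ever surfaces a topic the user has not already clicked, so even a single initial preference gets amplified into what looks like an obsession.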

There are also virtual networks on the web that have clearly developed to function as echochambers. There are blogs and channels dedicated to one specific world view, or to one opinion on a specific topic. These sites are curated to be friendly to those with the same view, who are welcomed as compatriots. If you disagree with the approved view of the site, you are a troll. Your comments are likely to be blocked and you may even be banned. Of course, people have the right to protect their sites from truly disruptive and counterproductive behavior, but what makes a troll is in the eye of the beholder.

There are also metasites that curate multiple other sites, as well as news items, that cater to one world view, whether it be a political faction, specific activism, or ideology.

Supporting this echochamber narrative is the fact that people are becoming more polarized, tribal, and emotional over time. People hold more negative views of their political opponents, and are less likely to think that those opponents, despite the disagreement, have a valid perspective.

The hope of the internet seems to have backfired. Rather than bringing people together, the internet has facilitated people separating themselves into multi-layered factions. The web is tribalism on steroids.

Not So Fast

While this narrative makes sense, is supported by evidence, and seems to conform to our everyday experience, a new study suggests that it might also be a bit simplistic. Seth Flaxman and colleagues at Oxford University examined the browsing histories of 50,000 US users. They found that people who searched for specific information, rather than just browsing favorite websites, were not only more likely to find extreme websites but also more likely to search for and visit websites with an opposing viewpoint. Overall, they were consuming a higher diversity of opinions online. They write:

We find that social networks and search engines are associated with an increase in the mean ideological distance between individuals. However, somewhat counterintuitively, these same channels also are associated with an increase in an individual’s exposure to material from his or her less preferred side of the political spectrum. Finally, the vast majority of online news consumption is accounted for by individuals simply visiting the home pages of their favorite, typically mainstream, news outlets, tempering the consequences—both positive and negative—of recent technological changes. We thus uncover evidence for both sides of the debate, while also finding that the magnitude of the effects is relatively modest.

So overall, people are consuming information the way they always did, browsing mainstream media outlets, like flipping through the pages of a newspaper. Those who search directly go for more radical sites, but they also go for information from the “other side.” But these are relatively few people.

Also, a 2016 Pew survey of 2,000 users found that only 8% restricted their browsing to a “low diversity” of websites. Essentially, the survey found that echochambers exist largely on the fringes of the political spectrum. Most people in the middle share news with a variety of other people representing a variety of viewpoints.

So maybe the whole echochamber thing is overblown, and the internet did achieve its promise of more access to more diverse information. A remaining question, however, is whether the size of the group on the fringe has grown. Is access to information sucking more people down the rabbit holes of extreme conspiracy theories, pseudoscience, political radicalization, etc.? Anecdotally it seems so, but I would like to see some data on this. Probably everyone can think of someone for whom this is true, and the availability heuristic would therefore make it seem as if it is a common phenomenon. But it may be a relatively small phenomenon, as this survey suggests.

Still, expanding the radical fringe even to 8% can have a significant effect on society.

Cognitive Echochambers

Even if the echochamber effect, as commonly understood, is a smaller phenomenon than many have assumed, there is something else at work here that is perhaps more insidious. Even if most people are consuming mainstream media and, when searching, are getting a diversity of information, we are looking at all of that information through our cognitive biases and filters.

Another study, just published in March 2018, this time of Twitter users, found that increased exposure to opposing viewpoints actually increased confidence in radical opinions. There are several cognitive effects likely at work here. The first is motivated reasoning, something I have written about frequently. We are very good at making sense of the facts in such a way that they confirm what we want to believe in the first place. We are endlessly creative in finding ways to minimize inconvenient information, finding and amplifying facts that seem to support our view, and interpreting data so as to be in line with our beliefs. The more motivated we are (the more emotionally held the belief, or the more central it is to our identity), the greater this effect. We can be quite reasonable about facts that have little or no emotional significance to us, but when we care about the outcome, we can be overwhelmed with motivated reasoning.

To take a currently obvious example – people from different sides of the political spectrum look at the exact same data about President Trump and come to radically different interpretations. At one end there are those who think that Trump is a con-artist and semi-literate moron. On the other end (again – looking at the exact same data) there are those who think he is a courageous genius.

In addition to motivated reasoning, when we do take the time to expose ourselves to opposing viewpoints, that can have a radicalizing effect in a few ways. It gives us the illusion of knowledge. We now feel more entitled to hold strong opinions, even those that may seem radical. We have earned the right to those extreme opinions precisely because we took the time to understand all sides. Of course, if we did so through a haze of motivated reasoning, the outcome was predetermined. We looked at the information to find ways to support our existing beliefs, not to determine what we should believe.

Exposure to opposing viewpoints also tends to give people more negative views of those who hold those viewpoints. Rather than moderating our opinions through understanding, we simply have more ammunition to claim that those “X” on the other side really are crazy assholes.

So simple exposure to more information, and more diverse information, isn’t enough. The process we use to evaluate information is still the most important variable. Are we trying to divorce ourselves from our emotional investments, looking at information in a purposefully unbiased way, examining our own logic and judgement for fairness and consistency, and trying to follow facts and logic as best we can? Or are we just going through the motions, looking for information to support our predetermined conclusion, and using the experience to justify those conclusions?

Perhaps the best way to summarize the effect that the internet and social media have had is this: they have not caused echochambers, radicalization, polarization, belief in pseudoscience, or any perceived information ill. Rather, social media is just jet fuel, amplifying all the things, good and bad, that people were already doing. It is now simply faster and easier to engage in information-related activity. This also means that cultural evolution has sped up. It is a lot easier to find justifications for our beliefs, and the internet is a meme-sharing device that provides us prepackaged rationalizations. Don’t like that fact? It’s “fake news.” Don’t like that opinion? Here are twelve others to try on for size.

This also means that critical thinking is more important than ever (just like nuclear weapons make diplomacy more important). I do think that critical thinking is one of the many things that is increasing because of the internet, but I would like to see it increase much more.
