9 Cognitive Biases
Goals
- Defining epistemic beliefs and selective exposure, and identifying common examples of cognitive biases.
- Understanding how our media use and our cognitive biases work together to influence our citizenship behaviors.
Epistemic Beliefs
Epistemic belief is a very strong individual difference that impacts how people engage with media. I'd like to share with you an article by two researchers, one currently a professor in The School of Communication and one a former PhD graduate. The article is called "Epistemic beliefs' role in promoting misperceptions and conspiracist ideation." It was published in PLOS ONE, an open-access (available to anyone for free), peer-reviewed scientific journal.
Misperceptions have some very strong consequences for democratic decision making. In this book, we're talking about being active, informed, and responsible citizens. Obviously, being informed is a big part of making good decisions for democracy. Unfortunately, people are not always terribly well informed. Over the last 15 years, we've seen an increase in disinformation in the media. Disinformation is not new; it has been around as long as the news has. But we've seen an increase, particularly online, and this has led to some real misperceptions in the general public. For example, many Americans still believe to this day that, prior to the Iraq War, Iraq had weapons of mass destruction, which it did not. Another example: many Americans still believe that President Barack Obama was not born in the US, yet we have plenty of evidence that he was. Yet another: many Americans believe that climate change is not real, despite the fact that scientists overwhelmingly agree that it is real and that it is human-caused.
So we need to be able to make good quality decisions, and those decisions have to be based on a shared reality. One of the problems we're seeing a lot these days is that people are not even viewing the world in the same way. We see this with the Covid pandemic: we have to have a global effort to fight a global problem, but we can't do that when some people think the virus is real and some people think it's not. In that way, the role the media plays in keeping people informed is really vital to a strong, functioning democracy. We need to assess the available evidence in an empirical, scientific way.
To address misperceptions, the first thing we need to do is examine why they exist. This can be a difficult thing to do, both scientifically and emotionally, because it requires us to take a good, honest look at people who are misinformed and really, truly ask ourselves why they are so ill informed (as opposed to simply judging them). But some psychological processes have already been identified, as well as some different thinking styles, that can help us understand why people hold these misunderstandings.
Some of those psychological processes show us that people tend to believe information that confirms how they already feel. That's confirmation bias. When people are presented with information that is consistent with their political ideology, they're more likely to believe it, whether it is true or not. Another issue is that people don't want to risk being ostracized by their in-group if they believe differently. For example, there was an article in The New York Times about a small town in Florida where, at the time, most people did not believe that the coronavirus was a serious risk. There was a very strong community sentiment that they were not going to accept that the pandemic was real. Unfortunately, one woman's husband was hospitalized there with the virus. She didn't even want to tell people about it because she was so worried that her neighbors would reject her if they found out that she believed the virus was real, and that she believed it because her husband was in the hospital. Those kinds of pressures can be very strong.
Another issue is that we as humans like to have simple solutions for problems that are actually quite complex. One of the shortcuts we use to get those simple solutions is called the false cause fallacy. It is exactly what it sounds like: Thing A happens before Thing B, so we assume that Thing A caused Thing B. I saw a fun example of this from a comedian on a late night show the other day. He was talking about Groundhog Day, when they pull the groundhog out and, if it sees its shadow, there are supposed to be six more weeks of winter. His point was that much of America went into lockdown in 2020 after Groundhog Day, so maybe the groundhog caused the virus, which is obviously a joke.
Another issue is that repeated exposure can make us start to believe something, even if we didn't believe it initially. One of the reasons for that is that we are not always good at source monitoring, which is remembering where a piece of information came from. Let me give you an example and see if this has ever happened to you; it has certainly happened to me. Have you ever had a conversation with someone, and later, when you're thinking about that conversation, you remember everything about it? You remember what you said. You remember where you were. You remember the details of the conversation and how you feel about the person you were talking with. You remember every single thing about the conversation, except that you can't remember who you had the conversation with. That is a failure in source monitoring. It happens in interpersonal conversations, but it happens with the media a lot as well. If you are exposed to some kind of false information in the media, right after the exposure you generally know that it isn't real. But if you keep hearing that false information over and over again, you may eventually start to believe it, even though you initially did not, because you have forgotten where the information came from.
Thinking styles can be a problem as well. These are individual differences: we might be willing to accept claims that are really baseless, or we might have a tendency to think that everything that happened was intentional, because that helps us make sense of the world around us. Or we might be someone who simply has a stronger distrust of people in positions of authority.
So that brings us to epistemic beliefs, which, as we've defined before, are how we think about thinking, or our beliefs about how we know what's true and what's not. This can have an impact on your understanding of the world around you, the way you learn, and what you believe about science, about politics, and about the relationship between them.
There are different ways that people come to believe what they believe, or different ideas about how we can determine what's true and what's not. The first is to just go by your feelings: for example, "I feel like that's true, so it probably is." The second is to go by evidence, where you look for facts or examples to support a statement. The final one is intent, where some people believe that if you intend something to be true, then it kind of is true.
Breaking those down, faith in intuition can be thought of in terms of two routes. There's the rapid, automatic, intuitive route, where you're not thinking very deeply; you just come to a conclusion without much thought. Then there's the slower, more thoughtful, systematic route, where you put a lot of thought into a decision before you make it. Some people are very good at tapping into that gut feeling, and some people are not. What's interesting is that people who are not good at tapping into their gut feeling actually tend to make bad decisions, even if they're really good at the more systematic route. This is a little unexpected, because generally you'd think people will always come to better conclusions if they think logically. But what we find is that you actually need a combination: the people who make the best decisions are very good at the slower, thoughtful, systematic route, putting a lot of time and effort into their decision making, but they are also very in tune with their gut instincts. Of course, people who rely too much on intuition or gut instinct tend to make very bad decisions.
I was pleased that Dr. Garrett included the phrase "truthiness" in his paper. It comes from the comedian Stephen Colbert, back when he had The Colbert Report, which is no longer on TV. Truthiness means that if something feels true, then it probably is true. That way of thinking can lead people to make misattributions and other errors in judgment.
Another way to come to believe what is true is to rely on the presence or absence of evidence. The problem is that not everybody really values evidence. In the social sciences, we're very evidence based, but not everybody is as interested in evidence as they are in their feelings. For example, there's an interesting result with belief in climate change: for people who self-identify as liberal, how much they know about climate change has an effect on how they feel about it. The more they know about it, the more likely they are to believe that it is real. But we don't see the same effect, on average, for people who self-identify as conservative. That doesn't mean every single person who is conservative fails to understand climate change. It means that, on average, how much they know about climate science does not predict whether or not they believe it is real. Again, on average, they don't value evidence as much.
The final epistemic belief is the idea that facts are subjective. If someone holds this belief, simply providing them with facts is not enough to inform them, because they don't think that facts are universal. There's a belief that science is one way to get facts, and that other methods, like their own personal feelings, are equally valid. This epistemic belief is not congruent with the social sciences. The core of the social sciences is that we want everything to be evidence based. That doesn't mean feelings don't matter; it just means that feelings don't trump evidence. Unfortunately, the media does not always help us make this distinction, and that's where the relationship between media and citizenship can get a bit complicated.
The best thing for us to do is to be aware of these different ways that people come to believe what they believe, and to acknowledge that not everybody has the same epistemic beliefs that we do. That can help us really look at the why. For example: why would someone vote against their own interest? Why would someone vote against the interest of their country, their state, or whatever community they are a citizen of? If you consider that their epistemic beliefs may be different from yours, it can clear up miscommunications that you might otherwise have.
Selective Exposure and Information Bubbles
Filter bubbles refer to the notion that people generally engage with media that aligns with how they already feel and are less likely to engage with media that presents them with different viewpoints on current topics. However, when we discuss filter bubbles, we should acknowledge that there is some concern about the terminology, because it suggests that people simply never encounter ideas they disagree with online (a bubble is all-encompassing). According to Dr. Kelly Garrett, that's not really the case. While algorithms do filter out a good deal of content that they predict will not interest you, we don't find that people are never exposed to ideas they disagree with online. Even on Facebook, Dr. Garrett and colleagues found that between 25 and 35% of the content in a user's social media feed came from a source that an algorithm would predict they would not agree with or would not be interested in. Additionally, between 20 and 30% of the content on social media that people chose to look at was cross-cutting, meaning that it was not in line with what they generally are interested in or believe. This isn't to say that exposure is balanced; clearly it's not, since these numbers are not 50/50. But as Dr. Garrett would say, there is no bubble here.
Echo chambers are not that different from filter bubbles. The idea of the filter bubble is that you have this world that you live in, and you're not exposed to other environments or other sources of information. An echo chamber is more like hearing your own thoughts repeated by other people. For example, Democrats don't like hyper-partisan sites that are Republican leaning, and vice versa. That preference is different from actively avoiding the other side, which is what people often assume when they hear the term echo chamber. So on the one hand, the algorithms behind your news content may be feeding you information they think you're going to agree with, but you may or may not be actively avoiding the rest. This is probably why audiences for less partisan news sites are bigger than audiences for hyper-partisan news sites: people in general don't always intend to avoid the other side.
I would like to share with you a short summary of a book that was a New York Times bestseller, The Filter Bubble by Eli Pariser (cite).
A little bit of history first: the author was around when the internet was created, and he, like everybody else, thought the internet was going to solve a lot of the world's problems. It was going to increase democratic participation because people would have access to more information. But one of the problems we run into is that people are actually inundated by too much information, which can be difficult to sort through. Additionally, in 2009, Google launched personalized search for everybody. Google uses signals to gather information about its users, and this happens whether or not you are logged into Google and whether or not you have a Google account. Signals include things like where you are logging in from, what browser you are using, and your previous searches. Google uses that information to make predictions about you and what kind of information you would be interested in, so there isn't really any standard Google anymore. For example, if you and I both Google the same search term, we're not going to get exactly the same results, the way we would have prior to 2009. The implications of this are something we're still digging through.
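To make the idea of signal-based personalization a bit more concrete, here is a minimal, purely hypothetical sketch in Python. It is not Google's (or any company's) actual algorithm; the signals, weights, and scoring function are all invented for illustration.

```python
# Hypothetical illustration of signal-based personalization.
# NOT any real company's system: the signals and weights are invented.
from dataclasses import dataclass, field


@dataclass
class UserSignals:
    location: str                                          # where the user appears to log in from
    past_search_topics: set = field(default_factory=set)   # topics the user has searched before
    clicked_topics: set = field(default_factory=set)       # topics the user has clicked on


def personalization_score(article_topics: set, article_region: str, user: UserSignals) -> float:
    """Score one article for one user; higher scores would be shown first."""
    score = 0.0
    score += 2.0 * len(article_topics & user.past_search_topics)  # overlap with past searches
    score += 3.0 * len(article_topics & user.clicked_topics)      # overlap with clicked content
    if article_region == user.location:                           # boost for local content
        score += 1.0
    return score


# Two users who enter the same query can see different rankings,
# because the score depends on their accumulated signals, not just the query.
alice = UserSignals("Ohio", past_search_topics={"climate"}, clicked_topics={"climate", "energy"})
bob = UserSignals("Texas", past_search_topics={"football"}, clicked_topics={"football"})
article_topics = {"climate", "policy"}

print(personalization_score(article_topics, "Ohio", alice))  # 6.0: strong match with Alice's signals
print(personalization_score(article_topics, "Ohio", bob))    # 0.0: no match with Bob's signals
```

The only point of the sketch is that what a person is shown depends on their accumulated signals, not just on what they typed, which is why two people can search the same term and get different results.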
So again, the author expected that the internet was going to make democracy better. The reason he and many other people thought that is that leaders would be able to communicate directly, without filters, with the people they represent. That was going to be good for democracy. Bloggers and citizen journalists would be able to build up a public media. If politicians can communicate directly with their constituents, they don't need as much money for campaign funding. This was expected to create more transparency and accountability; it was going to decrease corruption in political campaigns and increase civic engagement, which would be good for democracy. That's what everybody expected would happen. Unfortunately, that is not what we've seen. Part of the problem is that we now have new gatekeepers to the information we're exposed to, and we may not even be aware of them. Traditionally, trained newspaper editors decided what to publish, but now untrained people can get a lot of attention on the internet.
Also, according to The Wall Street Journal (cite), the top 50 internet sites each install, on average, 64 cookies and other tracking beacons. For example, let's say you search the word "depression" on Dictionary.com; all of a sudden, you are going to start seeing advertisements for depression medication in your social media feed, or even in your news feed. You've probably noticed something similar happening before.
Another example is Amazon's business model. Amazon makes billions of dollars by guessing what you're going to buy and making those suggestions to you. Netflix does the same thing for about 60% of its views. That's not always a bad thing; if you're someone who doesn't know a lot about TV shows, it can be convenient to have those personalized recommendations. But there are a lot of downsides as well, especially considering that 36% of Americans under 30 years old get their news almost exclusively through social networking sites, which means they may not be exposed to information that is useful for democratic purposes.
The problem with this is that in order for us to solve the world's problems, we have to be on the same page. We have to have the same information and understand our shared experiences. When those are filtered out, it's like we're seeing the world through tinted sunglasses. Even the NSA is unable to keep up with all the data that is out there, so obviously each individual person can't either.
Making everything personalized also means that the internet loses much of what was good about it, especially the idea that everything had an equal chance of being viewed. That is not the case anymore. For example, Amazon's algorithms can be bought to promote products in what looks like an objective recommendation but really isn't, because someone paid for it. Google, as we said, monitors every signal it can: where you are, which emails you open or delete, and so on. It is not just looking at what you click on but also at how much time you spend looking at an advertisement or an email. The goal is to eventually be able to answer hypothetical questions. If, for example, you ask Google what college you should go to, at this point in time it won't really be able to personalize that answer for you. But the goal is to get to know you well enough that it will be able to, which again has some benefits and a lot of downsides.
Facebook has also been well known for its issues with harvesting your data. What's interesting about Facebook is that sometimes it isn't so much harvesting your data as you are simply giving it away with what you post. There are also connections between Facebook and many of the other apps you use that make it a bit more invasive than people realize. One of the issues is that because Facebook is so big and so popular, a lot of people find themselves locked into it. They know all the downsides and they want to quit, but they don't have any other way to stay in contact with the people they want to stay in contact with. For a certain age group, anyway, Facebook is the main way that people stay in touch.
This can cost us individually. One problem is that when you have access to all the information in the world, you're not necessarily going to be attracted to information that is useful for democratic decision making. With our "lizard brains," we are often attracted to junk information. This may be hardwired into us through evolution, because we are trained to notice circumstances that are unusual, and that may be why we're attracted to celebrity gossip and the like. Personalization can also decrease our creativity and increase advertising aimed at our vulnerabilities. For example, if you search the term "depression" or "weight loss," all of a sudden you're going to start seeing advertisements from people who know exactly what your insecurities and vulnerabilities are. They're going to exploit that, and it can lead to invasive pricing as well: if they know you want to buy something, they may jack up the price.
This can also lead to "informational determinism," where the information that is available to you affects your ability to function in the world. The author also describes filter bubbles as increased bonding and decreased bridging: you're able to connect more with people who feel the same way you do about a particular topic, but you're not bridging out and meeting people who are different from you.
Finally, issues that are quite complex are less likely to get attention than issues that seem simple and straightforward. This is a problem because a lot of the issues we are dealing with as a society and as a democracy, things like the rising prison population and homelessness, are not simple, straightforward issues. They are complex, but people are not drawn to learning about complex information.
You've probably heard the phrase "the user is the content" before: if you're not paying for a service, you are the product.
Targeting also changes the economics of content. We used to live in an era where you had to develop premium content to attract premium audiences. That is coming to an end, because advertisers can now reach niche audiences, so producers can create niche content, and it's not necessarily going to be as high quality. This can cost us as a society: when we face large-scale issues, we're not able to cooperate as well. The filter bubble distorts our perceptions of what's important and what is true. The new gatekeepers are not necessarily people who are educated in media, and that's when we end up drawn, as I said, to media about unique or unusual circumstances, but not necessarily media that is going to help us be active, informed, and responsible citizens.
Psychological Reactance and Other Cognitive Biases
Finishing up our unit on individual differences, let’s talk about a few of the more common cognitive biases.
The first one is the fundamental attribution error. I think this is one of the most important cognitive biases to talk about, because it seems to be driving a lot of the problems in the world that revolve around miscommunication. The idea is that we as humans are generally not very good at putting ourselves in other people's shoes, seeing the world from their perspective, or just acknowledging that their lived experiences are different from ours. Certain people are, of course, better at it than others. But this can be a real problem when someone sees their own life as being driven entirely by their own good decisions. For example, if something good happens to me, it's because I made good choices; if something bad happens to me, it was just bad luck. The opposite occurs in evaluations of other people: if something bad happens to someone else, it's because they made a bad choice, and if something good happens to someone else, they just got lucky. We see this quite often in the real world, particularly where people don't acknowledge that others have had different experiences, opportunities, and challenges in their lives.
The next one is the Dunning-Kruger effect. The more you know about a topic, the more you realize you don't know. People who are highly knowledgeable often don't consider themselves to be highly knowledgeable, while people who are very uninformed tend to think they're more informed than other people. So you see the people who know the most thinking they know the least, and the people who know the least thinking they know the most. I also see it happening a lot with my students, in a way that's really sad, because some of my brightest students are quite hard on themselves. If you've ever scrolled through the comments section on a scientific YouTube video (not recommended), you'll also see many people thinking they are experts in areas where they very much are not.
Confirmation bias is where we seek out, or are more likely to remember, information that confirms how we already feel. For example, I really like black cats, but a lot of people think black cats are bad luck. If you think black cats are bad luck, then every time you see a black cat and something negative happens, you're going to think to yourself, "Oh yeah, there it happened again." You pay attention when that happens but not when it doesn't. We see these kinds of things with important topics related to citizenship behaviors as well.
Another related concept is the functions of attitudes. This is the idea that people hold attitudes for particular reasons, so if you can figure out why someone has the attitude they do, you can start to understand a little better how it will affect their citizenship behavior. The function of attitudes that I see making the biggest difference in citizenship behaviors is the ego-defensive function. The idea is that someone holds a particular attitude toward something, but that attitude changes to protect their self-esteem. For example, if I see a car that I really like, I have a positive attitude toward that car, and I think, "Oh, maybe I'm gonna buy that car." But when I find out how much the car costs and realize I can't afford it, I'm going to immediately start dumping on the car. My attitude toward the car becomes more negative just to make myself feel better about not being able to afford it. We see this happening a lot in the real world; there's an interesting phenomenon where many people, when they should perhaps feel sad or disappointed at rejection, instead respond with anger.
Declinism is a cognitive bias where we tend to look at the past inaccurately. Very often we look at the past as if it was nicer than it actually was. In a lot of ways that's not a problem, since it won't necessarily have catastrophic effects; if you look back at the past and feel good about it, that can be perfectly fine. But there are cases where this cognitive bias can be very dangerous: cases where we end up repeating mistakes from the past.
Cognitive dissonance is when you have two different thoughts that are incongruent with each other. You can imagine it as the angel and the devil on your shoulders. For example, suppose someone is very supportive of a politician, let's call them Candidate X. They're really in favor of Candidate X and believe they're on the same page on most issues. But then Candidate X says or does something that is bad or wrong or illegal. Now the supporter experiences cognitive dissonance. On the one hand, they're thinking, "I should stop supporting Candidate X, because Candidate X did this illegal stuff." On the other hand, they might be thinking, "I've really supported Candidate X in the past, and I want to save face and look good in front of people, so I don't want to admit that Candidate X is in fact not a good politician." The way they resolve that dissonance has real impacts on citizenship behaviors.
A very interesting cognitive bias is called the backfire effect. This is the idea that presenting people with facts doesn't always change their minds; they may actually double down on their incorrect beliefs when they are presented with facts to the contrary. One study examining voting preferences showed that introducing people to negative information about a candidate they favored often caused them to increase their support for that candidate, doubling down (cite). Another study, which examined people's intention to get a flu vaccine, found that giving people who think the vaccine is unsafe information disproving those myths often ended up reducing their intent to vaccinate, again doubling down on their incorrect beliefs. This is something we should be aware of, and it is related to the next bias: psychological reactance.
Psychological reactance is the idea that people don't like to be told what to do. If someone feels that they are being manipulated, they are likely to reject your message. For example, imagine that your uncle has been smoking a pipe for a long time, and you're starting to really worry about his health. You say to him, "Uncle Joey, I really think you should stop smoking that pipe because it's not good for your health." But Uncle Joey doesn't like to be told what to do, so he starts smoking more than he did before. People actually do this a lot. Counterintuitively, people can sometimes even act against their own best interest just so that they don't feel like they're being manipulated.
Everyone is susceptible to these kinds of cognitive biases. The best thing you can do is to be self-reflective and catch these biases in your own thinking. Perhaps the more often you catch yourself, the less often you'll fall into them in the future.