You are not in control of your thoughts. You may think you are, but that’s just an illusion.
I know that may sound like hippy stuff, but it’s actually one of the central premises of psychologist Daniel Kahneman’s 2011 book Thinking, Fast and Slow.
The book covers everything from economics to politics to human relationships, but one of its primary focuses is how people think and make decisions–and why those decisions are often wrong, even if people don’t realize it.
Since most of college (and life for that matter) involves thinking and making decisions, Kahneman’s material is quite relevant to college students. In particular, his documentation of common cognitive biases and fallacies (the source of many of our mental mistakes) is quite useful, and it’s what I’ll focus on in today’s post.
While I would encourage you to read the book in its entirety, it’s fairly long and a bit dense at times. Consider today’s post a distillation of the best material. If it piques your interest, you can read the full book for more details.
Warning: Your mind may be blown.
Cognitive biases (also called cognitive illusions) are errors that arise from our brain’s tendency to make intuitive judgments and jump to conclusions.
In general, this tendency is helpful, as it allows us to live our lives without having to agonize over every little decision in our day. Just as we generally aren’t aware of our breathing, much of our mental activity occurs without our noticing.
As for fallacies, I’ll let Kahneman explain:
“The word fallacy is used, in general, when people fail to apply a logical rule that is obviously relevant” (158).
Fallacies are related to cognitive biases, arising from our brain’s same tendency to take mental shortcuts (often called heuristics) in making decisions or answering questions.
As logical as you think you are, your brain deals better with intuition than logic, better with generalizations than statistics. In certain situations ranging from school to work to your personal life, the brain’s failure to apply logical rules and its tendency to use heuristics can have unfortunate consequences.
Before I get into the specific biases and fallacies, I should explain how Kahneman views human thought. To make things easier to understand, he divides thought into two systems:
System 1 “operates automatically and quickly, with little or no effort and no sense of voluntary control” (20).
System 2 “allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration” (21).
Think of it this way: unless you make a deliberate effort to reason through something, you’re probably using System 1.
If you’re doing a difficult math problem, you’re probably calling upon System 2.
It’s also important to remember that System 1 is the default for most things you do. Your mind prefers to take the path of least resistance: the less deliberate thought, the better.
As you can imagine, this preference for avoiding deliberate thought can lead to some problems.
While by no means exhaustive, the following are six common cognitive biases and fallacies to watch out for in your studies, work, and personal life.
When you make unfounded judgments about a person’s character or ability to perform certain tasks based on favorable qualities you have observed in the person, you’re falling victim to the halo effect.
For instance, let’s say you think that beards are a sign of trustworthiness (or you just find them sexy). If you then chose to read and follow College Info Geek simply because Thomas and I both have epic beards, you’d be falling victim to the halo effect. Even though we may both indeed be trustworthy and knowledgeable (and sexy), it’s a mistake to conclude that just because of our beards.
Another more realistic example would be choosing to take a certain class because the professor looks attractive and acts confidently, even though these factors may have no bearing on the professor’s ability. When asking other students about a professor, make sure to ask about the professor’s teaching abilities, intelligence, or other relevant qualities.
While this won’t completely eliminate any biases the students may have, it should get you a more useful answer than if you’d asked, “Do you like this professor?” The same goes if you’re using Rate My Professors–“hotness” shouldn’t be your primary metric of evaluation.
The best way to avoid the halo effect is to rely on objective measures of competency as much as possible.
The way information is presented shapes how people interpret it, and marketers exploit this tendency all the time.
For instance, “90% fat-free” sounds better than “contains 10% fat.” A 70% graduation rate sounds better than a 30% dropout rate. Logically, these statements are equivalent, but people are not intuitively logical.
Being aware of the framing effect is helpful for a couple reasons.
On the one hand, if you’re aware of the framing effect, you can at least be aware of when others attempt to use it to influence your behavior. When a school boasts that 70% of its graduates are employed within 6 months of graduation, you can consider the 30% who aren’t and ask why that is.
On the other hand, awareness of the framing effect also allows you to use it to influence other people’s decisions. I’m not advocating deception or manipulation, of course, but you should be aware that you have a choice in how you present information, and that how you present it matters.
If you’re applying for a research grant, for instance, you would do best to focus on the percentage of previous recipients who went on to receive doctorates as opposed to the number who did nothing especially remarkable.
Use the framing effect to advance your goals, or haplessly allow others to influence your actions through it: the choice is yours.
”I see there is a beard theme going on with this website’s employees.”
– Annie Tohill, in a comment on a College Info Geek post
The availability bias is a phenomenon in which our judgments are biased by how easily we can call examples to mind (it is the ease of recall, not the number of examples, that matters).
For instance, if someone asks you what percentage of U.S. men have beards, your answer will be biased by how many bearded people you know. If you hang around with a lot of hipster dudes (or consume a lot of College Info Geek content), you’re likely to guess higher than the actual statistic (an estimated 17% according to this article).
The availability bias also manifests itself in less obvious ways, such as in how we perceive teamwork. As Kahneman points out, “many members of a collaborative team feel they have done more than their share and also feel that the others are not adequately grateful for their individual contributions” (131).
This is because your own contributions to a team project are most available to your memory and experience. You can never be as familiar with other members’ contributions as your own.
Remember this next time you get frustrated with a group project–you may actually be doing more work than the other members, but probably not as much as you think.
For more tips on surviving group projects, check out Thomas’s book.
“The ultimate test of an explanation is whether it would have made the event predictable in advance” (200).
A popular statement of this bias is, “Hindsight is 20/20.” Hindsight bias describes our tendency to evaluate past actions based on present information that we could not possibly have known at the time of making a decision. Once we know a piece of information, it’s nearly impossible for us to imagine a time when we didn’t possess it.
Just think of all the stupid things you thought or did when you were a child. With what you know now, it seems obvious that you can’t dig to the other side of the world, travel through time, or train your cat to fetch (or do anything, really). At the time, though, these seemed like perfectly plausible undertakings.
“Okay,” you think, “that’s obvious. I’m not disagreeing that I did stupid things as a kid. How does it matter to me now?”
You should be aware of hindsight bias because it can lead to some dangerous conclusions.
For instance, let’s say that in general you score B’s on your physics exams even when you study really hard. Now let’s say that for whatever reason you do much less studying than usual and end up making an A. While it is possible that you were studying too hard in the past and that the mental break improved your performance, it’s more likely that you just got lucky.
Because of hindsight bias, however, it’s tempting to conclude, with that A in front of you, that you’ll do better if you don’t study. When I explain it this way, it seems obvious, but it may not be when you find yourself in that situation, the dopamine still rushing from the high grade.
Keep it in mind.
“The best laid schemes o’ mice an’ men / Gang oft agley [go often awry].”
– Robert Burns, Scottish poet
The planning fallacy describes plans and forecasts that are unrealistically close to best-case scenarios and could be improved by consulting the statistics of similar cases.
The planning fallacy generally arises when the person in charge of a decision/plan does not consult outside statistics or opinions about the amount of money, time, or other relevant resources required to complete the project.
This causes obvious problems in small businesses, national corporations, and even entire governments, but it can just as easily apply to your school work.
When you plan your schedule for writing a paper, for example, do you consider how long it has taken you in the past or do you guess how long you hope it will take?
Consulting your past performance will give you a far superior estimate of the time required. The best way to do this is to establish a personal “base rate” for paper writing–use an app like RescueTime or Toggl (or just the timer on your phone) to track how long it actually takes you to write a paper.
It can be sobering to find out that a paper takes you six hours to write on average, but it’s better to know that and plan ahead accordingly instead of pulling an all-nighter before it’s due.
This applies to pretty much any assignment you’ll undertake in college, from reading assignments to group projects.
If you don’t want to or can’t time the project (if it’s something you’ve never done, for instance), then at least employ the fudge ratio, which you can learn more about in this video.
Remember: If you’re unrealistically optimistic in your estimates, you’re only lying to yourself.
Looking for a way to make realistic plans? Check out our guide to strategic thinking.
I saved the sunk cost fallacy for last because it’s huge. It’s so prevalent that most people don’t even think about it. Kahneman defines it as “the decision to invest additional resources in a losing account” (345).
In everyday terms, the idea is this: “Well, I already spent x hours/days/weeks/years reading this book/watching this movie/being in this relationship, so I guess I should finish it.”
Because humans are naturally loss-averse, it’s painful to give up on something into which we’ve already put a lot of time, even if “giving up” would allow us to move on to something ultimately more satisfying or fruitful.
When it applies to a TV show or book you don’t really like, the worst you’re doing is wasting your time, but it can apply to relationships, jobs, and even educational plans as well.
Taken to an extreme, the sunk cost fallacy could keep you dissatisfied, or even outright unhappy, for your whole life!
It also applies to businesses started, money invested, and pretty much any project that you ought to let die but can’t let go of.
If you’ve been working on a project that you dislike or are in a relationship that is “just okay,” ask yourself if you’re keeping this up because you want to/enjoy it or just because you think you should finish it.
Learning to notice the sunk-cost fallacy could literally change your life.
I’m sure at this point you’re thinking, “What the heck am I supposed to do if I can’t even trust my own thoughts?”
Kahneman has an answer, albeit not a very comforting one:
“The question most asked about cognitive illusions is whether they can be overcome…. The best we can do is compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high” (28).
In everyday life, you shouldn’t freak out about every little decision you make–whatever it is probably doesn’t matter that much, and if you fall victim to a cognitive bias or fallacy, it’s no tragedy.
In high-stakes situations, though, you would do well to step back, “activate” your System 2, and notice any errors or biases in your decision-making process.
If you’re questioning your sanity now, that’s good.
It means you’re thinking.