You are the easiest person to fool
If there’s one form of evidence most of us value above all others, it’s our experiences. After all, our senses don’t lie.
Others’ experiences are almost as good, especially if they’re from people we know and trust.
Examples abound. Maybe you know homeopathy works because you’ve tried it. Or you believe in UFOs because you saw one. You’re thinking of starting the keto diet because it helped your friend lose weight. Your son had a mild case of COVID, so it must be no worse than the flu. A celebrity says her child got autism from a vaccine. A medium once told you something they could never have known without real psychic powers.
Anecdotes are personal experiences used as evidence for a claim. Our brains jump to conclusions, assuming that a single experience is a good indicator of what’s typical and even inferring causation where there may be none. Vivid and emotional stories are particularly convincing and memorable.
Many people think that anecdotes are a sure-fire way of knowing what’s true. Indeed, it can be quite difficult to convince someone that they might be wrong.
But anecdotes are notoriously unreliable.
Four reasons why anecdotes aren’t good evidence
As critical thinkers, we aim to use evidence to decide what to believe. That means being skeptical of anecdotal evidence, for several reasons.
First, we can misperceive our experiences.
Our brain is trapped inside our skull, trying to make sense of the outside world using only our senses and its existing models of how the world works. There’s too much information to process, so it has to decide what to pay attention to, and what it all means. The brain hates uncertainty, so it resolves ambiguity by guessing and filling in gaps based on what it expects. It then constructs a consistent narrative to help us make sense of what it’s perceiving.
The result is a perception process that is incomplete, biased, and flawed. So while there is an objective reality outside of our heads, our perception of that reality is a subjective interpretation.
[Learn how optical illusions demonstrate the fallibility of human perception.]
The problem is that we’re often very confident that our experiences are “the truth,” when in actuality our brains are constructing their own version of reality.
When I was very young, I once saw a ghost. I was asleep in bed next to my grandmother. I remember feeling a cold finger on my hand and fingernails scratching up and down my arm. I woke up but couldn’t move. Out of the corner of my eye I saw an old woman with long, gray hair. She was holding me down and keeping me from screaming for help. It felt like an eternity and I was absolutely terrified.
While I (thankfully) never saw that particular ghost again, to this day I’m plagued by the likely cause of my experience: sleep paralysis, a condition in which you’re awake and conscious but your body is unable to move. When we’re in deep sleep our bodies are paralyzed, likely to prevent us from acting out our dreams and potentially harming ourselves (or others). Sleep paralysis is essentially a waking nightmare.
History is full of similar tales from cultures the world over. Even today, sleep paralysis is estimated to affect nearly 8% of the population, most of whom have never heard of the condition. To explain the terrifying experience, our minds fall back on what they know and the cultural beliefs of the time, from demons to aliens. Or in my case, a ghost who bore a very strong resemblance to Disney movie witches.
None of this is to say that ghosts don’t exist. It’s certainly possible that they do. But extraordinary claims require extraordinary evidence, and personal experiences – even convincing ones – aren’t enough.
The criminal justice system is learning this lesson the hard way. Like most people, jurors trust people’s perceptions and experiences, and eyewitness testimony tends to be among the most valued forms of evidence in a trial. Unfortunately, eyewitness misidentification is also a leading contributor to wrongful convictions.
The point is, we are easily fooled. We’re convinced that “seeing is believing,” but our eyes can deceive us. Even more, believing is seeing: we can literally see what we expect to see. And not recognizing the limits of our perceptions can have serious real-world implications.
Second, anecdotes aren’t controlled.
Tina has been getting headaches. She doesn’t like to take pharmaceuticals, so she visited a naturopath, who suggested she take feverfew when she feels a migraine coming on. The naturopath also recommended Tina eliminate caffeine from her diet, get more exercise, and find ways to reduce stress.
After a few weeks of following the naturopath’s advice, Tina’s migraines seemed to be better. She shared her success story on a migraine support group on Facebook in the hopes that others might also benefit from trying feverfew.
Tina’s experience is incredibly common, but it’s important to remember that our experiences can fool us. By definition, anecdotes aren’t controlled, and her headaches could have improved for any number of reasons. Most alternative treatments “work” about as well as placebos, which can be quite effective at relieving subjective symptoms like pain. (Importantly, placebos do not treat the underlying cause of the symptoms.) Another possibility is that the headaches simply ran their course, as headaches often do. And she also changed her diet, exercised more, and reduced her stress, any of which could explain the improvement.
To find out if feverfew is effective it’s important to control for these types of factors. Medical researchers use what’s called a randomized controlled trial. In this case, we would randomly divide a large sample of migraine sufferers into two groups, giving one group feverfew and the other a placebo. (Importantly, we don’t let the participants know which group they’re in!) We can only conclude that feverfew is effective if the treatment group reports a greater improvement in headache symptoms than the placebo group.
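To see why the comparison group matters, here is a minimal sketch of a simulated trial in Python. The numbers are entirely made up (a hypothetical improvement score with assumed effects), so it illustrates only the logic of randomization and comparison, not anything about feverfew itself.

```python
import random
import statistics

random.seed(42)

# Hypothetical simulation of a randomized controlled trial (not real data).
# Each participant reports how much their headaches improved on a 0-10 scale.
participants = list(range(200))
random.shuffle(participants)          # random assignment removes selection bias
treatment_ids = participants[:100]    # this half receives feverfew
placebo_ids = participants[100:]      # this half receives an identical-looking placebo

# Made-up outcomes: note the placebo group improves too (placebo response,
# headaches resolving on their own), which is exactly why the comparison matters.
treatment_scores = [random.gauss(4.0, 2.0) for _ in treatment_ids]
placebo_scores = [random.gauss(3.5, 2.0) for _ in placebo_ids]

observed_diff = statistics.mean(treatment_scores) - statistics.mean(placebo_scores)

# Rough permutation test: how often would randomly shuffled group labels
# produce a difference at least this large by chance alone?
pooled = treatment_scores + placebo_scores
as_extreme = 0
for _ in range(10_000):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:100]) - statistics.mean(pooled[100:])
    if diff >= observed_diff:
        as_extreme += 1

print(f"Observed improvement difference: {observed_diff:.2f}")
print(f"Probability of a difference this large by chance: {as_extreme / 10_000:.3f}")
```

Only when the difference between groups is too large to plausibly be luck do we have real evidence that the treatment itself is doing something.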
Third, anecdotes often aren’t typical.
Small samples, such as personal experiences, are generally not representative of normal conditions. Therefore, we need to be careful about what kinds of conclusions we can draw from them.
The problem is that the human brain doesn’t intuitively grasp probabilities. When most of us are trying to decide what to believe, our minds search for relevant experiences or stories instead.
The following examples might sound familiar. My grandfather smoked his entire life and was fine, so cigarettes aren’t harmful. Australia is dangerous because my cousin was mugged in Sydney. A cat bit me when I was young, so cats are mean. Toyotas are unreliable because I once had a Toyota that was always in the shop. This winter seemed really cold, so there’s no global warming.
Sometimes people disingenuously cherry-pick anecdotes to make a point, while at other times our brains are simply taking shortcuts. Either way, generalizing from single observations can mislead us into thinking something is typical when it’s not. For that, we need statistical evidence, as the sketch below illustrates.
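Here is a tiny Python sketch of why a single observation is such a weak guide to what’s typical. The failure rate is invented purely for illustration and says nothing about any real car brand; the point is that one owner’s experience can only ever report 0% or 100%, while larger samples converge toward the true rate.

```python
import random
import statistics

random.seed(0)

# Invented example: suppose 10% of some car model ever needs a major repair.
# (A made-up rate, purely for illustration.)
TRUE_RATE = 0.10

def needed_major_repair() -> bool:
    """One owner's experience: did their car need a major repair?"""
    return random.random() < TRUE_RATE

# A single anecdote can only ever report 0% or 100% --
# "my car was always in the shop" says little about the model overall.
one_anecdote = needed_major_repair()
print(f"Single anecdote suggests a failure rate of {100 if one_anecdote else 0}%")

# Larger samples converge toward the true rate of 10%.
for sample_size in (5, 50, 5000):
    observations = [needed_major_repair() for _ in range(sample_size)]
    print(f"Sample of {sample_size:>4} owners: estimated rate {statistics.mean(observations):.1%}")
```

Your one unreliable Toyota, or your one unusually cold winter, is a sample of one.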
Fourth and finally, people can lie.
The unfortunate truth is that people aren’t always truthful.
In the 1980s, Peter Popoff made millions as a faith healer. Elderly and ill people came from all over to attend his shows, hoping for a miracle. Popoff claimed to be channeling God as he dramatically called out names of lucky audience members and gave specific details of their lives and ailments before using his psychic energy to heal them.
Popoff caught the attention of skeptic James Randi, who had a history of exposing psychics. Randi’s investigation found that the information wasn’t coming from God, but from Popoff’s wife Elizabeth, who had collected details from attendees through prayer cards and pre-show interviews. In 1986, Randi played the radio messages being transmitted to Popoff’s wireless earpiece on The Tonight Show Starring Johnny Carson.
(I wish I could say that was the end of Popoff’s scams. But apparently he’s at it again, selling faith-based “get rich quick” schemes on late-night infomercials.)
The impact of media on our perception
In the “olden days,” we relied on our own experiences, as well as those of our close friends and family, to inform us about our world. We read books, watched the evening news, and read the Sunday newspaper.
Today each of us has our own “reality,” our own personal information ecosystem, which keeps us engaged by feeding us a steady diet of content that confirms what we already believe. We get trapped in echo chambers with others who think like we do, protecting us from ideas we don’t want to hear and further reinforcing our beliefs.
Anecdotes, from news articles to personal stories, permeate our thinking and distort our perception of reality and risk. We think some things are more common than they are. We think the world is a more dangerous place than it is. We’re afraid of the wrong things. And we think those outside of our tribes are wrong, stupid, or even evil.
We can all be misled. If you want a better understanding of reality, learn to be a better consumer of media.
The solution: Statistics are better than stories
Our lazy brains don’t like probability. Stories are infinitely easier.
For example, your Cranky Uncle doesn’t like immigrants because he thinks they’re criminals. At a recent family gathering he shared a story of a college student who was brutally murdered by an undocumented worker. He was understandably angry as he explained how she was killed and how devastated her family was.
Emotional stories like this can be very convincing. The question is, does this anecdote provide enough evidence to conclude that immigrants are dangerous?
No. In fact, the data show that undocumented immigrants commit crimes at lower rates than native-born Americans.
And then there’s your good friend, who’s so afraid of mass shootings that she doesn’t want to send her children to school. But while mass shootings are horrific and frightening, Americans are three times more likely to die from choking on food.
Remember that anecdotes often aren’t representative and generalizing from them can distort our perception of reality and risk.
These thinking errors happen all the time. So if your goal is to believe in things that are true, and not believe in things that aren’t, seek out reliable data, not emotional stories.
The take-home message
Most of the time our experiences are good enough to trust. I trust my nose when it tells me not to drink expired milk. And I trust my eyes and ears when a dog growls and shows its teeth.
But we should be wary of using anecdotes to draw broad conclusions.
Our brains prefer stories over data and statistics. And it’s hard to admit, but we are easily fooled. We can perceive things that didn’t happen, miss things that did, and misinterpret what we do see. “Your truth” may not be “the truth.” The subjective “reality” inside your skull might be different from objective reality.
If your goal is to align your beliefs with reality, be skeptical about the conclusions you draw from your lived experiences, question your perceptions, and be humble enough to admit that you might be wrong. Learn to recognize your brain’s shortcuts and seek out reliable data instead.
The bottom line: Anecdotes aren’t good evidence.
To learn more
The Logic of Science: 5 reasons why anecdotes are totally worthless
The Body of Evidence: Anecdotes
Cranky Uncle: Anecdotes and hydroxychloroquine