In an era of hot takes and wellness fads, confidence often masquerades as truth. We’re surrounded by bold claims delivered with unwavering assurance—even when the evidence is shaky at best. In The Certainty Illusion, Timothy Caulfield unpacks why humans crave simple answers, how misinformation exploits that instinct, and what the drive for certainty means for our health and our culture. 


Timothy Caulfield.

Timothy Caulfield is research director of the Health Law Institute at the University of Alberta in Canada, author of The Certainty Illusion, and cofounder of the ScienceUpFirst initiative, which fights misinformation. He spoke with Nutrition Action’s Caitlin Dow. 


The information ecosystem 

We’re hearing more and more about misinformation. Is that new?

TC: Misinformation has been with us forever, but over the last few decades, the problem has become more acute.

Today, we get so much of our information from social media and online sources. We’ve never had as much access to knowledge as we have now. That creates a huge paradox.

On one hand, access is good. There is carefully evaluated, independent information out there that can answer questions about everything from black holes to how cancer spreads.

On the other hand, there’s just so much information. It’s chaos. We’re constantly being bombarded with it.

By one estimate, an average person living in the modern world processes roughly the same amount of information in a single day that a highly educated person living 500 years ago processed in a lifetime.

Front cover of The Certainty Illusion by Timothy Caulfield.

And it’s not just about more information, right?

TC: Right. It’s also all the algorithms that drive what we see. An algorithm is a set of rules that social media platforms use to prioritize what you see based on how relevant and engaging the content is likely to be. Those algorithms play to our cognitive biases, so that our information environment is increasingly shaped by emotions like rage and fear and our political leanings.

The frenetic nature of that environment makes those algorithms more powerful. This is a fire hose of information, and we can’t really engage with all the content we see in a thoughtful way.

Of course, that’s not the goal of the algorithm creators. They want us to be rats on a wheel. An algorithm that keeps you scrolling is doing its job because it lets the platform sell ads or behavioral data. A calm user who isn’t worked up and doesn’t keep coming back for more doesn’t generate as much value.

What else has changed?

TC: We’ve seen the rise of bad science published by fake or poor-quality predatory scientific journals. These journals have a lax peer review process, if they have one at all; they charge high fees to publish; and they publish just about anything.

Why is that a problem?

TC: It’s becoming increasingly difficult to talk about a body of evidence. Good research is still out there, but that knowledge base is becoming polluted and more difficult to sift through. Add to that the fact that ChatGPT can write believable study abstracts. Even academics can’t tell they were written by AI.

One of the most important ways to counter misinformation is to have trustworthy evidence. And now that part of the equation is getting rigged too.

Here’s how it could play out: Some outlandish claim is made, and then someone does what I always tell them to do, which is to click on a post, read the article, and see if there is science on the topic. And when they do, they’re met with fraudulent science that legitimizes whatever claim is being made. That’s really scary.

How big is the problem with these journals?

TC: One group estimated that there are more than 15,500 predatory journals. It’s hard to estimate the financial cost of this, but likely billions of dollars in research budgets have been wasted by publishing in these journals.

That’s outrageous. Resources that could be used for rigorous research are instead lining the pockets of fraudsters. And studies that are published in those journals can be used to legitimize climate change denial, anti-vaccine crusades, chemtrail conspiracy theories, and other fringe and harmful ideas.

How can you spot predatory publications?

TC: It’s not easy. It takes time and effort to figure out if a journal or study is credible. I recommend looking at how others in the scientific community reference the research. If others aren’t mentioning it, be suspicious.


The drive for certainty 

How does the information ecosystem work against our psychology?

TC: We have access to more knowledge than ever, but we also have access to more misinformation. And that makes many people feel less certain about the things they care about, because it’s hard to tell what’s up from what’s down.

The human mind craves certainty. And I think this desire for certainty has become more acute because our information environment is so messy and chaotic.

One of the reasons misinformation is so attractive is that it satisfies our desire for an easy answer.

Conspiracy theories are a great example. The embrace of conspiracy theories is often about trying to find certainty or clarity in a chaotic world.

Guess what? Sometimes it’s just a chaotic world.

We’re looking for signals of clarity and certainty that can guide us through this really dark and messy forest. And that drive for certainty is easy to exploit.

How so?

TC: Take health halos, which are simple words or claims that play on our desire for goodness. They are a shortcut to certainty.

“Non-GMO” and “organic” are good examples.

There are really interesting and complex questions about biodiversity, sustainability, and so on that we need to grapple with. The problem is that the claims are presented as definitive. Something that’s “Non-GMO” must be better. “Organic”? Better. “Natural”? Better. “Clean”? Better. But, in fact, they may not be.

Misinformation can be appealing because it offers a sense of certainty. Before sharing an article or social media post, try to verify its accuracy.

What’s a good example of that?

TC: Natural American Spirit cigarettes have boasted that they’re made with “100% additive-free natural tobacco” and “organic tobacco.”

Not surprisingly, a study found that a majority of Natural American Spirit smokers wrongly believed that the brand was less harmful than other cigarettes. That’s despite the fact that the package explicitly stated that “Organic tobacco does not mean a safer cigarette.”

That’s the power of a health halo. 


Scienceploitation

How else can words be used to exploit our desire for certainty?

TC: I use the term “scienceploitation” to describe the use of real science, often new and widely covered in the press, to push unproven approaches.

One of my favorite examples is “stem cells,” which has become code for “cutting edge” or “scientifically exciting.”

It’s a genuinely exciting area of science, but despite decades of work, there are almost no stem cell therapies that are ready to treat diseases or conditions. Yet you’ll see “stem cells” marketed for everything from relieving joint pain to repairing damaged tissue to improving memory.

Any other words that should make us suspicious?

TC: There’s “longevity,” which is really just Wellness 2.0. Cold plunges, supplements, and wearable devices are sold as these fountains of youth, though there’s very little evidence in people that any of these things extend lifespan or healthspan.

“Microbiome” is another favorite example of mine. There are very few, if any, bona fide microbiome therapies, yet you even see the term on shampoo bottles now. It’s everywhere.

And science-y language is used because it works.

TC: Right. I sometimes joke that The Enlightenment won, not as a way of thinking or seeing the world—that is, the embrace of reason and critical thinking—but as a brand.

That’s why you see misinformation purveyors saying that something is “clinically proven” or that it’s based on “the latest research.”

They don’t say that it’s based on tea leaves. Instead, they use the language of science.

Combine that with fear, and the impact can be huge. We see that with claims about gluten, vaccines, fluoride in water, and restrictive diets that claim to prevent inflammation, blood sugar spikes, and so on.

Why is fear so powerful?

TC: We’re wired to be driven by fear. In one 2019 study, researchers showed BBC video news content to people in 17 countries around the world. On average, participants reacted more strongly and paid more attention to negative news. That response is why more headlines are negative. As they say in journalism, “If it bleeds, it leads.”

Is that influencer reliable? Ask yourself if they’re an expert in the topic. And be wary if they’re trying to sell you something...like supplements.

Are people good at identifying misinformation?

TC: It’s pretty complex because there are so many different kinds of misinformation, and there’s nuance to the answer.

But research pretty consistently tells us that people think they’re better at identifying misinformation than they are. This is a pretty common psychological predisposition.

In fact, a number of cognitive biases make us more likely to fall for bad information. There’s overconfidence bias, which is where humans routinely believe we’re better at complex tasks than we are.

There’s also something called illusory superiority, where we think other people can be duped, but not us.

And then there’s motivated reasoning. That’s when our skepticism weakens when we’re shown information that supports our identity or values.

And we know that misinformation is having an impact on how we behave and what we believe, right?

TC: We do. That tells us that we’re not great at identifying misinformation and that misinformation is very effective.

For example, in one study, researchers found that only 32 percent of Americans think that mRNA vaccines like some Covid vaccines are “generally safe.” In fact, the mRNA Covid vaccine is one of the most studied biomedical interventions in human history, and there’s overwhelming evidence that it’s safe. And “generally safe” is a pretty modest way to describe them. So that tells us that misinformation is having an impact in a very broad way. 


What we can do

How can people evaluate whether claims that use science-y words are reliable?

TC: We need to avoid what I call single study syndrome. If a claim rests on literally one study, be very suspicious. A single study can be interesting and may cause us to revisit previous work on the topic, but a single study almost never upends an entire body of evidence.

I say this all the time, but you need to think about the body of evidence. Ask yourself: How does this claim or product relate to the broad body of evidence for whatever thing is being sold or idea is being pushed?

If a significant scientific breakthrough happens, I promise you’ll know about it. It won’t be just a single influencer on YouTube talking about it. It’ll be reported by multiple reputable sources.

Beware of “science-y word salads” that are used to make a product or idea sound more evidence-based than it is.

Do scientific breakthroughs happen often?

TC: No. I used to ask my classes to name 10 genuine breakthroughs in health and medicine in the last decade, where a breakthrough means something that had broad impact on human health. It ain’t easy to do.

I’d put GLP-1 drugs for weight loss on that list. Also the Covid vaccine. But those are rare. So you should be suspicious as a default when you’re seeing promises about medical breakthroughs.

What about emerging areas where we don’t have a solid body of evidence?

TC: This one is a little trickier. Psychedelics for mental health, the role of probiotics in gut health, and the long-term health outcomes of GLP-1 drugs are good examples of areas where the science is still emerging.

In these cases, I recommend that people seek out trusted voices—good science communicators, science journalists, university resources—that pull together the science in a responsible way.

How can you tell if a voice is trustworthy?

TC: There isn’t a perfect gauge, but it’s a good idea to start by asking yourself: Is this person an expert in this specific area? And do they have something to gain by having me believe what they’re saying?

You should be suspicious of people who are trying to sell you supplements or something preposterous like an exercise plan that promises to give you ripped abs in only 10 minutes a day.

Another good question: Do they use overly complex language and what I call science-y word salads? That’s a sign that they’re not trying to inform, but to impress, confuse, or manipulate you.

A good science communicator will try to simplify things, describe nuance, and highlight the uncertainty in the evidence.

And be suspicious if someone claims to know something that mountains of data and decades of inquiry haven’t uncovered. Remember: Extraordinary claims require extraordinary evidence.

What else should people watch out for?

TC: It’s such a cliché, but if something sounds too good to be true, it just is. And be leery of claims like “Scientists don’t want you to know this” or “Doctors were baffled.”

Scientists want you to know things, and doctors have probably never heard of the thing they’re allegedly baffled by because it’s bunk.

What can people do to slow the spread of misinformation?

TC: The most important thing you can do is pause before you share something and verify its accuracy. I’m sorry, I know it’s a little bit of work, but it’s so important.

This is especially true if something makes you feel enraged or self-righteous. One recent study that looked at over a million posts on social media found that outrage is a driving factor in sharing misinformation.

And another study of 35 million Facebook posts found that 75 percent of shared news links were shared without the sharer clicking through to see what the article said.

Misinformation won’t spread if no one spreads it.

And having some humility can help?

TC: That’s right. I call it the humility fix, where I keep a running list of the things that I’ve shifted my mind-set on as more information has come out.

Changing your mind because the science changed should be a badge of honor. This is not flip-flopping. This is you being rational. We should embrace more of that mentality.
