When Numbers Trick the Mind

Ayubkhon Azamov | May 1, 2025

When it comes to interpreting data and statistics, our eyes see digits, while our minds fill in the gaps. We like to believe that numbers reflect an objective reality, but as the late brain scientist Robert Ornstein tells us, the mind isn’t one unified system—it’s a coalition of semi-independent modules, each interpreting the world through its own lens. These “many minds” collaborate to help us survive, not necessarily to reveal the truth. This explains why seemingly clear-cut statistics can create misleading impressions. We don’t just see numbers—we perceive them, and that perception is shaped by narrative, emotion, and cognitive shortcuts.

Take the famous Colgate advertisement from late 2007 asserting that “More than 80% of dentists recommend” their toothpaste. The slogan sounded scientific, and it was backed by a survey. But when UK regulators investigated, they discovered a sleight of hand: the survey allowed dentists to recommend more than one brand. Most dentists did recommend Colgate, but alongside other toothpastes. The ad’s wording, however, left the impression that Colgate was recommended above, or more than, the other brands. And since Colgate is a very popular toothpaste, the statistical claim seemed to check out. The perception of authority lingered even after that particular ad was banned.
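To see how a literally true percentage can still mislead, here is a minimal sketch of a multi-select survey. The brands and responses below are invented for illustration, not taken from the actual Colgate survey; the point is simply that when respondents may pick several options, an impressive share for one brand says nothing about how it compares with the others.

```python
# Toy multi-select survey: each dentist may recommend several brands.
# Brands and responses are invented for illustration only.
surveys = [
    {"Colgate", "BrandX"},
    {"Colgate", "BrandX", "BrandY"},
    {"BrandX"},
    {"Colgate", "BrandY"},
    {"Colgate", "BrandX"},
]

for brand in ("Colgate", "BrandX", "BrandY"):
    share = sum(brand in answers for answers in surveys) / len(surveys)
    print(f"{brand}: recommended by {share:.0%} of dentists")
```

In this toy data, 80% of dentists “recommend Colgate,” and exactly the same share recommend BrandX; the impression of exclusivity is supplied by the reader, not by the number.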

Similarly, think of hand sanitizers claiming to kill “99.9% of bacteria.” The statistic is technically true, but only in the lab, where hands are washed with soap before fresh microbes are applied. In real-world settings, where grease and grime on the hands protect germs, the efficacy of those sanitizers can drop to around 46%. Without context, that clean-looking, near-perfect percentage creates a false sense of security: even accurate statistics can mislead.

These examples are relatively harmless, but the underlying issue is deeper. When our perception is manipulated, even seemingly straightforward and truthful claims can become misleading.

Consider another health headline: “Butter is back!” Based on a widely reported study, it appeared that butter, long associated with high-fat, high-cholesterol diets, had no negative health impact. But a closer look reveals that the researchers simply assessed the impact of butter within an already unhealthy American diet, and even there, butter slightly increased mortality. “No worse than the rest of a poor diet” quietly became “good for you.”

Charts and graphs are often used to back all manner of claims. These graphics are designed to look objective, but they speak to our perceptual mind, not our logical one. As Ornstein explains in his book Multimind, we don’t simply absorb facts—we interpret them through mental shortcuts, emotional cues, and patterns that “feel” right. And nowhere is this more apparent than in how we perceive health claims.

Take berries, for example. Their benefits are well-documented in scientific literature, and this knowledge shapes public perception—they’re seen as a “brain food,” a natural superpower. But here’s the twist: berries are perishable, and big food producers struggle to profit from them in raw form. So, they process them into long-shelf-life products—like bars, powders, or supplements—often destroying the original nutrients in the process. But when we see the names of various berries tied to these products, we infer optimal benefits. The perception of health remains, but the reality shifts.

To preserve the perception, companies often fund studies that seem to show benefits. One study claimed that a supplement containing a cocktail of healthy ingredients, including blueberry, improved brain function compared to a placebo. Sounds promising, right? But dig a little deeper and the numbers lose their shine: the cognitive scores rose only slightly, yet the chart’s scale started far above zero, compressing the visible range and turning a barely noticeable shift into what looked like a dramatic surge.
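To see how much work the axis does, here is a minimal sketch with invented scores (not the study’s actual data) that plots the same two bars twice: once on a scale starting at zero and once on a truncated scale that begins just below the smaller value.

```python
import matplotlib.pyplot as plt

# Invented cognitive scores for illustration only, not real study data.
groups = ["Placebo", "Supplement"]
scores = [88.0, 90.5]  # a difference of under 3%

fig, (ax_full, ax_zoom) = plt.subplots(1, 2, figsize=(8, 3))

# Axis starting at zero: the bars look nearly identical.
ax_full.bar(groups, scores, color="gray")
ax_full.set_ylim(0, 100)
ax_full.set_title("Scale from zero")

# Truncated axis: the same 2.5-point gap fills most of the chart.
ax_zoom.bar(groups, scores, color="gray")
ax_zoom.set_ylim(87, 91)
ax_zoom.set_title("Scale from 87")

plt.tight_layout()
plt.show()
```

The data in both panels are identical; only the frame around them changes.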

This is not an outright lie. The data points and statistics are real. But the presentation is crafted to shape your perception, to gently nudge your understanding in a particular direction.

An image showing people chasing a pie chart off a cliff, misled by information.

This is the essence of modern misinformation: not falsifying data outright, but framing it in a way that distorts how we perceive it. Whether in advertising or political messaging, the trick isn’t just to hide the truth—it’s to guide your mind to a particular version, or interpretation, of it.

Alcohol studies are another case. One widely cited paper claimed moderate drinkers live longer. But the “non-drinker” category actually included people who had recently quit due to health problems caused or exacerbated by their drinking. When lifelong abstainers are studied separately, the supposed benefits of alcohol disappear. 
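A toy calculation shows how this “sick quitter” misclassification works. The mortality rates below are invented for illustration only: lifelong abstainers are assumed to fare best, yet once former drinkers who quit for health reasons are pooled into the “non-drinker” category, that pooled group looks worse than the moderate drinkers.

```python
# Invented mortality rates (deaths per 1,000 person-years), illustration only.
lifelong_abstainers = {"n": 800, "rate": 10.0}
sick_quitters = {"n": 200, "rate": 30.0}      # quit drinking because of illness
moderate_drinkers = {"n": 1000, "rate": 12.0}

# Flawed analysis: quitters are lumped in with lifelong abstainers.
pooled_n = lifelong_abstainers["n"] + sick_quitters["n"]
pooled_rate = (lifelong_abstainers["n"] * lifelong_abstainers["rate"]
               + sick_quitters["n"] * sick_quitters["rate"]) / pooled_n

print(f"'Non-drinkers' (pooled):  {pooled_rate:.1f}")                  # 14.0
print(f"Moderate drinkers:        {moderate_drinkers['rate']:.1f}")    # 12.0
print(f"Lifelong abstainers only: {lifelong_abstainers['rate']:.1f}")  # 10.0
```

In the pooled comparison, moderate drinkers appear to “live longer”; separate out the lifelong abstainers and the advantage reverses.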

The deeper lesson here is not that statistics are necessarily false—but that our minds perceive them through stories and shortcuts. As Ornstein writes in his book, God 4.0, “We believe we are seeing the world directly, but we are actually seeing our interpretation of it.” And in an age of information overload, this misperception is increasingly weaponized in ads, media, and politics.

Mark Twain famously said, “There are three kinds of lies: lies, damned lies, and statistics.” But perhaps the most dangerous deception is the one our own minds help to construct.


Ayubkhon Azamov is a writer, translator, and educator with a background in economics, based in Uzbekistan.