The Conditioning Machines in Our Back Pockets

John Zada | March 10, 2025

In the last few years I’ve noticed something odd: many people I know who’d never had much interest in politics have suddenly become deeply politicized around certain issues. It first caught my attention during the controversies of the COVID-19 pandemic, continued with the Russian invasion of Ukraine, and more recently reached a pinnacle during the Middle East wars in Israel, Gaza and Lebanon. People I had previously known to be politically indifferent, balanced in their views, or of mild temperament had quite suddenly become emotional and obsessive mouthpieces on a single issue, or had become embroiled in the polarized “culture wars” that continue to pit left against right.

Of course, interest and participation in politics, even activism, are accepted norms in our culture within nonviolent bounds. We all come with largely pre-determined biases that shape our worldviews. And there is no shortage of negative news for us to react to. It’s hard not to be affected by horrible events, no matter how far away they play out. All the more so if they involve us, or our communities, more directly.

But what has struck me about my newly politicized friends, acquaintances and family—beyond the rapidity of their “conversion” to their newfound views—is that for all their certainty, depth of conviction and righteousness, they often had very shallow knowledge of, or experience with, the things they propounded. Indeed, in many cases when pressed on the source of their understanding, they’d cite content seen on their social media feeds.

This, to me, seemed to speak to a very serious problem indeed: what I was seeing—what we’re all seeing—in terms of the increasingly polarized behavior around us appears to be partly the result of the mass conditioning and indoctrination of belief by the internet and smartphones we carry so nonchalantly in our back pockets.

The idea that our worldviews can be shaped, or engineered, by the technology we use and the content we consume may not sound all that surprising, or new. The German Third Reich indoctrinated its population largely through radio broadcasts of Hitler’s speeches and rallies. The era of television perfected the art of advertising and consumer manipulation. It also gave birth to the 24-hour news cycle, whose filtered and curated content can distort our picture of reality. Now a newer cultural vernacular—one of “digital silos,” “echo chambers,” and “influencers”—informs our understanding of the age of the internet and the political polarization it has spawned.

Yet, of all the issues connected with digital technology, the least appreciated and understood seems to be the inadvertent manipulation, indoctrination and “brainwashing” that can happen through our incessant, algorithm-driven consumption of content. The extent of this phenomenon is similarly underappreciated: this influence is a daily, routine and commonplace occurrence impacting us all, to varying degrees. We’ve devoted millions of words, and countless hours of airtime in our media, to the problems of selling our attention, giving up our data, and the mental health costs of screens on children. And for good reason—these are also significant issues.

But what we should turn our attention to next is teaching a fundamental piece of knowledge that is long overdue: the ease with which we are all unwittingly influenced—especially in this day and age, at our own hands and by our own habits.

*

The term “brainwashing” was coined in the aftermath of the Korean War, when American soldiers returning from Chinese POW camps expressed strange and unexpected political praise for their captors and their captors’ governments. Upon investigation it was revealed that Chinese military officials had used certain techniques to implant ideas in the prisoners. The procedures ranged from less direct approaches, like getting soldiers to repeatedly write essays criticizing their own societies and expressing political notions not their own, to more coercive means of suggestion that included interrogation, rewards and punishments, forced confessions, exposure to propaganda, deprivation, and isolation.

The psychiatrist and writer Dr. William Sargant, in his seminal 1957 book Battle for the Mind: A Physiology of Conversion and Brainwashing, argued that the most effective means of forced conversion requires the target’s mental equilibrium to be subverted—achieved by subjecting the victim to extreme emotional arousal, anxiety and stress. “One set of behavior patterns in man can be temporarily replaced by another that altogether contradicts it; not by persuasive indoctrination alone, but also by imposing intolerable strains on a normally functioning brain,” Sargant wrote.

Another important writer who often commented on the psychology of group behavior, the late Afghan author Idries Shah, made similar observations on thought manipulation a few decades later. “There are four factors which, when applied upon human beings, ‘programme’ them like machines,” he writes in Learning How to Learn. “These are the factors which are used in indoctrination and conditioning…They are: tension alternating with relaxation, sloganisation and repetition.”

Shah also writes:

“People are conditioned not only by deliberate indoctrination, but also by systems whose proponents themselves are ignorant of the need for safeguards to prevent conditioning. People are also conditioned by a constellation of experiences. In most human societies, unanimity of thought has been arrived at by an unrecognised conditioning process in which virtually all the society’s institutions may be branches of the conditioning process.”

The devil here is in the details, as the saying goes. Perhaps that’s why we seem to have forgotten those insights before they could become common currency in our culture. More often than not we now use the term “brainwashing” simply to describe the outcome of situations in which someone is consciously or directly manipulated, usually in person, as with cults, religions, or violent political movements. We use the term without necessarily understanding the exact mechanisms involved. If we did, our definition of it might widen.

What is crucial to understand is that brainwashing can happen inadvertently, through circumstances far less dramatic than those we usually associate with the term. In fact, it should not be seen as a single phenomenon, but rather as a series of separate but related processes of influence on a spectrum—processes that occur in everyday life and that play upon the cues triggering the decision-making reflexes we evolved as a species.

Conditioning, indoctrination, and various forms of suggestion and persuasion occur more frequently than we know: in school, in the home, at work, and in our social and political groupings. Other psychological dynamics like unmet needs, emotional dependency, the desire for social approval, and groupthink make us far more willing recipients of ideas we would otherwise not adopt.

“The need to be one with a group, to have group approval and therefore social support, means that individuals will very often change these attitudes themselves, to fit with the norm, instead of having to be persuaded,” writes Denise Winn in her book, The Manipulated Mind.

Which brings me back to the potentially heinous impacts of our smartphones. Now that so much of our lives has moved online, where each day we spend hours consuming algorithm-curated content—much of it ideological in nature and highly emotive—is it even possible to fundamentally differentiate that digital influence from what we might call “brainwashing”? After months and years of repetitively consuming one-sided messaging on a given topic, how likely is anyone to maintain an open and flexible mind that can see, or at least try to see, an issue in the round?

No matter how much agency or knowledge we think we have, how critical we deem our faculties, or how “media literate” and objective we take ourselves to be, we are all at risk of becoming more biased and blinkered than we realize. The natural learning and adaptive reflexes that helped us survive for aeons are simply too easily hacked.

Because of this, if for no other reason, we must constantly approach our smartphones, and the content they unceasingly deliver with our complicity, with the utmost vigilance and caution.


John Zada is a writer and journalist based in Toronto, Canada. He is the author of the books In the Valleys of the Noble Beyond and Veils of Distortion.