Propaganda and the Human Mind

Today’s show is about propaganda. Our guest will be Orville Schell, Dean of the Berkeley School of Journalism. The episode will focus on the nature of propaganda, on what precisely is wrong with it, on the differences between the production and dissemination of propaganda in democratic and totalitarian societies, and on what we can do to combat it.

Some people naively associate propaganda with totalitarian regimes. Certainly, the Nazis, the Soviet and Chinese communists, and brutal dictators like Saddam Hussein have made heavy and sometimes brilliantly effective use of propaganda. But totalitarians may not need to be true masters of propaganda, since they often merely bludgeon people into at least apparent belief and acquiescence. It’s in supposedly democratic societies with capitalist economies, where political consent and economic demand are “manufactured,” to use Lippmann’s apt phrase, that propaganda has been elevated to a truly high and insidious art form. Indeed, it seems to me that largely through propagandistic manipulation of the means of public communication and representation, the concentrated, self-serving powers that own so much of our politics and so much of our economy have succeeded in thoroughly debasing our public discourse.

I do not know if we will ever break the hold of a narrow, self-serving elite on the means of public communication and representation. One thing that gives me a modicum of hope is the rise of the Internet, with its wild and untamed blogosphere. To be sure, the Internet remains so wild and untamed that it may end up being a source of more heat than light. But because it is open to so many comers, it is at least a place where contestable representations are contested, sometimes quite rigorously and thoroughly.

But my main thought about propaganda this morning has to do less with institutional reforms than with the prospects for reforming individual human minds, one by one. It’s clear that our own habits of mind, habits deeply ingrained in many, if not all, of us, often make us susceptible to propaganda. We all have some tendency to prefer the comforting falsehood to the hard truth, for example. And that makes us easy prey for those who deploy comforting falsehoods in order to get us to sign on to an agenda that we might otherwise not embrace. In the run-up to the war, authoritative voices told us repeatedly that we would be welcomed as liberators, that stockpiles of WMD were present in Iraq, that Iraq bore some vague connection to 9/11, and that the war would be quick, cheap, and largely financed by Iraqi oil. And on and on. A few dissenting voices could be heard, whispering far off center stage, that none of it was so. But the public by and large ignored those voices and bought the tale they were told.

The wonder is less that we bought the initial tale than that, for many of us, belief in the tale persisted even as the evidence spoke decisively against it. Once the comforting falsehoods had taken hold, they had a vise grip on our beliefs. This vise grip is the result of what social psychologists call confirmation bias: the tendency to notice and seek out what confirms one’s beliefs, and to ignore, avoid, or undervalue the relevance of what contradicts them. Confirmation bias often leads us to subject putatively disconfirming evidence to very severe criticism or outright dismissal.

I suspect that our tendency toward confirmation bias is deeply ingrained in the evolutionary pre-history of our mind-brains. But that is a subject for another day. I also suspect that a disposition to confirmation bias is connected to a tendency to overestimate our own epistemic reliability. If I find myself believing some proposition, then I also find myself believing that I have good reason to believe that proposition. After all, one tends not to think of oneself as believing what one believes for no good reason. But that may suggest that someone who challenges what I believe doesn’t just challenge my belief, but also challenges me. So, for example, if I have committed to believing Bush’s rationale for the war, what am I to think about myself if I allow that that rationale is entirely fictitious? That I’m not such a good believer after all? That I was a mere dupe? That’s a hard truth that few are naturally disposed to accept. Rather than take myself to be a dupe, why not take someone who purports to present evidence for a contrary proposition to be mistaken? Which is more comforting?

There are lots and lots of other foibles of human reason, widely discussed by social psychologists and exploited by the masters of propaganda, that I won’t elaborate on right now. We talked about this a fair bit on an earlier episode of Philosophy Talk called “Humans: The Irrational Animal?” Check it out.

Sometimes the truth does simply force itself on us. Asleep in the woods, you are awakened by what appears to be a very hungry bear. You could believe the initially comforting, but ultimately self-defeating falsehood that you are just dreaming and in no danger. Or you could believe the hard, disquieting truth that you are about to be attacked. Very likely, you believe the disquieting truth over the comforting falsehood. But truth seldom forces itself on us in quite that way. Truth is hard and elusive. We often have to force it out via an unrelenting search for evidence pro and con and unyielding arguments that are never satisfied with mere comfort and convenience.

Can human minds be educated so that we always prefer the relentless pursuit of evidence and unyielding arguments to comforting and easy falsehoods? That is surely an empirical question. Unfortunately, many of the foibles of human reasoning exploited by the masters of propaganda and marketing turn out to be rather robust. On the other hand, there is a growing body of evidence that at least some of our foibles do disappear when matters are rightly framed. This is a complex subject to which I won’t even try to do justice here. The tentative bottom line is that we just don’t know how much education can achieve in correcting the cognitive foibles that make us all so vulnerable to the masters of manipulation. Moreover, our educational institutions being what they are, and controlled by whom they are controlled by, I doubt that education alone will ever supplant the need for wide-scale social reforms that redistribute and reconfigure the means of public communication and representation. I do hope, though, that I am not being utterly naive or self-deceptive in believing there is a place for fighting the power of propaganda one mind at a time.
