The strategies of mental defence mechanisms are learned during childhood, when the child needs a way to escape from the debilitating effects of fear. At the time, the fear is simple and “childish” — the boogie man, the monster under the bed and so on. Yet these threats are reasonable enough to the child, given his or her limited knowledge of the world. However, the anti-monster techniques are used later by adults who don’t want to think about real-world problems such as war, famine, torture and so on.
These strategies also get pressed into service to protect systems of thought that themselves shield the psyche. Such systems can be political, spiritual or scientific.
The reason people don’t like to learn of defence mechanisms is that they don’t want to know how they defend themselves — that would set up severe cognitive dissonance. On the one hand, they’d have to acknowledge that many defence mechanisms are inherently dishonest, but on the other hand, they don’t want to give up the protection. It’s easier to let their defence mechanisms make them forget about defence mechanisms.
It is impossible to abandon defence mechanisms completely. Without them, we would have no way to push aside random thoughts about all the bad things that can befall us; we simply could not function.
I suspect that people who suffer from anxiety disorders have something wrong with their defence mechanisms. This may have come about from some traumatic episode that destroyed their confidence in their defence mechanism strategies.
This article deals only with those mental defence mechanisms that relate to fending off contrary beliefs. There are other defence mechanisms (such as "being in denial"), but these are not relevant here.
Strategies of Antiprocess
Note: By “Strategies of Antiprocess”, I mean techniques we use to prevent ourselves from processing information with the full power of our rationality.
Irrational Thought-Stopping. This means using an irrational notion to halt further analysis. For example, a soldier in a foxhole might think, “The others will die, but it can’t happen to me” — yet he does not dare to consider that the statement is absurd.
Practical Thought-Stopping. This means that the person thinks along these lines: “This line of thinking has only caused me pain in the past, so I won’t think about it now.” Note that this is one of the few strategies of antiprocess that is done consciously.
Rhetorical Fallacies. This means using patterns of thought that are not logically supportable, such as "Argument from Authority", "Post Hoc Ergo Propter Hoc" and the rest. While some of these fallacies are used simply out of ignorance (and not as a defence), they recur so often in debates that it seems likely the subconscious has decided it is worthwhile to treat them as valid. There is a cost in being blind to the problem, but on balance most people benefit more by not recognizing the fallacies.
Self-Censoring. This means deliberate forgetting. People who hear something they don’t like (including reasoning that seems to be true but yields a disturbing conclusion) can put it out of their mind in much the same way that children learn that they must forget the monster under their beds if they are to get any sleep.
Source Avoidance. This means removing yourself from the source of the contradictory ideas. This is almost always accompanied by a rationalized "reason" for departing (e.g., "He's just stupid; I'm not going to talk to him!"). One tactic of Source Avoidance is hostility: if you become abusive towards the source of the disturbing information, they may go away.
Distraction (a.k.a. sublimation). This means setting up some kind of stimulus to soften the focus on the threatening idea or displace it altogether. This can include healthy distractions, such as engaging in strenuous exercise, or unhealthy ones, such as taking drugs or getting drunk. It can also include simple strategies such as humming, tapping the foot, singing, whistling and so on. Discursive thought (the chattering of our inner voice) is a subtle form of distraction, as is worry.
Nitpicking. This means focusing on one small aspect of the entire problem, declaring that it is a non-problem, and generalizing that conclusion to cover the entire matter.
Straw-Manning. This means misconstruing the issue under consideration and restating it in a form that can easily be dispensed with. This is not a conscious action, though it often seems so to others.
Smoke-Screening. This means raising objections to the issue at hand faster than they can be addressed, which ensures that no meaningful analysis takes place. One example of smoke-screening is relentlessly thinking cheery thoughts when one is afraid. Relentless nitpicking is also a form of smoke-screening.
Deferral. This means raising an objection and then not pursuing it if it goes unanswered. The lack of an answer is then processed as if the objection were unanswerable. This technique works especially well with smoke-screening.
Strategies of Validation
By “Strategies of Validation”, I mean techniques we use to reassure ourselves of the truth of what we already believe.
Negative Validation. This means that when people successfully fend off a disproof of a belief, they take this as confirming their belief, even if the reality was that the disproof was faulty or badly presented.
Positive Validation. This means engaging in an activity that actively bolsters your existing beliefs, such as going to a rally of your favourite political party.
Selective Validation. This means engaging in an activity that on the surface appears to challenge your beliefs but is in fact guaranteed to give you more or less the result that you're comfortable with. This includes reading material that you know you agree with, or associating almost exclusively with like-minded people.
Quasi-perilous Validation. This means engaging in what appears to be a belief-threatening activity, but doing it in such a way that there is, in fact, no risk. People with good defences (such as adeptness at using rhetorical fallacies) can engage in this kind of mock battle and be guaranteed a victory over the threat. That victory is the whole point of the exercise: it "proves" the belief in proportion to the perceived risk. For example, somebody who stubbornly clings to a point will emerge from the argument without having given in, and will count this as a "victory".
Conclusion
We have learned to defend our psyches so that we can function in day-to-day life. However, the strategies mentioned above can be applied even when the conscious mind thinks it is making an honest effort to understand (such as during a debate). These strategies are so deeply ingrained that it can require a very strong motivation to turn them off.
That, in my opinion, is why debates about firmly held beliefs are so often doomed to failure.