Tuesday, June 14, 2011
Via Lifehacker comes a lengthy article that delivers a grain of truth, but that I thought could have been much more compelling. Titled, "The Backfire Effect," the article explores a phenomenon almost anyone who has engaged in debate on the Internet will be familiar with: individuals who, confronted with evidence contrary to their views, only dig in deeper.
    So, here you are, in the future surrounded by computers which can deliver to you just about every fact humans know, the instructions for any task, the steps to any skill, the explanation for every single thing your species has figured out so far. This once imaginary place is now your daily life.

    So, if the future we were promised is now here, why isn't it the ultimate triumph of science and reason? Why don't you live in a social and political technotopia, an empirical nirvana, an Asgard of analytical thought minus the jumpsuits and neon headbands where the truth is known to all?

This is indeed an important question, but I disagree with David McRaney's explanation, which combines a plausible-sounding evolutionary explanation with soft determinism and an unfortunate lack of emphasis on the actual survival value of being correct:

    Have you ever noticed the peculiar tendency you have to let praise pass through you, but feel crushed by criticism? A thousand positive remarks can slip by unnoticed, but one "you suck" can linger in your head for days. One hypothesis as to why this and the backfire effect happens is that you spend much more time considering information you disagree with than you do information you accept. Information which lines up with what you already believe passes through the mind like a vapor, but when you come across something which threatens your beliefs, something which conflicts with your preconceived notions of how the world works, you seize up and take notice. Some psychologists speculate there is an evolutionary explanation. Your ancestors paid more attention and spent more time thinking about negative stimuli than positive because bad things required a response. Those who failed to address negative stimuli failed to keep breathing.

Yes, I think that, in very general terms, the bit above about negative stimuli attracting attention is a plausible consequence of the way our nervous systems and minds work, at least in terms of unpleasant stimuli as a subset of stimuli that stand out from the cognitive background. However, I find McRaney's treatment of this phenomenon unsatisfying.
For one thing, if your beliefs are mistaken and your method of "dealing with" the "negative stimul[us]" of a differing belief (one that happens to be correct) is merely to explain it away, you, too, can fail to "keep breathing." Furthermore, the fact is that people can and do change their minds about all kinds of things all the time: Clearly, people do not always dig in when their beliefs are challenged.
I find it interesting that the title of the Lifehacker post that points to this is, "Why you can't win that argument on the Internet." (If you count email, I once very memorably did change someone's mind about a very big issue, in the sense of getting him to check his premises. The first thing he did upon changing his mind was thank me.) I see that title as symptomatic of a kind of problem that many lengthy Internet arguments exemplify, which is this: What difference does what any particular person thinks -- other than oneself or a loved one -- make to one's own survival?
Culturally, yes, throwing good ideas out there to attract the interest of people who are amenable to rational argument can indirectly improve one's life if those people accept them, but that still has to be done in a way that engages their minds. That said, one needn't get the last word in every time or "win" every (or any) argument one engages in, so long as one presents one's ideas in such a way that a rational person who happens on them can see their merit and want to learn more about them. And sometimes, in order to defuse commonly accepted, but mistaken, arguments, it can be valuable at least to put the counterargument out there -- but that value has nothing to do with making your opponent bow down publicly before your superior wisdom.
Continuing with the premise that getting good ideas out there can be in one's rational self-interest, even a rational audience can fail to agree with an argument for a correct position for any number of good reasons. The argument can be poorly presented, and even come across as a load of garbage that someone is trying to foist on others for whatever reason. I have seen people display what I can only describe as an apparent contempt for someone else's need to make up his own mind. Setting aside a clear-cut case of bullying (which is beyond the scope of my concerns at the moment anyway), the immediate reaction of anyone who doesn't simply head for the hills will be to oppose an argument presented in such a way. And then, of course, there is the fact that human beings have free will, and can, at any time, choose to ignore or evade the truth.
The importance of a method of communicating good ideas that aids in understanding cannot be overstated, because man is a conceptual being. Grasping the truth requires not just self-evident sensory data, but the logical drawing of conclusions that are often far from obvious from that data. If an argument is complex, both the difficulty of evaluating it and the difficulty of convincing others of it increase dramatically. (And then, for sufficiently complex topics, each side of an argument can be correct about some things and incorrect about others.) Although this is not always the case, I have often found that people who hard-sell notions that are, in fact, quite complex do not really understand what they are talking about. I think an intuitive, vague awareness of this possibility also drives people to resist positions presented in such a way. Such people have, to put a twist on the opposite problem, reached, at least for the moment, the "wrong answer for the right reason" -- if the person they disagree with happens to have indicated a correct position which they do, in fact, disagree with.
So much for presenting an argument. There is a more fundamental issue, about which one should fully satisfy his mind long before advocating a given argument. Whatever survival value accepting an argument might have depends on whether it is correct. The more complicated that argument is, the more work one has to do to determine whether it is correct, and the more opportunities one has to make mistakes. One would think that knowing this would cause people to be more patient with each other during debates, but that often isn't the case. A lack of patience can come from many things, but a distinct possibility is that it comes from an incomplete or sloppy process of evaluating one's position before shouting it from the mountaintop. In the context of arguments that are very difficult to evaluate, that possibility, too, can impede communication of the truth, because many people will consider it, at least on some vague level, when deciding whether the mental effort of considering an argument is really worth their time.
I think McRaney does eventually get the reader to consider the idea that he can be wrong, but within the context of a lengthy argument somewhere on the Internet that nobody else is going to read in its entirety -- so what? Much more compelling to me, in terms of my own survival, is the value of discovering the truth and using it to live.
Today: Corrected a typo and changed several instances of the word, "argument" to terms like, "position" or "notion." One can use argument to mean position, as I was in those instances, but doing so confuses the following issue: One can argue poorly for correct positions and vice versa.