Thursday, November 03, 2011
Over at A Smart Bear is an interesting post by Jason Cohen about a phenomenon I hadn't seen labeled before, although I once blogged about the example he opens with (scroll down to Item 4 there). Cohen calls the phenomenon "survivor bias", but Wikipedia calls it "survivorship bias". In a nutshell, Cohen says that if you're learning only from the successful, you may be missing out, although your error is certainly understandable:
Do you read business blogs where the author has failed three times without success?

No, because you want to learn from success, not hear about "lessons learned" from a guy who hasn't yet learned those lessons himself.

However, the fact that you are learning only from success is a deeper problem than you imagine.

Some stories will expose the enormity of this fallacy. [bold in original]

Cohen then gives us a clear example:

During World War II the English sent daily bombing raids into Germany. Many planes never returned; those that did were often riddled with bullet holes from anti-air machine guns and German fighters.

Wanting to improve the odds of getting a crew home alive, English engineers studied the locations of the bullet holes. Where the planes were hit most, they reasoned, is where they should attach heavy armor plating. Sure enough, a pattern emerged: Bullets clustered on the wings, tail, and rear gunner's station. Few bullets were found in the main cockpit or fuel tanks.

The logical conclusion is that they should add armor plating to the spots that get hit most often by bullets. But that's wrong.

Planes with bullets in the cockpit or fuel tanks didn't make it home; the bullet holes in returning planes were "found" in places that were by definition relatively benign. The real data is in the planes that were shot down, not the ones that survived. [bold added]

That's a neat story, but one thing I like about Cohen's post is that he looks around -- and finds -- plenty of other examples of this type of failure to see the full context of a problem one is considering, from science, pseudo-science, and business advice.
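For what it's worth, the selection effect is easy to see in a toy simulation. The Python sketch below is purely illustrative -- the section names, number of hits per plane, and lethality probability are invented, not drawn from Cohen's post or the wartime data -- but it deals hits uniformly across each plane, downs some of the planes hit in critical spots, and then compares the hole pattern across all planes with the pattern on the ones that made it back:

```python
import random
from collections import Counter

# Hypothetical numbers for illustration only -- not the WWII data Cohen describes.
SECTIONS = ["cockpit", "fuel tank", "engine", "wings", "tail", "fuselage"]
CRITICAL = {"cockpit", "fuel tank", "engine"}  # assume a hit here tends to down the plane
P_DOWN_PER_CRITICAL_HIT = 0.6                  # assumed lethality of one critical hit

def fly_mission(hits_per_plane=8):
    """Deal uniformly random hits to one plane; return (survived, hit_locations)."""
    hits = [random.choice(SECTIONS) for _ in range(hits_per_plane)]
    survived = True
    for section in hits:
        if section in CRITICAL and random.random() < P_DOWN_PER_CRITICAL_HIT:
            survived = False
            break
    return survived, hits

def tally(n_planes=10_000):
    """Count bullet holes on all planes versus only the planes that returned."""
    all_hits, surviving_hits = Counter(), Counter()
    for _ in range(n_planes):
        survived, hits = fly_mission()
        all_hits.update(hits)
        if survived:
            surviving_hits.update(hits)
    return all_hits, surviving_hits

if __name__ == "__main__":
    random.seed(0)
    all_hits, surviving_hits = tally()
    print(f"{'Section':<12}  {'all planes':>10}   {'returning planes':>16}")
    for section in SECTIONS:
        print(f"{section:<12}  {all_hits[section]:>10}   {surviving_hits[section]:>16}")
```

Run it and the returning planes show the same illusion the engineers faced: plenty of holes in the wings, tail, and fuselage, very few in the cockpit or fuel tanks, even though the hits were dealt uniformly. The "missing" holes are on the planes that never came home.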
Another thing I appreciate is what Cohen does with this observation, which differs starkly from an approach I have seen recently, once at Slate and once (in a less slick form) at Cracked. Focusing on the first, a review of Daniel Kahneman's Thinking, Fast and Slow, consider what Kahneman prescribes as a cure for the various errors in thinking he discusses in his book:
... Again and again he reminds us that having the means to describe your own bias won't do much to help you overcome it. If we want to enforce rational behavior in society, he argues, then we all need to cooperate. Since it's easier to recognize someone else's errors than our own, we should all be harassing our friends about their poor judgments and making fun of their mistakes. Kahneman thinks we'd be better off in a society of inveterate nags who spout off at the water-cooler like overzealous subscribers to Psychology Today. Each chapter of the book closes with a series of quotes -- many suggested by the author's daughter -- that are supposed to help kick off these enriching conversations: You might snipe to a colleague, for example, that "All she is going by is the halo effect"; or maybe you'd interrupt a meeting to cry out, "Nice example of the affect heuristic," or "Let's not follow the law of small numbers."

How does Kahneman know that knowing about a cognitive bias won't do an individual much good? And why would "we" want to enforce anything in society other than recognition of our own rights? Are we wholly incapable of introspection, and is our own well-being not reason enough to develop better self-awareness? Granted, many -- perhaps most -- people are quite content to muddle along in second-hand fashion, but the practice of indiscriminately hectoring the obdurate strikes me as a waste of my time. In the hands of the well-intended, his advice creates annoyance, and in other hands, it excuses mindlessness by attacking all certainty as mere bias. (See the second article. Our minds working in a certain way is not the same thing as our minds being "programmed".)
Cohen offers advice in what I increasingly think of as the only acceptable way: when it is sought out (including, as in this example, by a reader whose interest Cohen has won and whose mind he has engaged). Why do I think this? First, I disagree with several premises Kahneman seems to hold implicitly: (1) that we are incapable of self-correction; (2) that we have some kind of obligation to indiscriminately correct others; and (3) that we are merely parts of some kind of social collective, and thus at the mercy of any error by anyone else. But there's a still deeper issue here: Someone who isn't seeking advice isn't ready, for whatever reason, to hear it.
Setting aside intellectual sloth (which hectoring usually makes resolute anyway), the reason someone might be unready for good advice is simple: He hasn't the requisite intellectual context to appreciate the need for that advice. For example: He doesn't have the problem you purport to solve at all. He doesn't realize he has the problem you purport to solve. He doesn't know enough about the problem to evaluate whether he has it, or whether your advice is any good. So, aside from the obligation to actually have good advice to offer, if you care enough about that advice to broadcast it, you have to take the cognitive context of your potential audience into account. This means, to name a couple of extreme examples I have personally encountered, neither patronizing your audience nor presenting your advice so poorly that it sounds ridiculous.
Although mistakes like the above are not, per se, proof that unsolicited, poorly-presented advice is bad, they raise my hackles. Why, by someone who claims to want to persuade me, is my mind being treated like an obstacle to be overcome, rather than as a potential ally? Isn't advice supposed to enhance my grasp of reality, rather than replace it?
11-5-11: Taking another look at the John Cook post on the bomber data, I see that he, in fact, referred to this kind of error as selection bias.