Advice vs. (Survivor) Bias

Thursday, November 03, 2011

Over at A Smart Bear is an interesting post by Jason Cohen about a phenomenon I hadn't seen labeled before, although I once blogged about the example he opens with (scroll down to Item 4). Cohen calls the phenomenon "survivor bias", but Wikipedia calls it "survivorship bias". In a nutshell, Cohen says that if you're learning only from the successful, you may be missing out, although your error is certainly understandable:

Do you read business blogs where the author has failed three times without success?

No, because you want to learn from success, not hear about "lessons learned" from a guy who hasn't yet learned those lessons himself.

However, the fact that you are learning only from success is a deeper problem than you imagine.

Some stories will expose the enormity of this fallacy. [bold in original]
Cohen then gives us a clear example:
During World War II the English sent daily bombing raids into Germany. Many planes never returned; those that did were often riddled with bullet holes from anti-air machine guns and German fighters.

Wanting to improve the odds of getting a crew home alive, English engineers studied the locations of the bullet holes. Where the planes were hit most, they reasoned, is where they should attach heavy armor plating. Sure enough, a pattern emerged: Bullets clustered on the wings, tail, and rear gunner's station. Few bullets were found in the main cockpit or fuel tanks.

The logical conclusion is that they should add armor plating to the spots that get hit most often by bullets. But that’s wrong.

Planes with bullets in the cockpit or fuel tanks didn't make it home; the bullet holes in returning planes were "found" in places that were by definition relatively benign. The real data is in the planes that were shot down, not the ones that survived. [bold added]
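The selection effect in the bomber story is easy to reproduce with a toy Monte Carlo simulation. Everything below is invented for illustration -- the section names, hit counts, and lethality numbers are assumptions, not historical data -- but it shows how holes in the lethal spots all but vanish from the surviving sample even though every section is hit equally often:

```python
import random

random.seed(0)

# Hypothetical plane sections and the chance that a single hit there
# downs the plane. These numbers are invented for illustration.
LETHALITY = {"wings": 0.05, "tail": 0.05, "rear gun": 0.05,
             "cockpit": 0.70, "fuel tank": 0.70}
SECTIONS = list(LETHALITY)

def fly_mission(num_hits=5):
    """Simulate one plane taking random hits; return (survived, hits)."""
    hits = [random.choice(SECTIONS) for _ in range(num_hits)]
    survived = all(random.random() > LETHALITY[s] for s in hits)
    return survived, hits

all_hits = {s: 0 for s in SECTIONS}       # holes across the whole fleet
survivor_hits = {s: 0 for s in SECTIONS}  # holes the engineers get to see
for _ in range(100_000):
    survived, hits = fly_mission()
    for s in hits:
        all_hits[s] += 1
        if survived:
            survivor_hits[s] += 1

# Among survivors, cockpit and fuel-tank holes are scarce even though
# every section takes roughly the same number of hits overall: the
# planes hit there mostly went down and never entered the sample.
for s in SECTIONS:
    print(f"{s:9s} fleet-wide: {all_hits[s]:6d}  survivors: {survivor_hits[s]:6d}")
```

If you studied only the survivor counts, you would "armor" the wings and tail -- exactly the backwards conclusion the story warns against, since the fleet-wide counts show the cockpit and fuel tank were hit just as often.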
That's a neat story, but one thing I like about Cohen's post is that he looks for -- and finds -- plenty of other examples of this type of failure to see the full context of a problem, drawn from science, pseudo-science, and business advice.

Another thing I appreciate is what Cohen does with this observation, which differs starkly from an approach I have seen recently, once at Slate and once (in a less slick form) at Cracked. Focusing on the first, a review of Daniel Kahneman's Thinking, Fast and Slow, consider what Kahneman prescribes as a cure for the various errors in thinking he discusses in his book:
... Again and again he reminds us that having the means to describe your own bias won't do much to help you overcome it. If we want to enforce rational behavior in society, he argues, then we all need to cooperate. Since it's easier to recognize someone else's errors than our own, we should all be harassing our friends about their poor judgments and making fun of their mistakes. Kahneman thinks we'd be better off in a society of inveterate nags who spout off at the water-cooler like overzealous subscribers to Psychology Today. Each chapter of the book closes with a series of quotes -- many suggested by the author's daughter -- that are supposed to help kick off these enriching conversations: You might snipe to a colleague, for example, that "All she is going by is the halo effect"; or maybe you'd interrupt a meeting to cry out, "Nice example of the affect heuristic," or "Let's not follow the law of small numbers."
How does Kahneman know that knowing about a cognitive bias won't do an individual much good? And why would "we" want to enforce anything in society other than recognition of our own rights? Are we wholly incapable of introspection, and is our own well-being not reason enough to develop better self-awareness? Granted, many -- perhaps most -- people are quite content to muddle along in a second-hand fashion, but the practice of indiscriminately hectoring the obdurate strikes me as a waste of my time. In the hands of the well-intended, his advice creates annoyance, and in other hands, it excuses mindlessness by attacking all certainty as mere bias. (See the second article. Our minds working in a certain way is not the same thing as our minds being "programmed".)

Cohen offers advice in what I am increasingly beginning to think of as the only acceptable way: when it is sought out (including, as in this example, by a reader whose interest Cohen has succeeded in winning, and whose mind he has engaged). Why do I think this? First, I disagree with several premises Kahneman seems to hold implicitly: (1) that we are incapable of self-correction; (2) that we have some kind of obligation to indiscriminately correct others; and (3) that we are merely parts of some kind of social collective, and thus at the mercy of any error by anyone else. But there's still a deeper issue here: Someone who isn't seeking advice isn't ready, for whatever reason, to hear it.

Setting aside intellectual sloth (which hectoring usually makes resolute anyway), the reason someone might be unready for good advice is simple: He hasn't the requisite intellectual context to appreciate the need for that advice. For example: He doesn't have the problem you purport to solve, at all. He doesn't realize he has the problem you purport to solve. He doesn't know enough about the problem you hope to solve to even evaluate whether he has the problem or whether your advice is any good. Aside from the obligation to actually have good advice to offer, if you care enough about it to broadcast it, you have to take into account the cognitive context of your potential audience. This means, for a couple of extreme examples I have personally encountered, not patronizing your audience or presenting your advice so poorly that it sounds ridiculous.

Although mistakes like the above are not, per se, proof that unsolicited, poorly-presented advice is bad, they raise my hackles. Why, to someone who claims to want to persuade me, is my mind being treated like an obstacle to be overcome, rather than a potential ally? Isn't advice supposed to enhance my grasp of reality, rather than replace it?

-- CAV


11-5-11: Taking a look again at the John Cook post on the bomber data, I see that, in fact, he referred to this kind of error as selection bias.


Snedcat said...

Yo, Gus, you write, "How does Kahneman know that knowing about a cognitive bias won't do an individual much good?" By coincidence, I ran across a remarkably similar book review over at the AV Club. Above all, it peddles as well as praises the same sickly tone as Kahneman's; Kahneman himself got mentioned in the comments. One commenter asked an interesting question: "In the last couple of years I have read their work repeatedly interpreted, so why are they suddenly in vogue? Most of the studies cited were done in the eighties[.]" Indeed--it seems to be the latest fad of the pseudointelligentsia to attack a strawman version of homo economicus to discredit the efficacy of reason, and thereby to call for greater government planning. (Any wonder then that Kahneman won the Nobel Prize?)

Gus Van Horn said...

That book title in the AVC review reminds me of a fictitious book in an Ayn Rand novel -- *Why You Think You Think* -- which makes sense on many levels, given how much else lately also seems like it could be in an AR novel...

Katrina said...

I think there'd be a lot of benefits to the "heckling" method. For one thing, it hones your own reasoning skills to analyze and dissect the reasoning of others. For another, you're benefited by improved reasoning skills in those you interact with. I think the first benefit alone makes it worth it. For the second benefit, I'm imagining a world where my coworkers become better thinkers every day... Yeah, I want to go to there.

Gus Van Horn said...

(1) You don't have to annoy other people: Just analyze their foolishness for yourself, and say nothing, unless it is actually relevant to your work to correct someone.

(2) A better approach is to help people see that they have a problem or why they ought to correct it, rather than just slamming them out of the blue, which, depending on how out-of-context your hectoring is, might as well be arbitrary.

I am not saying, "Never challenge anyone": I am saying to do so in a way that is likely to actually engage and change their minds.

kelleyn said...

Advocates and practitioners of the hectoring method would, of course, be prone to the same effects, biases, and so forth as their targets, and no less so when doing the hectoring. This would inevitably lead to people being called out gratuitously and unfairly. Either the helpful hectors are setting themselves above their poor biased neighbors, or they're condoning a culture in which unjust ridicule can be slung about with impunity. This supports my suspicion of what mentalities such as Kahneman's and McRaney's are really after: the second-handed smugness of unearned intellectual superiority.

Gus Van Horn said...

Exactly -- and they get to preen about being "unbiased" while they're doing it, too.

Jennifer Snow said...

Unsolicited advice is the bane of my existence, although it can be pretty amusing after I've gotten past being furious at people for their presumption. Why amusing? It's funny to see the ridiculous things people believe about me--even people who theoretically "ought" to know me pretty well.

My favorite was my grandmother advising me to get a substitute teaching license. I *hate* children. Might as well advise me to put my hand in a meat grinder for a living.

In addition, pointing out that someone has a bias doesn't necessarily inform them of how they should fix their *problem*. The would-be advisor simply gets to sound smart while actually not making any useful contribution.

Snedcat said...

Yo, Gus, I agree completely about hectoring or heckling types, since I've worked with one or two. In a nutshell, "I think there'd be a lot of benefits to the 'heckling' method... you're benefited by improved reasoning skills in those you interact with" is blind innocence at best, and almost always comes across instead as an expression of that Ineffective Daily Affirmation, "I will willingly share my experience and wisdom with my fellows, for there are no sweeter words than 'I told you so.'"

The "heckling" method assumes at best that any such disagreement is due purely to errors in reasoning by an intellectual inferior who happens to share every single one of your own premises. In practice, I find the coworkers who do indulge in the "heckling method" are browbeating busy-body know-nothing know-it-alls who fail to see their own urgent need for remedial education on the most basic level--frequently combined with a politely smiling backstabbing duplicity that merits firing, not respect.

For example: the book store coworker who would hector customers by correcting their true statements with ridiculous falsehoods--for instance, lecturing a professor of English literature that George Sand was an Englishwoman, not George Eliot, who was (believe it or not!) born in Australia or maybe New Zealand but migrated to England in her 20s (which makes one wonder what amusing or startling revelations she'd offer the world about Katherine Mansfield). She was also given to lying about her coworkers to get them fired and herself appointed assistant manager when she wasn't berating them for essentially doing a better job than she would ever be capable of. (Such as the time I filled in on very short notice after a glass of wine with dinner, which she told the owner was me coming in to work drunk--and when I wasn't fired she told this to other local businessmen. Or the time she hovered over my shoulder as I rang up a customer and was such a drooling, blithering idiot that she didn't realize that you most easily give a 10% discount by multiplying the subtotal by 0.9--note too that she didn't tell management this but instead followed the customer into the lobby, checked their receipt, and told them to complain to the manager about me...she didn't like it when the customer called management all right, but only to complain about her abysmal math skills: "Is this the sort of innumeracy you want in a cashier?") Natch, after she was fired she soon went to work for city government, where she has risen steadily through the ranks.

So no, heckling coworkers will only get their heads chewed off if they try it on me, since I've never seen a case in which the heckler actually has anything like the cognitive abilities, work experience, intelligence, common sense or mother wit to actually correct my supposed failings. In my experience, heckling like that exclusively indicates serious character failings, not the second coming of Howard Roark. (Indeed, imagine Howard Roark heckling his coworkers. Rather far out of character, wouldn't you say?) I simply do not suffer fools, gladly or otherwise--but I don't heckle them either.

Gus Van Horn said...


I know exactly what you mean by how amusing what hecklers presume to know about me can be. Even more amusing is the fact that, by such presumption, they unwittingly tell me that their brilliant advice, founded on epistemological quicksand, can be safely ignored.


Good point. A Howard Roark or a John Galt would have far better things to do than nag anyone who stumbled into earshot.


Steve D said...

"The real data is in the planes that were shot down, not the ones that survived."

Actually, the real data IS in the planes that survived. It just was not interpreted properly.

Gus Van Horn said...

If you concede that the shot-down planes were crucial to correctly interpreting the data from the surviving planes, I'll concede the point!

Steve D said...

Of course, one can sit back in an armchair and imagine how the shot down planes might look...
These types of stories always leave me flabbergasted. These were not stupid people who made that very poor deduction! To test this, I brought the story up with a few coworkers, and no one missed the point. Of course, they had the advantage of seeing my body language, which probably indicated I was testing them--not a double-blind test by any stretch.
So why do very smart people draw very stupid conclusions? (That would be a good topic for a blog post.)