Measuring Self-Awareness

Thursday, May 24, 2012

Over at Slate is an interesting interview with Dylan Evans, who has coined the term "risk intelligence", which really turns out to be a kind of self-awareness regarding what one does and does not know.

[Risk intelligence] is the ability to estimate probabilities accurately, it's about having the right amount of certainty to make educated guesses. That's the simple definition. But this apparently simple skill turns out to be quite complex. It ends up being a rather deep thing about how to work on the basis of limited information and cope with an uncertain world, about knowing yourself and your limitations.
In my limited exposure to the concept, it seems misnamed, and I would guess that is because the author does not regard absolute certainty as possible, for philosophical reasons.

Intrigued that someone seemed to be attempting to measure a quality I have often found lacking in others, I took the test. Here is the write-up of my results.
The RQ score ranges from 0 (low RQ) to 100 (high RQ). Your score is 79.57. Such a score is high. Risk intelligence can be measured by calculating something called a "calibration curve". The red line displayed to your right is your calibration curve. A perfect calibration curve would lie exactly on the blue diagonal line, so the area between the curve and the diagonal would be zero. Nobody is perfectly calibrated, but people with high risk intelligence come very close to this ideal.

By now, you may have realized there is an easy way to game this test. If you always select the 50% category unless you are pretty certain that a statement is true or false - and if the test contains equal numbers of true and false statements - you will score very highly, perhaps very near 100.
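To make the scoring mechanics concrete, here is a rough sketch of how a calibration-based score like this might be computed, and of why the hedging strategy above works. The actual formula Projection Point uses is not given in the write-up, so the probability buckets and the scaling from "area between the curves" to a 0-100 number in the sketch below are only assumptions made for illustration.

```python
# Rough sketch of a calibration-based "RQ"-style score. The exact scoring rule
# used by Projection Point is not published in the post, so the final scaling
# (100 minus a multiple of the mean deviation) is only a guess.

from collections import defaultdict

def calibration_curve(answers):
    """answers: list of (stated_probability, statement_was_true) pairs,
    with probabilities given in percent buckets (0, 10, ..., 100)."""
    buckets = defaultdict(list)
    for prob, was_true in answers:
        buckets[prob].append(1.0 if was_true else 0.0)
    # Observed frequency of true statements within each confidence bucket.
    return {prob: sum(vals) / len(vals) for prob, vals in sorted(buckets.items())}

def rq_score(answers):
    curve = calibration_curve(answers)
    # Mean absolute gap, in percentage points, between stated probability and
    # observed frequency (0 = perfectly calibrated, i.e., curve on the diagonal).
    gap = sum(abs(p - 100.0 * freq) for p, freq in curve.items()) / len(curve)
    return max(0.0, 100.0 - 2.0 * gap)  # the 2x scaling is an assumption

# The "gaming" strategy from the post: answer 50% unless nearly certain.
# With equal numbers of true and false statements, the 50% bucket comes out
# right about half the time, and the 0%/100% answers are (nearly) always
# right, so the curve sits on the diagonal and the score lands at or near 100.
hedged = [(50, True), (50, False)] * 20 + [(100, True)] * 5 + [(0, False)] * 5
print(rq_score(hedged))  # prints 100.0 for this contrived set of answers
```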
What did I find myself doing during the test? I considered why I agreed or disagreed with a given statement -- or, for the questions I realized I knew nothing about, why I had no basis for doing so. In doing so, I usually ended up with some rough estimate of how confident I was in my (dis)agreement, as well as of what else I would need to know to be more sure one way or the other. This was really an attempt to use a percentage as a metaphor for my level of knowledge about a topic, rather than as an actual estimate of a probability.

Based on past experience, I think it is possible for people with this kind of self-awareness to appear to be less confident than they actually are. This can happen when one communicates his level of uncertainty poorly (or too conservatively) or when one is dealing with someone who lacks this form of self-awareness (and hence equates certainty with confidence or views admitted uncertainty with suspicion). I can even recall doing the former while in the latter situation quite a few times when I was younger. (This ultimately culminated in my being ignored after correcting someone I absolutely knew to be wrong! Fortunately, the stakes were low.)

-- CAV

4 comments:

Vigilis said...

Interesting, Gus. The concept of "risk intelligence" can be relatively simple or futile, depending upon an individual's (or a team's) grasp of the constants and variables entailed.

The reasons should be simple enough for almost anyone to understand, and have been stated so simply that they were instantly ridiculed by the media intelligentsia and naturally ignored by the public:
"...because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns -- the ones we don't know we don't know. And if one looks throughout the history of our country and other free countries, it is the latter category that tend to be the difficult ones." - Donald H. Rumsfeld, Department of Defense news briefing, February 12, 2002

An excellent example is Facebook's recent IPO. No insider knowledge whatsoever was required for astute investors to shun that investment. Some infatuated FB users with a poor understanding of investment fundamentals were motivated solely by greed. Yet these same folk might do better than others at predicting outcomes in an area with which they are intimately familiar.

Gus Van Horn said...

What you call greed, I'd call magical thinking. You remind me that there's a decent article about why the IPO didn't go so well.

Steve D said...

That was an interesting article. Two statements in particular intrigued me:
"(Mass social media) ... is an inherent contradiction ..." and "Would I trust Facebook to keep these confidences? Never."

These are, in fact, my main reasons not to use Facebook. It doesn't seem to offer anything more than other means of communication, and it has the potential to waste a whole lot of time -- still, I have to admit, many people love it. ResearchGate and LinkedIn are much more focused 'social media' and therefore more useful.

Gus Van Horn said...

Those are pretty similar to my reasons for not bothering with Facebook. The fact that it looked like a huge time suck pretty much killed it for me even BEFORE the baby came along.