No Nudge Required

Tuesday, August 31, 2010

Going through a small backlog of HBL installments yesterday, I found a John Stossel piece that makes explicit a point I didn't quite make in Friday's post. There, I merely noted that the post-Katrina school system in New Orleans probably has valuable lessons to teach advocates of capitalism. I'll borrow Stossel's wording to summarize that lesson:

Contracting out to private enterprise isn't the same thing as letting fully competitive free markets operate, but it still works better than government.

This is a mouthful, but saying it avoids the pitfall I noted long ago of misapplying the label "privatization" to such situations. He even discusses the same example I once did -- Indiana's privately operated toll roads.

The Stossel piece is also noteworthy for providing more evidence of the practicality of privatizing roads, including the fact that such roads would be safer for a variety of reasons. One that surprised even me was that, in some cases, this could be the result of less signage and fewer of the safety features that governments now indiscriminately add to roads:
It's Friedrich Hayek's "spontaneous" order in action: Instead of sitting at a mechanized light waiting to be told when to go, drivers meet in an intersection and negotiate their way through by making eye contact and gesturing. The secret is that drivers must pay attention to their surroundings -- to pedestrians and other cars -- rather than just to signs and signals. It demonstrates the "Peltzman Effect" (named after retired University of Chicago economist Sam Peltzman): People tend to behave more recklessly when their sense of safety is increased. By removing signs, lights and barriers, drivers feel less safe, so they drive more carefully. They pay more attention.

Stossel cites two examples of exactly this occurring where it was tried in Europe. Of course, there are also cases of safety features that take advantage of quirks in human perception and are known to work. Private companies would be free to implement them -- but they would, thanks to the profit motive, be more attuned to whether a given measure (or none at all) is what a given situation calls for.

Contra libertarian paternalists like Richard Thaler and Cass Sunstein, there is no role for government trickery in achieving public safety. The promise of greater profits and the desire to remain alive will "nudge" companies and individuals in such a direction, and the almost-forgotten practice of taking charge of one's own welfare will make both good at it.

-- CAV

8 comments:

Steve D said...

“One that surprised even me was that, in some cases, this could be the result of less signage and fewer of the safety features that governments now indiscriminately add to roads:”

I think a similar effect was observed for motorcycle helmet laws, where they found that while they reduced the number of head injuries, the number of other injuries went up. A perfect example of the Peltzman effect.

There is another related phenomenon known in sociology where too much attention to some particular idea actually lowers the attention people pay to it. (I don't remember what this is called.) So, for example, during campaigns to reduce smoking it was observed that there was an optimal amount of negative advertising, after which levels of smoking actually went up again. I see this with the obsession about safety at my work where at some point people begin to tune out the constant preaching and actually pay less attention to safety.

More generally, we are a rule-obsessed society, private or government. We make rules for the sake of making rules, often with little thought as to what negative consequences might ensue. In many cases, rules which do not make sense or are never explained are put in lists with rules which are probably useful (a package deal). We often make rules just in case of bad behavior, which essentially sends the message that we don't expect people to behave well. Another case is the assumption of problems which do not exist -- and the time and resources which go into trying to solve them.

Is this an example of the malevolent universe premise, or are people simply projecting their desired behavior onto others?

Gus Van Horn said...

"I think a similar effect was observed for motorcycle helmet laws where they found that while they reduced the number of head injuries the number of other injuries went up."

Humorous aside: What a dilemma for the Sunsteins of the world! Fewer head injuries -- or more organ donations?

"I see this with the obsession about safety at my work where at some point people begin to tune out the constant preaching and actually pay less attention to safety."

I call it "warning fatigue" and once fell "victim" to it myself.

"[A]t some point people begin to tune out the constant preaching and actually pay less attention to safety."

I see the tuning out of the message as a phenomenon akin to sensory adaptation, in which one adjusts to a constant "background" level of a cognitive input that fails to convey new or useful information by "zeroing it out."

But are people really paying less attention to safety in every case? Some probably are safer (or are less safe if, say, new rules seem reasonable, but have some undetected flaw), some never needed to be told to act safely (or be educated about safe practices), and some never really got the message for whatever reason (for any combination of message effectiveness, attention, and mental acuity) or chose to ignore it to begin with.

"Is this an example of the malevolent universe premise or our people simply projecting their desired behavior on others?"

I see rule obsession (and the desire to control the behavior of others) as often being a symptom of a malevolent universe premise. That isn't to say that there aren't (or can't be) general rules or principles for safety practices, or that one should never be concerned about how others behave, but at a certain point, it does seem to me that such things are something that individuals should feel motivated to learn about on their own. It strikes me as very odd (not to mention highly suspicious) when people crusade for such rules above and beyond what their context calls for.

Gus

Mo said...

I noticed this with alcohol a lot as well. Sometimes the nannies will even ignore research to the contrary and just stick to their recycled dogmas. Take this little gem from a research professor at our beloved university:

“The public needs to know what low-risk drinking is. The government should be telling us all, using the best science possible.”

Gus Van Horn said...

Nice. Presumably, by "low risk drinking," he means, "teetotaling." Oddly, I seem to recall something in the press just this morning to the effect that even too much drinking was better, at least in terms of longevity, than none at all, on average.

Andrew Dalton said...

The central folly of our risk-obsessed culture (aside from the government coercion that comes along with it) is that it flouts the cognitive requirements of the human mind.

First, there is the crow epistemology. People cannot be expected to keep track of risk-du-jour #114 on top of all of the other risks they've been told to care about. Second, there is the hierarchy of values, which limits our motivation (thankfully!) to care about more than a handful of these risks at any given time.

On top of that, add people's widespread understanding (more accurate than "experts" will admit) that many of these fashionable risks are lies or exaggerations -- and you have the perfect recipe for people not giving a damn.

Gus Van Horn said...

I completely agree -- and like that graphic.

I think the phenomenon you cite is one kind of reaction to the dishonest tactics employed by certain hysteria-fanning "experts" of George Monbiot's ilk. They know that we can't all be experts on everything, but rather than put their theories out there for others to evaluate on their own time and in line with their rational priorities, they try to scare people into accepting their say-so as trusted experts.

Some will sense this for what it is on a sense-of-life level and, rather than forfeiting their independence, they will understandably choose to dismiss whatever such "experts" say, taking the high-pressure tactics as a prima facie sign that something is amiss.

On the other side of the same coin, some will generously mistake urgency for sincerity, which will become for them a prima facie reason to TRUST such a self-proclaimed expert.

Both kinds of reactions can result in people improperly assessing whether something constitutes new factual information and, if so, its actual value. (And this can occur whether a given piece of advice is valid or not, or comes from a real expert or not.) And both illustrate the paramount responsibility of those who have such specialized knowledge to (1) know what they're talking about, and (2) present it in a way that does not effectively demand acceptance on faith.

Steve D said...

“But are people really paying less attention to safety in every case?”

There is a difference, of course, between formulating new rules, especially those which have some semblance of being reasonable, and just reiterating the same rules over and over again. As in many other areas, different people will react quite differently.

“That isn't to say that there aren't (or can't be) general rules or principles for safety practices”

Of course, safety is just one example. One organization, for instance, sets rules for who can be in what role in order to try to 'encourage' more people to participate, when all it does is create problems assigning positions, and in the end the same people end up doing all the work anyway. In other cases, different governments make rules which are at cross purposes. Etcetera.

” The public needs to know what low-risk drinking is.”

Hmm… This sentence can be interpreted two ways.

“The government should be telling us all, using the best science possible”

It would be even better if the government simply allowed the alcohol manufacturers to tell us the truth. I can see it now. Warning: The surgeon general has determined that moderate consumption of this beverage may improve your health. Or better: Warning: The surgeon general has generously consented to let us tell you that science has determined that moderate consumption of this beverage may improve your health.

“they will understandably choose to dismiss whatever such "experts" say”

Unfortunately even those few experts who really care correct.

“central folly of our risk-obsessed culture”

I would agree. This applies as well to our rule-obsessed culture. What is the central message of all these irrational (and semi-rational) rules? - that we can’t think and need them to behave properly.

Gus Van Horn said...

"Unfortunately even those few experts who really [are] correct."

This is an occasional tragedy of "warning fatigue": when someone is actually right, but gets the reception of a Chicken Little.

"What is the central message of all these irrational (and semi-rational) rules? - that we can't think and need them to behave properly."

To a limited degree -- for people crippled by pragmatism, to whatever extent they are -- this is true. Such rules end up being used in the stead of actual principles, in many cases.