Tuesday, February 12, 2013
Reader Snedcat sends me a link to an article by Henry I. Miller of the Hoover
Institution about government funding of bad science. I think that the government's role in funding
scientific research should be far more limited than Miller does, but I do agree
that, if the government is going to fund scientific research at all, it should
at least make sure that what is being funded is science.
If, for the sake of argument, we accept the premise that the government should be funding scientific research, we can see that the article offers value on several fronts.
First, the piece takes a needed look at both indiscriminate critics who are too ready to condemn practically all research that receives government funding and Pollyannas who are too ready to sweep under the rug the real problem of government waste in the name of science. The former category is epitomized by Sarah Palin, who "didn't know what she didn't know" when she ridiculed fruit fly research. The latter category includes a group of congressmen who present "Golden Goose Awards" to scientists whose "valuable federally funded research project may sound funny, but [whose] purpose is no laughing matter." These politicians are guilty of a formal fallacy, asserting the consequent.
The Golden Goose Award makes use of a formal fallacy, a pattern of reasoning that is illogical and wrong, called "asserting the consequent." It takes the form of: "If A, then B. B, therefore, A." An example would be: "If Warren Buffett owned the British Crown Jewels, he would be rich. Buffett is rich; therefore, he owns the Crown Jewels." The rationale for the award seems to be, "Some criticism of federally funded research projects has been uninformed and ill-advised. People continue to criticize federally funded projects; therefore, their views are uninformed and ill-advised."

Regarding any criticism of government funding of science, it is important to realize that, while the government has not yet succeeded in destroying science, it is, in fact, (and among other things) spending money that could be used more productively on things ranging from duplicated effort to pseudoscience and even fraud.
Second, the piece gives several reasons why the government can't be in charge of overall scientific funding, although I think the author would call them something like "obstacles to good funding policy". (Miller, to my knowledge, doesn't advocate getting the government almost entirely out of science, as I do.) Some of these problems are cultural and educational, like the surprising degree of ignorance among the general public about such matters as the Earth revolving around the Sun each year, or what a molecule is. Others are more intractable, such as the problem of "rational ignorance":
There's a good reason that people generally are not science and technology savvy -- a phenomenon that has been dubbed "rational ignorance," which comes into play when the cost of sufficiently informing oneself about an issue to make an informed decision on it outweighs any potential benefit one could reasonably expect from that decision. Citizens occupied with the concerns of daily living -- families, jobs, health -- may not consider it to be cost-effective to study the potential risks and benefits of genetic engineering or nanotechnology.

If that sounds familiar, I discussed it here (although not specifically about science) years ago, but I never had a good name for the phenomenon.
Finally, the article provides examples. I'll quote my favorite, which is about what you'd expect when breeding the welfare state with the wisdom of the crowd:
Some of the projects funded by NSF are less flagrant but real examples of waste or abuse. For example, the agency has funded a series of "citizens technology forums," at which previously uninformed, ordinary Americans were brought together to solve a thorny question of technology policy.

Miller doesn't say this, so I will: Since the government can force you to abide by its policies, even when they are wrong, to formulate policy this way would be like being accosted by someone with a gun on your way to the doctor's office for a medical consultation -- and being made to listen to (and follow!) the medical opinion of some waitress, instead.
Participants were informed by "a 61-page background document -- vetted by experts -- to read prior to deliberating." (The experts once again reflected the viewpoints of the organizers, no doubt.) They produced a hodgepodge of conclusions and recommendations, including "concern over the effectiveness of regulations" and "reduced certainty about the benefits of human enhancement technologies" but wanted "the government to guarantee access to them if they prove too expensive for the average American." (Surprise! The participants didn't understand the risks and benefits of the new technology but wanted the government to provide them with entitlements so they could avail themselves of the products of nanotechnology!)
The output of the citizens' technology forums illustrates that such undertakings have limitations in both theory and practice; nonexperts are too often subject to their own prejudices and to the specific choice of background materials and the advocates to whom they are exposed. Both of these groups yielded just what one would expect: opinions that were based on a slanted and incomplete understanding of the subject.
Getting policy recommendations on obscure and complex technical questions from groups of citizen nonexperts is like going from your cardiologist's office to a café, explaining to the waitress the therapeutic options for your chest pain, and asking her whether you should have the angioplasty or just take medication. [bold added]
In addition to examples of bad science getting funds, Miller provides some dollar figures for the money the government is most obviously misallocating.
Miller's article is quite valuable -- arguably more so than he himself realizes.