Current issues of ACP Journal Club are published in Annals of Internal Medicine


Letter

Cost-effectiveness analysis: are the outputs worth the inputs?

ACP J Club. 1996 May-June;124:82. doi:10.7326/ACPJC-1996-124-3-082



To the Editor

I enjoyed Dr. Naylor's comments on cost-effectiveness analysis (1). As a shameless proselytizer for what he has termed the “religion” of cost-effectiveness, however, I feel morally compelled to defend the honor of my chosen faith. Dr. Naylor's 7 warnings fall into 2 categories. The first is that economic research is often poorly done. The second category includes a litany of the well-known, unresolved, fundamental methodologic problems inherent in economic evaluation. In response to these 7 criticisms, I offer the 4 Spiritual Laws of Clinical Research:

1. Poorly done research is the rule, not the exception. This characterization applies to every type of clinical and health services research, although perhaps especially to economic evaluation. As moral philosophers remind us, however, “is” is a different fish than “ought.” We generally do not draw the inference that because randomized trials often are poorly done, they ought not be done at all.

2. Outcomes that matter to patients are more difficult to measure than biological and clinical events. Quality of life is more difficult to measure than cholesterol or reperfusion rates. Discerning what human beings value and how they make choices (the domain of economics) is even more difficult. Because economists are, by inclination or training, not empirically oriented, the measurement task is further complicated, although not rendered less important. Small wonder, then, that methodologic controversy abounds.

3. The importance of a question is proportional to its generality but inversely proportional to the ease with which it can be answered. Whether immunizing infants against hepatitis B produces a good immunologic response is easily answered but not very important on its own. Whether we ought to launch a nationwide campaign of infant vaccination involves additional issues of effectiveness, cost, quality of life, risk attitude, attitudes toward intragenerational and intergenerational resource allocation, and logistics. An attempt to combine some or most of these parameters into a single index is a difficult methodologic task but has the potential to dramatically simplify the cognitive burden of decision making.

4. Paradigms shift, problems stay. Scientific paradigms change with time. It may be that the utilitarian underpinnings of conventional economic evaluation will eventually doom it to be rejected in favor of a “justice-based” or other approach. The issues of unlimited demand, limited resources, and related ideas such as “value for money,” however, probably will not go away in our lifetime. For all their warts, current approaches do offer some useful direction in approaching these issues.

Economic evaluation is an immature and, at present, insufficiently empirical discipline. But as pilgrims on the path to wisdom in the allocation of our dwindling health care resources, we are ill advised to cast aside any source of illumination, however imperfect and unsteady.

Murray Krahn, MD, MSc
The Toronto Hospital, University of Toronto
Toronto, Ontario, Canada

In response

I am grateful to Dr. Krahn for offering his 4 Spiritual Laws of Clinical Research to those who seek absolution from the 7 Deadly Sins of Cost-effectiveness Analysis outlined in my editorial. Point-by-point responses follow.

1. I am more optimistic than my colleague about the state of clinical research. But if he is right, and poorly done clinical research is indeed the rule, consider the consequences for economic analysis. Poorly done clinical research leads to biased or imprecise assessments of how a given technology affects patients' health status. Cost-effectiveness analysis basically takes these “facts,” adds “values” in the form of costs and patients' preferences that may also be mismeasured, extrapolates or interpolates as needed to make up for absent empirical data, and produces a set of ratios that are supposed to guide decision making.

2. I agree fully with Dr. Krahn's second point on the difficulties of measuring what human beings value and understanding how they make choices. This difficulty surely argues, however, for caution in interpreting cost-effectiveness analyses—which was the central theme of my editorial.

3. Dr. Krahn's third law is elegant and incisive, but the ensuing argument is almost tautological. He suggests that since important health care issues are complex, we should embrace cost-effectiveness analysis because it distills many inputs into a single index with “the potential to dramatically simplify the cognitive burden of decision making.” Here is the seductive reductionism of cost-effectiveness analysis in a nutshell. I would rather take my cue from the aphorism commonly attributed to H.L. Mencken: “For every very complex question, there is a very simple answer, and it is dead wrong.”

4. Agreed, especially as regards the dermatology of cost-effectiveness analysis.

Despite his initial self-characterization as a “shameless proselytizer,” Dr. Krahn closes with remarks that bespeak a pilgrim who seeks empirical enlightenment and rejects much of the present dogma surrounding cost-effectiveness analysis. I applaud his near-heresies and remain unrepentant in my apostasy.

C. David Naylor, MD, DPhil
Institute for Clinical Evaluative Sciences
Toronto, Ontario, Canada


Reference

1. Naylor CD. Cost-effectiveness analysis: are the outputs worth the inputs? ACP J Club. 1996 Jan-Feb;124:A12-4.