PMD17: ARE PUBLISHED COST-UTILITY ANALYSES IMPROVING?


QALY, and $100,000/QALY. The proportions of sensitivity analyses reporting quantitative results that crossed the threshold above the base case CU ratio were 23% for cost sensitivity analyses, 38% for quality-of-life sensitivity analyses, and 15% for discount rate sensitivity analyses. There was no difference in quality ratings between CUAs that reported sensitivity analysis results that exceeded the thresholds (N = 17) and those that did not, but the overall quality and completeness ratings were only moderate. CONCLUSIONS: Sensitivity analyses for economic parameters are widely reported and can be used to identify whether choosing different assumptions leads to a different decision. Different decisions occur more frequently for cost and quality-of-life assumptions than for discount rate assumptions. Sensitivity analyses for cost and quality-of-life parameters should be used to test alternative guideline recommendations, but sensitivity analyses for discount rates do not have the same import. Adhering to recommendations on performing cost-effectiveness analyses would improve the overall quality of these types of studies.
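The threshold-crossing check described above can be sketched in a few lines. The following Python snippet is a hypothetical illustration, not taken from the study: the function names and the example cost and QALY figures are invented, and only the $50,000/QALY and $100,000/QALY benchmarks come from the abstract.

```python
# Hypothetical sketch: does a one-way sensitivity analysis push the
# cost-utility (CU) ratio across a decision threshold above the base case?

THRESHOLDS = [50_000, 100_000]  # $/QALY benchmarks mentioned in the abstract

def cu_ratio(delta_cost, delta_qaly):
    """Incremental cost-utility ratio in $/QALY (illustrative helper)."""
    return delta_cost / delta_qaly

def crosses_threshold(base_ratio, varied_ratio, thresholds=THRESHOLDS):
    """True if any threshold lies between the base-case and varied ratios."""
    lo, hi = sorted((base_ratio, varied_ratio))
    return any(lo <= t <= hi for t in thresholds)

# Invented numbers for illustration only
base = cu_ratio(delta_cost=45_000, delta_qaly=1.0)         # $45,000/QALY base case
cost_varied = cu_ratio(delta_cost=60_000, delta_qaly=1.0)  # cost assumption varied upward
print(crosses_threshold(base, cost_varied))                # True: crosses $50,000/QALY
```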

PMD16

FACILITATING USER INTERACTION WITH PHARMACOECONOMIC MODELS: MODEL-IT® Caro J, Magno J, Ward AJ Caro Research Institute, Concord, MA, USA Although a pharmacoeconomic model is usually created for one setting, there is often interest in using it in other jurisdictions. This requires that it be modifiable by other users, and given the complexity of most models, this may be difficult to do. OBJECTIVE: To develop a low-cost tool that standardizes model inputs and outputs and allows models to be easily edited and analyzed. METHODS: An electronic viewer (MODEL-IT®) was developed as a “container” that allows display of and interaction with disease models. The tool is programmed to work as a stand-alone application in a Windows© environment. It is designed to read any model that has been formatted according to a simple set of rules. The model engine itself can be in any format, including EXCEL. The screens were developed to maintain a consistent format yet be able to display inputs and outcomes pertinent to the specific model. Tool functions are accessed by self-explanatory buttons. RESULTS: MODEL-IT® classifies inputs into specific categories including population characteristics, disease parameters, model controls (e.g. number of replications), treatment details, and costs. All fields are editable. Outcomes are model-specific but are also classified into costs, effectiveness, survival, and cost-effectiveness. Model versions can be saved for later use, and all screens can be printed or exported to other programs. Model documentation can be incorporated as a help file. The MODEL-IT® viewer is available free of charge. CONCLUSIONS: A viewer has been developed to allow users to interact with models in a standard format and to increase interdisciplinary access to and understanding of models in order to support their wider use in decision-making about new pharmaceuticals.
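As a rough sketch of the standardized input/output grouping the abstract describes, the structure below is purely illustrative; MODEL-IT®'s actual file format is not given in the abstract, so every field name and example value here is an assumption.

```python
# Illustrative only: the abstract's input and output categories expressed as simple
# data classes. Field names and example values are assumptions, not MODEL-IT®'s format.
from dataclasses import dataclass, field

@dataclass
class ModelInputs:
    population: dict = field(default_factory=dict)  # e.g. {"mean_age": 65}
    disease: dict = field(default_factory=dict)     # e.g. {"annual_event_rate": 0.08}
    controls: dict = field(default_factory=dict)    # e.g. {"replications": 1000}
    treatment: dict = field(default_factory=dict)   # e.g. {"adherence": 0.9}
    costs: dict = field(default_factory=dict)       # e.g. {"drug_cost_per_year": 1200}

@dataclass
class ModelOutputs:
    costs: float = 0.0               # total costs
    effectiveness: float = 0.0       # e.g. QALYs gained
    survival: float = 0.0            # e.g. life-years
    cost_effectiveness: float = 0.0  # e.g. $/QALY
```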

PMD17

ARE PUBLISHED COST-UTILITY ANALYSES IMPROVING? Neumann PJ1, Olchanski NV1, Rosen AB1, Greenberg D1, Chapman R1, Stone PW2, Nadai J1 1Harvard University, Boston, MA, USA; 2Columbia University, New York, NY, USA OBJECTIVES: Our objectives were to investigate: 1) whether methods and reporting of published cost-utility analyses (CUAs) have improved over time; and 2) whether quality is higher in journals that published more CUAs. METHODS: A systematic search of the English-language medical literature identified 522 original CUAs published from 1976 through 2001. Each study was independently audited by two trained readers for a core set of data elements on study methodology and reporting, and a subjective assessment of overall study quality on a scale from 1 (low) to 7 (high); data are available at http://www.hsph.harvard.edu/cearegistry/. High-volume journals were defined as those publishing 4 or more CUAs from 1976–2001. This study updates our previous analysis, which examined the quality of CUAs from 1976 to 1997. RESULTS: Several key elements improved over time. Comparing the 1998–2001 period (n = 294) to 1976–1997 (n = 228), articles improved in: clearly presenting the study perspective (73% vs. 52%, p < 0.001); performing sensitivity analyses (93% vs. 89%, p = 0.092); discounting both costs and QALYs (82% vs. 72%, p = 0.016); and calculating and reporting incremental ratios (69% vs. 46%, p < 0.001). More studies in the latter period took the societal perspective (30% vs. 23%). The overall quality score improved as well, though the change was not significant (4.25 vs. 4.10, p = 0.19). The proportion of studies disclosing funding sources did not change (64% vs. 65%, p = 0.88). The average quality score was greater in higher- vs. lower-volume journals (4.5 vs. 3.7, p < 0.001). CONCLUSION: Published CUAs have improved over time, though many still omit basic elements. Clinical journals, particularly those with little experience publishing CUAs, need to adopt and enforce standard protocols for conducting and reporting them.
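The abstract does not name the statistical test behind the reported p-values; the snippet below is just one plausible way to reproduce such a comparison, using a two-proportion z-test, with event counts back-calculated (and therefore approximate) from the reported percentages.

```python
# Illustrative re-check of one reported comparison: clearly presenting the study
# perspective, 73% of 294 CUAs (1998-2001) vs. 52% of 228 CUAs (1976-1997).
# The choice of test and the exact counts are assumptions, not taken from the study.
from statsmodels.stats.proportion import proportions_ztest

counts = [round(0.73 * 294), round(0.52 * 228)]  # approx. 215 vs. 119 studies
nobs = [294, 228]

stat, pval = proportions_ztest(counts, nobs)
print(f"z = {stat:.2f}, p = {pval:.2g}")  # p falls well below 0.001, consistent with the abstract
```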

PMD18

ECONOMIC EVALUATION IN CLINICAL TRIALS: CALCULATING HOSPITALIZATION COSTS FROM CLINICAL TRIAL DATA Henk HJ University of Wisconsin, Madison, WI, USA OBJECTIVES: Economic evaluation is increasingly common in clinical trials. Often, individuals’ health care costs are not observed in these trials; rather, health care cost estimates are calculated from observed resource