Primary open-angle glaucoma currently affects at least 2.2 million individuals over the age of 40 in the United States.1 Considering the disproportionate growth of the elderly population, the incidence of glaucoma is expected to rise significantly, with 3 million projected US cases by 2020.2,3

As the population of glaucoma patients increases, so will the associated costs. The total cost of glaucoma-related health care expenditures was estimated at $2.5 billion annually in 1988.4 With the rising number of glaucoma patients, coupled with the vast array of therapeutic modalities and the multitude of diagnostic tools, this figure can be expected to grow dramatically.

Under the looming shadow of health care reform comes a realization of the finite nature of health care funds. Resource allocation will become an increasingly popular buzzword. Specifically, we will have to decide whether a specific resource (eg, one particular diagnostic test) has enough value that we are willing to give up another option. In the very near future, we physicians will be charged with determining not only which tests are clinically relevant but also whether they are cost-effective.

RESEARCH INTO COST-EFFECTIVENESS

As might be expected, identifying cost-effective diagnostic tests is a bit more straightforward in theory than it is in practice. Decisions should be made using high-quality, evidence-based data from well-designed studies. The ideal way to evaluate the cost-effectiveness of a specific diagnostic modality would be to design a randomized clinical trial in which one group undergoes a standard workup (eg, tonometry, photographs, and visual field testing) and the second arm receives the standard workup plus another test (eg, optical coherence tomography). We could subsequently evaluate whether the additional test improves patients' overall outcomes (ie, slowed visual loss) and/or quality of life.5
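A trial of this design is typically summarized with an incremental cost-effectiveness ratio (ICER): the extra cost of the added test divided by the extra health benefit it yields, often expressed in quality-adjusted life-years (QALYs). A minimal sketch of that arithmetic follows; the dollar and QALY figures are entirely hypothetical and are not drawn from any of the cited studies.

```python
# Incremental cost-effectiveness ratio (ICER) for an added diagnostic test.
# All numbers below are invented for illustration only.

def icer(cost_new, cost_standard, qaly_new, qaly_standard):
    """Extra cost per extra quality-adjusted life-year (QALY) gained."""
    return (cost_new - cost_standard) / (qaly_new - qaly_standard)

# Hypothetical: standard workup vs. standard workup plus an imaging test
ratio = icer(cost_new=12_000, cost_standard=10_000,
             qaly_new=8.2, qaly_standard=8.1)
print(f"ICER: ${ratio:,.0f} per QALY gained")
```

A payer would then compare this ratio against a willingness-to-pay threshold to decide whether the added test is worth funding.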

Unfortunately, there is a dearth of clinical studies that evaluate the differential cost-effectiveness of the available glaucoma-related diagnostic technologies. The only applicable study used a Markov cost-effectiveness simulation model to determine whether tonometry should be performed with (1) all initial patients, (2) high-risk patients, or (3) no one during the clinical workup for ocular hypertension and primary open-angle glaucoma. Not surprisingly, the researchers found that this diagnostic modality was indeed a cost-effective practice that should be included in all ophthalmic evaluations.6
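The Markov approach used in that study models a cohort of patients who transition each year among disease states, accumulating costs along the way. The sketch below illustrates the general technique only; the states, transition probabilities, and annual costs are hypothetical and do not reproduce the cited study's model.

```python
# Generic Markov cohort cost simulation (illustrative only; the states,
# transition probabilities, and per-state annual costs are hypothetical).
import numpy as np

states = ["ocular_hypertension", "early_glaucoma", "advanced_glaucoma"]
# Row i gives the probability of moving to each state next year (rows sum to 1).
P = np.array([
    [0.90, 0.09, 0.01],
    [0.00, 0.92, 0.08],
    [0.00, 0.00, 1.00],
])
annual_cost = np.array([300.0, 800.0, 2000.0])  # hypothetical per-patient costs

cohort = np.array([1.0, 0.0, 0.0])  # everyone starts with ocular hypertension
total_cost = 0.0
for year in range(10):
    total_cost += cohort @ annual_cost  # expected cost incurred this year
    cohort = cohort @ P                 # next year's state distribution

print(f"Expected 10-year cost per patient: ${total_cost:,.0f}")
```

Varying the model inputs (eg, screening everyone, only high-risk patients, or no one) changes the transition probabilities and costs, which is how the cited study compared the three tonometry strategies.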

Although the optimal clinical trials are lacking, studies from several countries demonstrate that direct glaucoma-related costs increase as the severity of disease worsens.7-9 Based on these findings, we can surmise that the efficient diagnosis and successful treatment of early glaucoma will significantly reduce health care expenditures related to glaucoma, while simultaneously preserving a robust quality of life for our patients. The challenge becomes finding the most efficient way to achieve these goals. From a broader perspective, the baseline evaluation and subsequent observation of a glaucoma patient should follow the strategy delineated in the AAO's Preferred Practice Patterns. It is a well-conceived and thorough approach, without being redundant or excessive. The beauty of this protocol is that it provides a comprehensive framework of recommendations that gives us leeway to tailor the frequency of testing based on the severity of the patient's disease and his or her risk factors for progression.10

STRUCTURAL IMAGING

Although the recommendations in the AAO's Preferred Practice Patterns may seem intuitive, many of the key elements are often overlooked in routine practice. In fact, in a study by Fremont and colleagues, nearly half of the patients did not have their optic nerve photographed or drawn during their initial evaluation. This omission is of great concern, because the absence of a baseline image of the optic nerve head precludes longitudinal evaluation and increases the risk of a delayed diagnosis.11 If we consider all of the patients who developed glaucoma in the Ocular Hypertension Treatment Study (OHTS), 55% were diagnosed based on changes in the appearance of their optic nerve head. These diagnoses all would have been missed without baseline photographs,12 a situation leading to more advanced disease and thus an increased economic burden on society.

Retinal nerve fiber layer loss is one of the earliest signs of glaucoma.13 In vivo structural imaging can accurately and reliably produce objective, quantitative measurements of the optic nerve and retinal nerve fiber layer loss. Although structural imaging with confocal scanning laser ophthalmoscopy, scanning laser polarimetry, and optical coherence tomography has been readily available since the 1990s, its use lagged until improved software and strategies for data analysis made these devices clinically relevant and their output easy to comprehend.14 The benefits of these techniques include rapid image acquisition, the limited technical skill required, and ease of interpretation.

Unfortunately, this technology is very expensive, and several studies have documented that it is not any better than stereo photographs at predicting future changes in standard achromatic perimetry (SAP) or distinguishing early glaucoma from normalcy.15,16 Moreover, the technology is changing so rapidly that it has been difficult for us to become acclimated to one iteration before a higher-resolution version arrives, thus minimizing the utility of the previous data. These changes come with significant costs, including new machines, hardware, software, and the need for additional testing to acquire baseline data. It is possible that the new imaging technology will become cost-effective if we have a single apparatus that we can use longitudinally to reliably pick up early signs of glaucomatous progression.17 Until then, the cost-efficient use of this imaging is relegated to detecting early stages of glaucomatous damage, resolving apparent disparities between photographs and SAP, corroborating findings that are in question, and facilitating the evaluation of small optic nerves (Figures 1 and 2).

FUNCTIONAL TESTING

Because the presence of a visual field defect is intrinsic to glaucoma, perimetry is an essential component of the examination. In fact, it has been reported that up to one-third of all cases of glaucoma would be missed if routine perimetry were ignored.18 Based on the aforementioned studies that established the financial benefits of diagnosing glaucoma as early as possible, the economic utility of perimetry is absolute. The nagging unanswered questions are (1) how often do we need to order visual fields and (2) is our current technology acceptable? The former question is very difficult to answer, particularly when we consider its economic implications. The best recommendation would be to schedule visual field testing according to the AAO's Preferred Practice Patterns, with the caveat that perimetry should be performed more frequently based on the severity of the disease, the rate of progression, the IOP, and the number of other significant risk factors.10 Certainly, any evidence of visual field progression deserves repeat testing, since 86% of the visual field abnormalities in OHTS reverted to normalcy on repeat testing.19 Repeating SAP and confirming a change in the visual field is less expensive than a lifetime of potentially unnecessary treatment.

Despite all of SAP's benefits, a significant percentage of the retinal ganglion cells are lost before this testing reveals defects.20 Selective functional tests can now detect visual field defects with a smaller proportion of retinal ganglion cells lost compared with SAP. For example, short-wavelength automated perimetry has been shown to detect subsequent visual field loss in patients with ocular hypertension roughly 3 years earlier than SAP.21 Unfortunately, artifacts in eyes with lenticular changes and high short- and long-term variability limit the utility of this test.

The second-generation frequency doubling technology (FDT) perimeter—the FDT2 Humphrey Matrix (Carl Zeiss Meditec, Inc., Dublin, CA)—uses a 6° stimulus and performs 24-2 and 30-2 testing patterns. The smaller stimulus and additional testing points theoretically improve the likelihood of detecting a small visual field defect. Moreover, the Humphrey Matrix has excellent test-retest variability, particularly in regions with early damage.22 Based on these findings, the economic implications of using the Humphrey Matrix are twofold. First, the device has the potential to be an excellent screening tool for glaucoma by detecting disease early, when it is still relatively inexpensive to treat and follow. Second, the FDT2 has been shown to more reliably detect early visual field loss. It therefore may be used to validate possible visual field defects in SAP, which would obviate the need for multiple retests, as in OHTS.

Clearly, the judicious use of perimetric evaluation is essential to effectively monitoring the glaucoma patient and is well worth its cost to prevent unnecessary progression.

CONCLUSION

As the population ages, the economic burden on our health care system will grow. Resource allocation is not an option; it is inevitable. We physicians must be proactive in our efforts to practice conscientious medicine and employ interventions that are both clinically meaningful and cost-effective.

Jeffrey A. Kammer, MD, is an assistant professor of ophthalmology & visual sciences, and he is the chief of the Glaucoma Service at Vanderbilt Eye Institute in Nashville, Tennessee. He acknowledged no financial interest in the products or company mentioned herein. Dr. Kammer may be reached at (615) 936-7190; jeff.kammer@vanderbilt.edu.