Resistance Testing for Treatment-Naives is Recommended  
 
 
  "Should Resistance Testing Be Performed for Treatment-Naive HIV-Infected Patients? A Cost-Effectiveness Analysis"
 
Clinical Infectious Diseases 2005;41:000 (advance Oct 2005)
 
"....Recent studies from the United States and from other countries in North America and Europe indicate a prevalence of antiretroviral resistance among treatment-naive patients of 8%-10%. Although transmission of drug-resistant virus has been recognized for several years, resistance testing has not been recommended for chronically infected patients before they initiate therapy, unless HIV infection was recently acquired or the prevalence of resistance is known to exceed 5%.....
 
....we utilized a previously published model of HIV disease to project the long-term clinical and cost outcomes for a cohort of HIV-infected, antiretroviral-naive patients who undergo pretreatment resistance testing......Resistance testing at the time of diagnosis should be the standard of care....
 
.... This analysis suggests that, with a baseline prevalence of resistance of >1%, resistance testing at the time of HIV diagnosis (and, thus, before antiretroviral therapy is started) is a cost-effective strategy that can lead to selection of a more effective initial antiretroviral regimen and likely longer survival for patients who have drug-resistant virus.... We found that the cost-effectiveness of primary resistance testing increases with the prevalence of resistance....."
 
See EDITORIAL by Frederick Hecht & Robert Grant at end of this article: "....The cost-effectiveness analysis by Sax et al....supports the use of antiretroviral drug-resistance testing for all drug-naive patients.... there is now sufficiently strong information to recommend genotypic resistance testing for all drug-naive patients at the time of diagnosis... The utility of drug-resistance testing may be greater in drug-naive persons because partial resistance is more common, leaving effective drug regimens that can be selected with the right information.... the article by Sax et al. shows that it is also cost-effective."
 
Authors: Paul E. Sax,1 Runa Islam,2 Rochelle P. Walensky,1,2,3 Elena Losina,2,5 Milton C. Weinstein,4 Sue J. Goldie,4 Sara N. Sadownik,2 and Kenneth A. Freedberg2,3,4,5
 
1Division of Infectious Diseases, Brigham and Women's Hospital, Divisions of 2General Medicine and 3Infectious Diseases, Massachusetts General Hospital, Harvard Medical School, 4Harvard Center for Risk Analysis, Harvard School of Public Health, and 5Departments of Biostatistics and Epidemiology, Boston University School of Public Health, Boston, Massachusetts
 
Financial support. National Institute of Allergy and Infectious Diseases (K23AI01794, K24AI062476, K25AI50436, R01-AI42006, and P30-AI42857) and the Centers for Disease Control and Prevention (cooperative agreement U64/CCU 114927 to the Cost-effectiveness of Preventing AIDS Complications project).
 
Potential conflicts of interest. All authors: no conflicts.
 
BACKGROUND
Resistance to antiretroviral agents among HIV-infected, antiretroviral-naive individuals, or "primary resistance," results from transmission of HIV from treatment-experienced individuals who are infected with antiretroviral-resistant strains [1-3]. Among the clinically important consequences of primary resistance are a longer time to virologic suppression and a higher risk of treatment failure [1, 4-7]. A recent 10-city surveillance study of primary resistance reported an overall prevalence of 8.3% in the United States [2].
 
Genotype-resistance testing can help identify new, more-effective antiretroviral regimens for patients for whom therapy is failing [8-10] and has been incorporated into clinical treatment guidelines [11, 12]. Previous analyses have also shown that genotype-resistance testing before initiation of a new antiretroviral regimen in patients for whom therapy is failing appears to be cost-effective [13, 14].
 
The use of resistance testing to guide the choice of a first treatment regimen is more controversial. No controlled clinical studies have assessed the benefits of resistance testing in treatment-naive patients, and there are limited data on treatment outcomes in patients with primary resistance [5, 6, 15]. Clinicians may also be reluctant to order resistance testing for treatment-naive patients because of the high relative cost of the test, compared with the costs of other routine blood tests [12]. However, given recent findings showing the long-term persistence of transmitted resistance mutations [1, 16-19], detection of preexisting resistance mutations in untreated patients may help clinicians choose more-effective initial regimens [12]. The most recent guidelines from the International AIDS Society-USA recommend that resistance testing be performed before treatment is started for patients infected within the previous 2 years (and possibly longer, if the prevalence of baseline resistance exceeds 5%) [12]. Our objective was to estimate the clinical benefits, costs, and cost-effectiveness of using genotype resistance testing to guide the choice of initial antiretroviral regimen in patients with newly diagnosed, chronic HIV infection.
 
ABSTRACT
Background. Data from the United States and Europe show a population prevalence of baseline drug resistance of 8%-10% among human immunodeficiency virus (HIV)-infected patients who are antiretroviral naive. Our objective was to determine the clinical impact and cost-effectiveness of genotype resistance testing for treatment-naive patients with chronic HIV infection.
 
Methods. We utilized a state-transition model of HIV disease to project life expectancy, costs, and cost-effectiveness in a hypothetical cohort of antiretroviral-naive patients with chronic HIV infection. On the basis of a US survey of treatment-naive patients from the Centers for Disease Control and Prevention, we used a baseline prevalence of drug resistance of 8.3%.
 
Results. A strategy of genotype-resistance testing at initial diagnosis of HIV infection increased per-person quality-adjusted life expectancy by 1.0 month, with an incremental cost-effectiveness ratio of $23,900 per quality-adjusted life-year gained, compared with no genotype testing. The cost-effectiveness ratio for resistance testing remained less than $50,000 per quality-adjusted life-year gained, unless the prevalence of resistance was <1%, a level lower than those reported in most regions of the United States and Europe. In sensitivity analyses, the cost-effectiveness remained favorable through wide variations in baseline assumptions, including variations in genotype cost, prevalence of resistance overall and to individual drug classes, and sensitivity of resistance testing.
 
Conclusions. Genotype-resistance testing of chronically HIV-infected, antiretroviral-naive patients is likely to improve clinical outcomes and is cost-effective, compared with other HIV care in the United States. Resistance testing at the time of diagnosis should be the standard of care.
 
DISCUSSION
Recent studies from the United States and from other countries in North America and Europe indicate a prevalence of antiretroviral resistance among treatment-naive patients of 8%-10% [2, 3]. Although transmission of drug-resistant virus has been recognized for several years, resistance testing has not been recommended for chronically infected patients before they initiate therapy, unless HIV infection was recently acquired or the prevalence of resistance is known to exceed 5% [12]. The hesitation to employ resistance testing for all individuals with new diagnoses arises from multiple factors, most notably the cost of the test and the absence of controlled data showing that this testing strategy improves outcomes. However, patients who harbor a drug-resistant virus may have lower rates of virologic suppression when treated with combination therapy [6, 7], and it has become clear that transmitted resistance mutations may be detectable for months and even years after acquisition [1, 16-19].
 
This analysis suggests that, with a baseline prevalence of resistance of >1%, resistance testing at the time of HIV diagnosis (and, thus, before antiretroviral therapy is started) is a cost-effective strategy that can lead to selection of a more effective initial antiretroviral regimen and likely longer survival for patients who have drug-resistant virus. The relatively modest gain in mean quality-adjusted life expectancy (1.0 QALM at a resistance prevalence of 8.3%) reflects the fact that, for the large majority of patients without primary resistance, testing provides no benefit, whereas for those with primary resistance, testing provides substantial benefit. Furthermore, the relatively low cost of the one-time test renders primary resistance testing extremely cost-effective, given the overall high cost of treating HIV infection over the course of a patient's lifetime. At a per-test cost of $400, this test represents <5% of annual costs for antiretroviral therapy and <0.2% of total lifetime care costs incurred by HIV-infected patients [44].
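
As a rough consistency check against the base-case results reported below, $400 measured against the projected discounted lifetime cost of $336,600 per patient comes to about 0.12% of lifetime spending, consistent with the <0.2% figure.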
 
We found that the cost-effectiveness of primary resistance testing increases with the prevalence of resistance. In a population in which 5% of patients have baseline resistance, the cost-effectiveness ratio remained less than $50,000 per QALY gained, even when the cost of genotype testing was more than tripled. In the base case analysis, with a prevalence of primary resistance of 8.3%, resistance testing for individuals with new diagnoses had a cost-effectiveness ratio of $23,900 per QALY gained. This result is in the range of other commonly accepted HIV interventions, including resistance testing after treatment failure ($20,200 per QALY gained) [13], combination antiretroviral therapy ($25,900 per QALY gained) [20], and prophylaxis for Mycobacterium avium complex infection ($63,100 per QALY gained) [47]. We used published data regarding the prevalence of primary resistance and the distribution of resistance among drug classes [2]. Importantly, a recent update of this study cited a prevalence of drug resistance of 15.2%, with NNRTI resistance rates now at 9.1% [48]. At these higher rates, resistance testing at the time of diagnosis of HIV infection would be even more cost-effective.
 
There are several limitations to this study. The input data informing model assumptions about the virologic effectiveness of antiretroviral therapy were derived from heterogeneous sources, including both cohort and interventional studies, which were not specifically designed to examine outcomes related to primary resistance testing. Additionally, we did not consider the role of multiple-class resistance in antiretroviral-naive patients. Because treatment options are already limited for these patients, including them in the analysis would likely accentuate the benefits of primary resistance testing and render it even more cost-effective; however, because such patients represent a small subset of all patients infected with drug-resistant virus [2], their inclusion would likely have only a small impact on the overall cost-effectiveness results. Given that data on antiretroviral therapy efficacy include no more than 10 years of follow-up, we conservatively modeled a 10-year limit on efficacy for each antiretroviral regimen that a patient may receive. With continued advances in the duration of effective therapy, the benefits of primary resistance testing would likely increase if the test led to the appropriate choice of an initial durable regimen. Finally, we did not specifically examine the development of new drug classes (e.g., integrase inhibitors and CCR5 antagonists), which may broaden the available choices for initial antiretroviral therapy and may be equally effective in patients infected with virus with or without primary resistance to currently available drug classes. However, current treatment guidelines recommend initial therapy with 2 NRTIs and either an NNRTI or a PI, and they are likely to do so for the near future [11, 29].
 
Resistance testing at the time of diagnosis of HIV infection should not be limited only to those with recently acquired infection, because data from both the United States and Europe do not show significantly different rates of resistance depending on when individuals contracted HIV infection [2, 3, 18]. Furthermore, it is rarely possible to accurately assess the time of acquisition of HIV infection in a patient with a new diagnosis [49]. The implications of this analysis go beyond the most recent International AIDS Society-USA guidelines, which recommend that primary resistance testing be considered if the expected prevalence of resistance is >5% [12], particularly because clinicians usually cannot estimate the prevalence of drug resistance in their communities. On the basis of the available evidence, and considering both clinical benefits and costs, genotype resistance testing should be performed for all patients with newly diagnosed HIV infection in the United States, with the results used to guide the choice of antiretroviral regimen when treatment is indicated.
 
RESULTS
 
Main Results
In the absence of primary resistance testing, patients had a discounted projected mean quality-adjusted life expectancy of 168.3 months and a total discounted lifetime cost of $336,600. With resistance testing at the time of initial diagnosis, the mean quality-adjusted life expectancy increased to 169.3 months, and total costs increased to $338,600. With this 1.0 quality-adjusted life-month (QALM) increase in survival and $2000 increase in cost, the incremental cost-effectiveness ratio associated with primary resistance testing was $23,900 per QALY gained, compared with no testing.
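
The arithmetic behind this ratio can be reproduced from the rounded figures above. The short Python sketch below is purely illustrative (the published model itself was written in C); the small discrepancy from the published $23,900 reflects rounding of the reported inputs.

qale_no_test = 168.3    # quality-adjusted life expectancy, months, no testing
qale_test = 169.3       # quality-adjusted life expectancy, months, with testing
cost_no_test = 336_600  # discounted lifetime cost, USD, no testing
cost_test = 338_600     # discounted lifetime cost, USD, with testing

delta_qaly = (qale_test - qale_no_test) / 12.0  # 1.0 QALM = 1/12 QALY
delta_cost = cost_test - cost_no_test           # $2,000

icer = delta_cost / delta_qaly
print(f"ICER ~ ${icer:,.0f} per QALY gained")   # ~$24,000 from rounded inputs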
 
The magnitude of clinical benefit varied among the subgroups of patients who exhibited baseline drug class-specific resistance. Patients infected with NNRTI- and NRTI-resistant strains benefited the most from resistance testing, with mean gains of 14.1 and 13.8 QALMs, respectively, compared with those who underwent no testing. Patients infected with PI-resistant virus gained only 4.9 QALMs with testing. This difference reflects the assumption that patients with baseline PI-resistant infection were eligible for only 4 lines of therapy; in addition, boosted PI-based therapy retains activity against many PI-resistant viruses, so the difference between tested and untested subjects is relatively small.
 
Sensitivity Analyses
Prevalence of resistance.
With the ratio of NRTI, NNRTI, and PI resistance held constant, as the overall prevalence of resistance was varied from 0.25% to 10.0%, the clinical benefits of primary resistance testing ranged from 0.03 to 1.2 QALMs, compared with no testing (table 4). Similarly, the incremental cost-effectiveness ratios associated with resistance testing varied from $175,400 per QALY gained, when the overall prevalence of primary resistance was 0.25%, to $23,100 per QALY gained, when the overall prevalence of primary resistance was 10%. At a cost of $400 per test, the cost-effectiveness ratio for testing remained less than $50,000 per QALY gained until the prevalence of resistance decreased to 1.0%. Varying the proportion of patients infected with NRTI-, NNRTI-, and PI-resistant virus within clinically plausible ranges had minimal effect on cost-effectiveness ratios. The use of proportions of drug class resistance found in a European multiple-site study [3] led to a slightly higher baseline cost-effectiveness ratio of $24,200 per QALY gained.
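
To illustrate the mechanics of this one-way sensitivity analysis, the Python sketch below uses a toy decomposition that is not part of the published model: the $400 test cost is assumed to be paid by every tested patient, while the survival gain and the remaining incremental costs are assumed to scale linearly with prevalence, calibrated to the 8.3% base case. Under these assumptions, it approximately reproduces the published range of ratios.

TEST_COST = 400
BASE_PREV = 0.083
BASE_GAIN_QALM = 1.0                       # base-case survival gain, QALMs
BASE_DOWNSTREAM_COST = 2_000 - TEST_COST   # base-case incremental cost net of the test

for prev in [0.0025, 0.01, 0.05, 0.083, 0.10]:
    scale = prev / BASE_PREV
    delta_qaly = BASE_GAIN_QALM * scale / 12.0
    delta_cost = TEST_COST + BASE_DOWNSTREAM_COST * scale
    print(f"prevalence {prev:6.2%}: ICER ~ ${delta_cost / delta_qaly:,.0f}/QALY")

The toy values (roughly $179,000 per QALY at 0.25% prevalence and $23,200 at 10%) bracket the published $175,400 and $23,100, which suggests that the dominant driver is simply that the fixed test cost is spread over fewer beneficiaries as prevalence falls.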
 
Cost of resistance testing.
The cost-effectiveness ratio associated with primary resistance testing remained favorable through wide ranges in a 2-way sensitivity analysis in which both prevalence of resistance and test cost varied (table 4). At a prevalence of resistance of 5.0%, the level at which testing is recommended in the International AIDS Society-USA guidelines [12], when the test cost was varied between $200 and $800, the cost-effectiveness ratio for resistance testing ranged from $23,100 to $34,800 per QALY gained. At the baseline prevalence of resistance of 8.3%, the cost-effectiveness ratio for testing did not exceed $50,000 per QALY gained until the cost per test exceeded $2600.
 
Efficacy of resistance testing in improving virologic suppression.
Primary resistance testing may underestimate the true prevalence of underlying resistance due to low-level minority populations of resistant virus that are not identified by testing. If the sensitivity of resistance testing is lowered, the test will be less effective at improving outcomes among those who undergo testing. In addition, treatment outcomes in patients with drug-resistant infection who do not undergo testing may be better than in our baseline assumptions. Therefore, we varied the incremental benefit of resistance testing on achieving virologic suppression from 10% to 90% of the estimated benefit. Over this range of test performance, the cost-effectiveness ratio for resistance testing varied from $66,300 to $24,400 per QALY gained. The cost-effectiveness ratio did not exceed $50,000 per QALY gained until the efficacy of resistance testing decreased to 15% of the baseline expected benefit.
 
Initial antiretroviral strategy.
The baseline analysis assumed that 75% of patients started receiving an efavirenz-based regimen and 25% started receiving a lopinavir/ritonavir-based regimen. When we varied the proportion who started efavirenz-based treatment from 100% to 0%, with the remainder starting lopinavir/ritonavir-based treatment, the cost-effectiveness ratios associated with resistance testing were minimally affected, ranging from $23,600 per QALY gained when all patients started receiving an efavirenz-based regimen to $25,700 per QALY gained when all patients started receiving a lopinavir/ritonavir-based regimen.
 
Other parameters.
Results of the analysis were relatively insensitive to clinically plausible variations in many parameters in the model, including (1) the use of 1 PI-based regimen rather than 2 for all patients, (2) the use of enfuvirtide as the fourth versus the fifth line of treatment, and (3) the efficacy of NNRTI-based therapy in patients with primary or acquired NRTI resistance. Cost-effectiveness ratios remained less than $50,000 per QALY gained in all of these sensitivity analyses.
 
METHODS
Analytic overview.
On the basis of available data from clinical trials, observational cohort studies, surveillance studies of primary resistance, and national cost surveys, we utilized a previously published model of HIV disease [13, 20] to project the long-term clinical and cost outcomes for a cohort of HIV-infected, antiretroviral-naive patients who undergo pretreatment resistance testing. The target population consisted of chronically HIV-infected patients with characteristics (age, sex, CD4 cell count, and HIV RNA level) similar to those of the general population in the AIDS Clinical Trials Group 384 study [21].
 
In accordance with the recommendations of the Panel on Cost-Effectiveness in Health and Medicine, we adopted a societal perspective and expressed clinical benefits in quality-adjusted life-years (QALYs) gained, to reflect the potential gains in both longevity and quality of life associated with the use of primary resistance testing [23]. Future costs and life-years were discounted at an annual rate of 3%. Model outcomes included life expectancy, quality-adjusted life expectancy, and lifetime costs (2001 US$). The results of the cost-effectiveness analysis are summarized using the incremental cost-effectiveness ratio, in which each strategy with testing is compared incrementally with a strategy that does not employ resistance testing. Using sensitivity analyses, we examined the impact of varying key model parameters, as described below.
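
As a concrete illustration of the discounting step (a hypothetical Python helper, not code from the model), the 3% annual rate can be converted to a monthly factor and applied to monthly streams of costs and quality-adjusted life-months:

ANNUAL_RATE = 0.03
MONTHLY_FACTOR = (1 + ANNUAL_RATE) ** (1 / 12)  # equivalent monthly discount factor

def discounted_total(monthly_values):
    """Present value of a stream of monthly values (costs or QALMs)."""
    return sum(v / MONTHLY_FACTOR ** t for t, v in enumerate(monthly_values))

# Example: $1,000/month of care and full quality of life (1.0 QALM/month) for 2 years.
months = 24
pv_cost = discounted_total([1_000] * months)
pv_qalm = discounted_total([1.0] * months)
print(f"discounted cost ${pv_cost:,.0f}, discounted QALYs {pv_qalm / 12:.2f}")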
 
The Cost-Effectiveness of Preventing AIDS Complications model.
The Cost-Effectiveness of Preventing AIDS Complications model is a previously published simulation model of the progression and outcomes of HIV disease in a population of chronically infected individuals [13, 20]. Details regarding the model structure and derivation of natural history parameters from the Multicenter AIDS Cohort Study [24] are available in previous publications [25]. Briefly, in this model, each simulated patient's clinical progression is tracked on a month-by-month basis. In any given month, the patient resides in 1 of 3 health states: chronic infection, acute clinical event, or death. We used incidence density analysis to compute monthly risks of HIV disease progression, including rates of decrease in the CD4 cell count, probabilities of acute opportunistic infections, and rates of death due to HIV disease and other causes [13, 20, 26].
 
In untreated patients and in patients for whom antiretroviral therapy is failing, the HIV RNA level determines the rate of CD4 cell count decline, which in turn determines the probability that the patient will experience acute opportunistic infections [27]. To incorporate data from the current treatment era showing that antiretroviral therapy has a protective effect independent of the current CD4 cell count, we decreased the CD4-specific risk of opportunistic infections and death to 54% of that for untreated patients with the same CD4 cell count [28]. On the basis of clinical trial data, patients who are receiving antiretroviral therapy have a regimen-specific probability of achieving viral suppression and a concomitant increase in CD4 cell count. In this analysis, the CD4 cell count or the development of an opportunistic infection is used to determine when a patient should start opportunistic infection prophylaxis and antiretroviral therapy [11, 29]. Both CD4 cell count and HIV RNA level are used to determine when a patient should switch antiretroviral regimens.
 
The model simulates and records individual patient histories regarding time spent in each health state, time receiving therapy, duration of survival, and quality-adjusted survival duration after model entry, as well as total lifetime costs. We modeled 1 million patient simulations for each of 2 strategies (no primary resistance testing and primary resistance testing) to obtain stable estimates of average long-term outcomes. The model was programmed in C and compiled in Visual C++ 6.0 (Microsoft). Additional details regarding model structure, data, and assumptions are available in several publications with technical appendices [13, 20, 25, 30].
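
The published model was programmed in C and tracks far more state (CD4 cell count, HIV RNA level, regimen line, and accumulated costs), but the monthly Monte Carlo loop it describes can be sketched as follows in Python; every probability below is invented purely to show the mechanics.

import random

P_ACUTE_EVENT = 0.01     # monthly probability of an acute clinical event (invented)
P_DEATH_CHRONIC = 0.002  # monthly mortality in the chronic state (invented)
P_DEATH_ACUTE = 0.02     # monthly mortality in a month with an acute event (invented)

def simulate_patient(rng, max_months=1200):
    """Simulate one patient month by month; return survival in months."""
    for month in range(max_months):
        acute = rng.random() < P_ACUTE_EVENT           # chronic vs. acute-event month
        p_death = P_DEATH_ACUTE if acute else P_DEATH_CHRONIC
        if rng.random() < p_death:                     # transition to the death state
            return month + 1
    return max_months

rng = random.Random(0)
n = 10_000  # the paper ran 1 million simulations per strategy for stable averages
mean_survival = sum(simulate_patient(rng) for _ in range(n)) / n
print(f"mean survival ~ {mean_survival:.0f} months")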
 
Prevalence of primary resistance and efficacies of antiretroviral therapy.
In the base case, the overall prevalence of antiretroviral resistance among treatment-naive patients was 8.3% [2]. This estimate included 4.7%, 1.7%, and 1.9% for resistance to nucleoside reverse-transcriptase inhibitors (NRTIs), nonnucleoside reverse-transcriptase inhibitors (NNRTIs), and protease inhibitors (PIs), respectively. Because of a lack of data on treatment outcomes for multiple-class resistance and the difficulty of modeling antiretroviral strategies for such patients, we focused the analysis on patients with single-class resistance patterns. To derive the prevalence of resistance to single-drug classes, we assumed that NRTI resistance would generally be a component of multidrug resistance. Thus, we subtracted the prevalence rates of resistance to NNRTIs and PIs from the total resistance prevalence rate to produce the rate of NRTI resistance. In addition, because of changes in rates of resistance among drug classes, we varied these baseline assumptions in sensitivity analyses.
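
Concretely, subtracting the 1.7% NNRTI and 1.9% PI prevalences from the 8.3% total yields the 4.7% single-class NRTI figure: 8.3% - 1.7% - 1.9% = 4.7%.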
 
On the basis of the published literature, we estimated the likely virologic suppression rate of the first through fifth sequential lines of therapy in patients with and without genotypic drug resistance who were starting an efavirenz-based or a lopinavir/ritonavir-based initial regimen [31-38]. Because limited data exist regarding the efficacy of therapy in the presence of primary resistance, we estimated outcomes using studies of treatment-experienced patients and then varied the outcomes in sensitivity analyses. When outcomes were not reported to 48 weeks, we extrapolated data as reported elsewhere [13]. To account for heterogeneity in the standard of care for initial therapy, we estimated that 75% of the untested population received efavirenz-based therapy, whereas the other 25% received lopinavir/ritonavir-based therapy [11, 29]. For the tested population, the choice of initial regimen was based on the results of the genotype test. The efficacy of first-line therapy ranged from a high of 75% viral suppression at 48 weeks in patients with no PI or NRTI resistance who started a lopinavir/ritonavir-based initial regimen [34] to a low of 19% viral suppression at 48 weeks in patients with baseline NNRTI resistance who started an efavirenz-based initial regimen [33].
 
We assumed that patients without resistance could receive 5 sequential lines of treatment: 1 line of NNRTI-based treatment, 2 lines of PI-based treatment (with minimal PI resistance after the first treatment failure [42]), 1 line of enfuvirtide-based treatment, and a final salvage regimen without enfuvirtide. In general, later regimens had lower efficacies than did earlier regimens. Tested patients with baseline PI resistance were eligible for only 1 line of PI-based treatment, and tested patients with baseline NNRTI resistance were not eligible for NNRTI-based treatment. Untested patients with baseline NNRTI or PI resistance would derive limited benefit from an initial regimen of the corresponding drug class; likewise, untested patients with NRTI resistance would receive a less efficacious initial regimen (table 2).
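
One way to make these sequencing rules concrete is the small hypothetical Python encoding below; the regimen labels are shorthand for the lines described above, and the function simply reproduces the eligibility rules for tested patients.

def eligible_lines(nnrti_resistant=False, pi_resistant=False):
    """Return the sequence of regimen lines available to a tested patient."""
    lines = []
    if not nnrti_resistant:
        lines.append("NNRTI-based")        # skipped with baseline NNRTI resistance
    if pi_resistant:
        lines.append("PI-based")           # only 1 PI line with baseline PI resistance
    else:
        lines += ["PI-based (1st)", "PI-based (2nd)"]
    lines += ["enfuvirtide-based", "salvage (no enfuvirtide)"]
    return lines

print(eligible_lines())                    # 5 lines with no baseline resistance
print(eligible_lines(pi_resistant=True))   # 4 lines, as noted in the Results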
 
Costs and quality-of-life data.
Direct costs of treatment, both for routine medical care and for acute illnesses, were estimated from charge data from the AIDS Cost and Services Utilization Survey and were varied to incorporate data from the more recent national HIV Cost and Services Utilization Study [43, 44]. All reported charges were converted to costs using a national cost-to-charge ratio [25, 45]. We assumed that the cost of a genotype resistance test was $400, on the basis of data from the payment office of Brigham and Women's Hospital (Boston, MA). Drug costs were based on average wholesale prices listed in the 2001 Drug Topics Red Book [22]. Data on health-related quality of life were derived from the HIV Cost and Services Utilization Study, and utility weights from the SF-6D were applied [46].
 
EDITORIAL in CID 2005;41:000
 
Resistance Testing in Drug-Naive HIV-Infected Patients: Is it Time?
 
Frederick M. Hecht1 and Robert M. Grant1,2

1HIV/AIDS Division, San Francisco General Hospital, Department of Medicine, University of California, San Francisco, and 2Gladstone Institute of Virology and Immunology, San Francisco, California
 
Early recommendations for the use of antiretroviral drug-resistance testing, issued in 1998, cautiously endorsed its use for persons who had experienced treatment failure, with a possible role for persons identified with recent HIV infection [1]. More recent guidelines clearly recommend resistance testing for persons with treatment failure and for those with HIV infection of <2 years' duration, and they suggest considering resistance testing for drug-naive patients in areas with a prevalence of resistance of ≥5% [2]. In this issue of the journal, Sax et al. [3] present a cost-effectiveness analysis that supports the use of genotypic drug-resistance testing for all drug-naive patients in most settings. Is it time to make this change?
 
Initial recommendations to perform antiretroviral drug-resistance testing in recently HIV-infected persons but not in those with chronic HIV infection were based on an assumption that, within 1-2 years, most drug-resistance mutations would be overgrown by wild-type virus [1]. Resistance testing for patients with chronic infection might then provide little but false assurance that resistance was not present in those with low levels of drug-resistant virus that could emerge rapidly when treatment was initiated. Although some resistance mutations do revert to wild-type virus within a year [4, 5], recent studies of persons with primary drug resistance (i.e., resistance acquired through transmission of virus from a source with drug resistance) indicate that most resistance mutations persist at detectable levels considerably longer and may be stable for many years [5-7]. Early assumptions that drug-resistance mutations would be lost more quickly were based in part on experience with persons who acquired drug-resistance mutations while receiving antiretroviral therapy. In these individuals, drug-susceptible virus usually has a fitness advantage, and cessation of antiretroviral therapy often allows drug-susceptible virus to overgrow the drug-resistant population, obscuring the detection of resistance mutations within months [8]. In persons with primary HIV drug resistance, viral evolution appears to follow a different pattern: evidence of primary resistance is lost more slowly and typically involves reversion of mutations one by one, rather than larger viral genetic shifts in which several mutations decrease in frequency at the same time. This pattern is consistent with transmission of only a few HIV-1 variants, such that no drug-susceptible virus is present to compete with the drug-resistant virus, and the emergence of wild-type virus depends on a much slower process of backward mutation to the wild-type genotype.
 
This likely explains why recent reports indicate that the prevalence of drug resistance in drug-naive patients is relatively high regardless of whether they are recently or chronically infected. In a study of >1000 drug-naive individuals in 10 US cities enrolled during 1997-2001, Weinstock et al. [9] found that 8.3% had at least 1 resistance mutation. This makes sense in light of data on the persistence of transmitted drug-resistance mutations and the frequency of transmission of drug-resistant HIV, which has varied over time and across populations but has consistently been ≥8% during the past decade in the United States [9-11] and elsewhere in the developed world [12-16].
 
The cost-effectiveness analysis by Sax et al. [3] provides a third piece of important information that supports the use of antiretroviral drug-resistance testing for all drug-naive patients. This analysis found that the cost-effectiveness ratio for resistance testing of drug-naive patients before commencement of antiretroviral therapy remained less than $50,000 per quality-adjusted life-year, a commonly accepted threshold below which medical interventions are generally considered cost-effective, as long as the prevalence of drug resistance was ≥1%. This remained true in sensitivity analyses that varied factors such as the cost of the assay and the benefit of resistance testing in improving outcomes over wide ranges.
 
We believe that there is now sufficiently strong information to recommend genotypic resistance testing for all drug-naive patients at the time of diagnosis. Because HIV drug-resistance testing has little risk of causing harm to patients (other than the anxiety caused by knowledge that one has a drug-resistant variant), cost-effectiveness considerations are key. The analysis by Sax et al. [3] suggests that such testing falls well within the accepted range for medical interventions believed to be cost-effective over most plausible scenarios. The baseline assumptions in the model are appropriately conservative and may underestimate the benefit of drug-resistance testing in drug-naive persons. For example, the model bases the utility of resistance testing on trial data from patients with treatment failure. The utility of drug-resistance testing may be greater in drug-naive persons because partial resistance is more common, leaving effective drug regimens that can be selected with the right information. In contrast, some patients in trials of these assays were infected with highly drug-resistant HIV and had no highly effective regimen options that resistance data could identify. Furthermore, in situations involving salvage therapy, the ability of experienced clinicians to use the antiretroviral history to select optimal regimens competes with the utility of resistance testing. In contrast, the treatment history of transmission partners is usually not available to guide regimen selection in drug-naive persons.
 
Several questions remain. Accepted levels of cost-effectiveness in well-resourced regions may not apply to resource-poor areas, where investments in job development, clean water, and provision of basic HIV/AIDS treatment are urgently needed. Development of novel and less costly strategies for drug-resistance testing will be important. In addition, there is emerging information that, although many drug-resistance mutations remain detectable after several years of infection, others wane below the limit of detection of standard resistance assays but remain detectable using novel minor-variant assays. Application of assays capable of detecting minor drug-resistant variants in chronically infected persons appears to increase detection of primary resistance in a significant number of patients [17, 18]. Validation of these assays for clinical use will be challenging, because normal viral variation at primer-binding sites has complex effects on assay performance. Furthermore, there will be questions about whether the additional cost is justified.
 
For now, this work addresses a perplexing problem faced by clinicians. Prior recommendations and current reimbursement rules in many programs restrict resistance testing for drug-naive persons to those who are recently infected. In most patients, however, the duration of infection cannot be discerned from the history or from clinically available laboratory tests. The current suggestion to perform resistance testing when the prevalence in drug-naive patients is expected to be ≥5% assumes that this information is available to clinicians, which in most communities it is not. The recommendation of genotypic resistance testing for all drug-naive persons with HIV is more easily implemented, and the article by Sax et al. shows that it is also cost-effective.
 
 
 
 