The Gap between Human Immunodeficiency Virus (HIV) Infection and Advances in HIV Treatment - EDITORIAL
  Clinical Infectious Diseases June 1 2010;50:1512-1520
Cynthia L. Gay
Division of Infectious Diseases, Department of Medicine, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina
"we have failed to significantly narrow the gap between HIV-infected individuals and effective HIV therapy.....African Americans and heterosexuals had the smallest increases in estimated mean CD4+ T cell count over the study period.....The finding of the least immunological improvement among African Americans over the study period is disappointing because African Americans are increasingly and disproportionately affected by the epidemic"
In 2006, the Centers for Disease Control and Prevention released guidelines recommending that all patients 13-64 years of age be routinely screened for human immunodeficiency virus (HIV) [1], followed in 2009 by similar recommendations from the American College of Physicians [2]. Although the overall impact of these statements on HIV testing and diagnosis remains unclear, the 2 statements warrant reflection in light of the findings by Althoff et al [3] published in this issue of Clinical Infectious Diseases. Few medical conditions have seen morbidity and mortality outcomes improve as significantly and rapidly as HIV has with advances in treatment, yet a large percentage of HIV-infected patients do not benefit because they remain undiagnosed or outside HIV care. The poignant question that Althoff et al [3] consider is whether our ability to deliver advances in HIV antiretroviral therapy (ART) to our HIV-infected patients has improved along with the effectiveness of therapy.
Althoff et al [3] present an analysis of late presentation to HIV care, defined by the degree of immunosuppression reflected in the initial CD4+ T cell count among outpatients enrolled in clinical cohorts participating in the North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD). The authors report the primary finding of an increase in the median CD4+ T cell count among patients entering HIV care (from 235 cells/mm3 in 1996 to 327 cells/mm3 in 2007) but correctly emphasize that the latter CD4+ T cell count still falls below current guidelines for ART initiation in developed countries. They also astutely note that the overall mean CD4+ T cell count increase of 6 cells/mm3 per year likely carries little clinical significance given the standard deviation of CD4+ T cell counts. The reported increase from 38% to 46% of patients presenting with a CD4+ T cell count ≥350 cells/mm3 represents only an additional 387 patients initiating HIV care in 2007 without a CD4+ T cell count-based indication for ART across 13 clinical sites. It is hard to interpret this trend as a significant improvement toward earlier HIV care given the 11-year study span and the estimated 325,000 individuals in the United States and Canada who remain unaware of their HIV status [4, 5]. Accordingly, the study is a sobering indication that we have failed to significantly narrow the gap between HIV-infected individuals and effective HIV therapy.
Althoff et al [3] found that the largest increase in mean CD4+ T cell count over time occurred among Latinos, which suggests increased testing and/or access to care for this group. In contrast, African Americans and heterosexuals had the smallest increases in estimated mean CD4+ T cell count over the study period. The study also found that HIV-infected patients are presenting to care at older ages, and that older individuals and heterosexuals are at increased risk for presenting late to care. The finding of the least immunological improvement among African Americans over the study period is disappointing because African Americans are increasingly and disproportionately affected by the epidemic. A higher risk of late presentation among heterosexuals, compared with men who have sex with men and/or injection drug users, has been reported in other studies [6-8] as well as in the study by Althoff et al [3].
Prior studies of late presentation to care have also reported male sex and older age as risk factors for late entry into HIV care [6, 9-11], though neither finding was confirmed in other studies [7, 8, 12, 13]. The finding in Althoff et al [3] that more HIV-infected individuals are presenting at older ages will likely compound older age as a risk factor for late presentation [6-11, 14], unless we initiate more widespread HIV screening among older individuals. In contrast, the lack of association between male sex and late presentation in the present study is notable given the strength of the study design. This finding may suggest that sex varies as a risk factor across distinct populations, or it could reflect the inability to distinguish women diagnosed via prenatal screening in the NA-ACCORD, possibly introducing a bias toward earlier diagnosis among pregnant women. Regrettably, other factors previously found to be associated with late presentation (such as type of insurance, substance abuse other than injection drug use, rural vs urban residence, and nonindigenous status) were not available for all participants in the NA-ACCORD and could not be evaluated in this study.
Late presentation for HIV care has previously been studied either as a delay to HIV testing and diagnosis or as delayed presentation to care following a positive HIV test. The study by Althoff et al [3] does not make this distinction but focuses simply on immunosuppression on entry to care. The study represents a real contribution to the topic of late presentation for HIV care because the analysis includes a substantially larger sample size than prior work. Combined with the inclusion of multiple, geographically diverse clinical cohorts across the United States and Canada, this greatly improves the generalizability of the results to other populations. In contrast, the majority of other studies on late presentation to HIV care have been conducted with much smaller sample sizes and within closed cohorts, limiting the translation of findings to other populations. Furthermore, the inclusion of several sites within the southeastern United States is notable because this region has reported the highest rate of AIDS cases [15].
Why does our failure to get HIV treatment to patients earlier in their disease course matter? The individual benefits of earlier HIV diagnosis and care are clear. ART initiation prior to a CD4+ T cell count decline below 200 cells/mm3 dramatically decreases HIV mortality and clinical events [16, 17]. More recent data indicate a benefit to starting ART before CD4+ T cell counts fall below 350 cells/mm3 [18] and led to a revision in the widely followed guidelines by the Department of Health and Human Services [19] on when to initiate ART. In addition, the public health benefits of earlier diagnosis and care are substantial and entail decreased HIV transmission to partners via prevention counseling, reduced costs of care, and improved healthcare system planning. In Canada, Krentz et al [20] reported that the mean medical cost in the year following HIV diagnosis for late presenters (CD4+ T cell count <200 cells/mm3) was more than twice that for patients who did not present late. The issues of cost and healthcare planning are even more relevant today given ongoing economic constraints, as exemplified by the recent return to wait lists among AIDS drug assistance programs in several US states, leaving an increasing number of HIV-infected patients without prompt access to HIV treatment. The findings by Althoff et al [3] reveal that, despite such compelling data, there is much room for improving our ability to link more HIV-infected individuals with effective treatment prior to immunological deterioration.
Few studies of late presentation, including the one published in this issue, have evaluated HIV RNA levels at the time of presentation for HIV care. Because HIV RNA levels correlate with risk of sexual transmission [21], the data in studies of late presentation could provide additional rationale for increased HIV testing, since those presenting with high viral loads reflect missed opportunities to stem onward transmission with treatment and prevention measures. Whether we need HIV RNA information in addition to the overwhelming data that indicate a benefit to earlier diagnosis and care is debatable, but it is worthy of consideration in future studies of late presentation to HIV care.
Earlier in the HIV epidemic, many predicted that improvements in HIV therapy and outcomes would translate into increased HIV testing, and thus earlier diagnosis and care. The work of Althoff et al [3] and others refutes this. The debates are just beginning on the "test and treat" strategy, which incorporates the benefit of earlier initiation of ART, regardless of CD4+ T cell count, to prevent ongoing HIV transmission. However, the success of any secondary preventative or treatment strategy depends on the ability to test, diagnose, and link HIV-infected individuals into care. We cannot "treat" until we "test." Accordingly, with such compelling data, we have to ask why we fail to help more HIV-infected individuals access treatment earlier in their disease course. Whatever CD4+ T cell count cutoff is used (or not used) in national and international guidelines, those guidelines will have no impact on HIV-infected individuals who remain unaware of their status. The robust data presented by Althoff et al [3] provide additional incentive for increased investments in more widespread screening for HIV, as recommended in recent guidelines [1, 2, 19], especially among, but not limited to, older, heterosexual, and African American individuals in the United States and Canada.