The changing virulence of HIV........"more frequent testing among people at risk and earlier ART initiation, and the need for an HIV vaccine or other interventions to reduce set-point viral load and the viral reservoir"......."These findings emphasise the important need for early HIV testing and engagement with care given the implications for treatment and prevention"
Download the PDF here
The Lancet HIV Dec 2014
Joshua Herbeck, Connie Celum
International Clinical Research Center, Department of Global Health, University of Washington, Seattle, WA 98104, USA
Trends in HIV virulence have been studied since the beginning of the epidemic. Early studies examined direct clinical events, but the introduction of combination antiretroviral therapy (ART) necessitated the shift to studies of prognostic markers measured before initiation of treatment. Commonly used markers are the set-point viral load, CD4 T-cell count at seroconversion, and rate of CD4 cell decline, which are typically regressed against seroconversion dates to assess trends. Increasing set-point viral loads or decreasing CD4 cell counts can be interpreted as increasing virulence, and (for increasing set-point viral load) increasing HIV transmissibility.1, 2
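The basic approach described here — regressing a pre-treatment prognostic marker against seroconversion date — can be sketched as follows. This is a minimal illustration on synthetic data (the cohort size, noise level, and simple ordinary-least-squares fit are assumptions for the sketch, not the study's actual model), using the linear trend of 0·016 log10 copies per mL per year reported later in the text as the "true" effect:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cohort: one set-point viral load (log10 copies/mL) per person,
# with a true upward trend of 0.016 log10 per calendar year of seroconversion.
n = 5000
sc_year = rng.uniform(1979, 2008, size=n)           # seroconversion dates
true_slope = 0.016
setpoint = 4.0 + true_slope * (sc_year - 1979) + rng.normal(0, 0.8, size=n)

# Simple linear regression of the marker on seroconversion year.
slope, intercept = np.polyfit(sc_year - 1979, setpoint, deg=1)
print(f"estimated trend: {slope:.4f} log10 copies/mL per year")
```

An upward slope here would be read as increasing virulence (and, for viral load, increasing transmissibility); the real analyses additionally adjust for confounders and model non-linear calendar-time effects.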
In The Lancet HIV, Pantazis and colleagues3 examined trends in set-point viral load 1 year after seroconversion, CD4 cell count at seroconversion, and individual CD4 cell decline in CASCADE, which is a collaboration of 28 cohorts of mostly European individuals with estimated dates of HIV seroconversion.
Overall, the trends are consistent with increasing HIV virulence and transmissibility; the authors estimated that from 1979 to 2008, median set-point viral load increased by about 0·4 log10 viral RNA copies per mL (from 4·05 to 4·5 log10 copies per mL), and CD4 cell counts at seroconversion decreased by about 200 cells per μL (from about 770 to about 570 cells per μL). Notably, trends in all three markers plateaued beginning around 2002, after which HIV virulence seems to decrease. (From Jules: see full text of the original publication below - "Rates of CD4 cell loss were stable up to 1996, becoming faster between 1996 and 2002 and returning to slower rates thereafter (figure 2)......Figure 1: Estimated CD4 cell count at seroconversion by calendar year of seroconversion......Figure 3: Estimated plasma viral load set-point by calendar year of seroconversion")
The analysis included 15 875 HIV-infected individuals, contributing a total of 110 168 CD4 cell counts and 88 205 HIV plasma viral load measurements, making this study by far the largest of its kind to date. The individuals represented about 62% of CASCADE, excluding children, individuals without CD4 or viral load data before AIDS or ART initiation, individuals who seroconverted after 2008 (due to insufficient follow-up time to estimate marker values), and the two African cohorts in CASCADE. CD4 cell count slopes and viral load trajectories were calculated for each individual, with point estimates for set-point viral load and CD4 cell count subsequently used for tests of association with seroconversion date. Potential confounders included age at seroconversion, sex, transmission risk group, ethnic origin, year entered into the cohort, and method of seroconversion ascertainment. Sensitivity analyses of potential confounders (viral load assay; censoring because of AIDS, death, or ART initiation; risk subgroups; length of intervals between HIV tests) revealed similar trends.
No consensus exists on whether HIV virulence has changed, and reports from among more than 30 published studies have shown stable,4, 5 increasing,6, 7 and decreasing virulence.8, 9 Perhaps the results from the analysis by Pantazis and colleagues3 will generate such consensus. First, the study was substantially larger than the next largest of its kind (2174 individuals in the US Military HIV Natural History Study10), and included 26 unique HIV cohorts from western Europe, Australia, and Canada. Trends estimated from a large combined database such as this are likely to be less biased than individual cohorts or study populations. Second, when Pantazis and colleagues3 assumed a strictly linear relation between seroconversion date and set-point viral load, the estimated trend was 0·016 log10 copies per mL per year (95% CI 0·013-0·019; p<0·0001). This trend is consistent with the summary estimate from a recent meta-analysis of HIV prognostic markers,11 0·013 log10 copies per mL per year (95% CI -0·001 to 0·027; p=0·07).
If the observed set-point viral load and CD4 cell count trends are an accurate and unbiased reflection of HIV virulence changes, what is the most likely biological explanation? Pantazis and colleagues3 offer the possibility for adaptive viral evolution toward an optimum level of virulence that balances transmission with mortality.12 The apparent plateau in virulence starting in 2002 is consistent with adaptive evolution of HIV virulence in the early stages of an epidemic.13, 14 Alternatively, widespread use of combination ART might modify the optimum virulence, either by shortening the available timeframe for transmission and thus selecting more virulent viruses, or through the preferential early treatment of individuals with high set-point viral load (likely to be symptomatic) and thus selecting for decreased virulence.
The public health implication of increasing HIV virulence, manifested by higher set-point viral load and lower CD4 cell count at seroconversion, is that infectiousness has been increasing since the beginning of the epidemic. This observation supports the implementation of more frequent testing among people at risk and earlier ART initiation, and the need for an HIV vaccine15 or other interventions to reduce set-point viral load and the viral reservoir.
Temporal trends in prognostic markers of HIV-1 virulence and transmissibility: an observational cohort study
Lancet HIV Nov 18 2014
Nikos Pantazis, Kholoud Porter, Dominique Costagliola, Andrea De Luca, Jade Ghosn, Marguerite Guiguet, Anne M Johnson, Anthony D Kelleher,
Charles Morrison, Rodolphe Thiebaut, Linda Wittkop, Giota Touloumi, on behalf of CASCADE Collaboration in EuroCoord
Published Online: 17 November 2014
"These results suggest that time needed to cross the 350 CD4 cells per μL threshold decreased by up to about 50% (ie, from a mean of 7·0 years for a person who seroconverted in 1980 to 3·4 years for a person with the same characteristics who seroconverted in 2004). The potentially halved time to AIDS indicates a change in the natural history of HIV infection from that in the pre-antiretroviral era and data from individuals infected in the 1980s and 1990s.20 Moreover, according to the estimates of Lingappa and colleagues,15 our estimated increase of 0·4 log10 copies per mL in set-point plasma viral load corresponds to a potential 44% increase in virus transmissibility."
Measures of CD4 T-cell count and HIV-1 plasma viral load before antiretroviral therapy are proxies for virulence. Whether these proxies are changing over time has implications for prevention and treatment. The aim of this study was to investigate those trends.
Data were derived from the Concerted Action on SeroConversion to AIDS and Death in Europe (CASCADE) collaboration of mainly European seroconverter cohorts. Longitudinal CD4 cell counts and plasma viral load measurements before the initiation of antiretroviral therapy or AIDS onset were analysed by use of linear or fractional polynomials mixed models adjusting for all available potential confounders. Calendar time effects were modelled through natural cubic splines.
15 875 individuals seroconverting from 1979 to 2008 fulfilled the inclusion criteria; 3215 (20·3%) were women; median follow-up was 31 months (IQR 14-62); dropout before starting antiretroviral therapy or AIDS onset was 8·1%. Estimated CD4 counts at seroconversion for a typical individual declined from about 770 cells per μL (95% CI 750-800) in the early 1980s to a plateau of about 570 cells per μL (555-585) after 2002. CD4 cell rate of loss increased up to 2002. Estimated set-point plasma viral loads increased from 4·05 log10 copies per mL (95% CI 3·98-4·12) in 1980 to 4·50 log10 copies per mL (4·45-4·54) in 2002 with a tendency of returning to lower loads thereafter. Results were similar when we restricted analyses to various subsets, including adjusting for plasma viral load assay, censored follow-up at 3 years, or used variations of the main statistical approach.
Our results provide strong indications of increased HIV-1 virulence and transmissibility during the course of the epidemic and a potential plateau effect after about 2002.
European Union Seventh Framework Programme.
HIV-1 is characterised by great genetic diversity.1 Its continuous evolution, since the beginning of the pandemic, has given rise to many subtypes and circulating recombinant forms.2 The widespread use of and selective pressure imposed by antiretroviral therapy has also contributed to this diversity with drug-resistant mutations spreading.3
Evidence suggests that different HIV-1 strains might differ in virulence (ie, capacity of virus to cause disease).4 However, despite implications for treatment and prevention, whether virulence has changed over time is unclear. Several studies attempting to answer this question have had conflicting results-showing that virulence is decreasing,5, 6, 7 stable,8, 9, 10 or increasing11, 12, 13 over time.
Older studies have used direct measures of virulence, such as time between infection and AIDS or death.5, 6, 7, 8 The introduction of combination antiretroviral therapy in 1995 and substantial decreases in mortality and progression to AIDS have rendered this approach obsolete. Therefore, most studies have assessed virulence through proxies such as set-point plasma viral load, CD4 T-cell count at seroconversion, and, more rarely, CD4 cell decline rate.9, 10, 11, 12, 13 Because these markers are significantly correlated with the rate of disease progression in the absence of antiretroviral therapy,14 higher set-point plasma viral load, lower initial CD4 cell counts, and faster CD4 cell loss are usually interpreted as indicative of increased HIV-1 virulence. Moreover, as plasma viral load is associated with risk of viral transmission,15 the study of temporal trends in set-point plasma viral load is of additional interest.
We used Concerted Action on SeroConversion to AIDS and Death in Europe (CASCADE) data16 to estimate whether these measures have changed over the previous 30 years.
CASCADE is a collaboration of 28 cohorts of individuals with well estimated dates of HIV seroconversion (seroconverters).16 We used data pooled in September, 2011, within EuroCoord. All collaborating cohorts received approval from their regulatory or national ethics review boards (appendix).
Seroconversion dates were estimated as the midpoint between the last documented negative and first positive HIV antibody test dates for most participants (84·6%) with the interval between tests being 3 years or less. For the remaining individuals, seroconversion date was estimated through laboratory methods (PCR positivity in the absence of HIV antibodies or antigen positivity with four or fewer bands on western blot), or as the date of seroconversion illness with both an earlier negative and a later positive HIV test done within a time interval of 3 years or less. All plasma viral load and CD4 testing was done within laboratories in industrialised countries.
Eligible individuals were those who seroconverted by Dec 31, 2008, while aged 15 years or older, and with one or more CD4 cell counts and plasma viral load measurements available before the initiation of antiretroviral therapy or clinical AIDS onset. We excluded more recent seroconverters because their follow-up would not be long enough to accurately estimate their set-point plasma viral load or their CD4 cell loss. Children were excluded because they are known to have different CD4 cell counts compared with older HIV-positive individuals. Seroconverters from two African cohorts were also excluded because treatment guidelines and availability of antiretroviral therapy differ substantially in Africa,17 which results in different patterns of follow-up, initiation of antiretroviral therapy, and AIDS onset, from those in resource-rich countries. Moreover, in HIV-1 infected individuals, CD4 cell counts during natural history are substantially different in sub-Saharan Africa compared with Europe.18 Thus, our analyses are based on data from cohorts in western European countries, Canada, and Australia. CD4 cell counts and plasma viral load measurements taken after initiation of antiretroviral therapy or clinical AIDS were excluded from all analyses.
Exploratory data analyses for CD4 cell counts were based on available measurements within 3-6 months after the estimated seroconversion date, or the estimated (through patient-specific regressions) initial values and slopes. Plasma viral load exploratory analyses were based on patient-specific estimates of the set-point viral load derived through the available measurements taken 6-18 months after seroconversion. Longitudinal CD4 cell counts and plasma viral load measurements were analysed with mixed models with square-root and log10 transformations to normalise distributions, respectively. Average change in CD4 cell count was assumed linear (on the square-root scale), as has been shown in most natural history studies, whereas for plasma viral load a fractional polynomial was used to capture non-linearity because plasma viral load concentrations tend to decrease more rapidly in the first months after seroconversion, stabilising or increasing slowly thereafter.19
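The patient-specific regressions mentioned above can be sketched in a few lines. This is an illustrative example on one hypothetical patient (the starting CD4 level, decline rate, noise, and visit schedule are all assumptions for the sketch): fitting a straight line to square-root-transformed CD4 counts yields an estimated initial value and slope for that person.

```python
import numpy as np

rng = np.random.default_rng(1)

def patient_estimates(times, cd4_counts):
    """Fit sqrt(CD4) ~ a + b*t for one person; return the estimated
    CD4 count at seroconversion (a**2) and the sqrt-scale slope b."""
    b, a = np.polyfit(times, np.sqrt(cd4_counts), deg=1)
    return a ** 2, b

# Hypothetical patient: CD4 near 650 cells/uL at seroconversion,
# declining linearly on the square-root scale, measured twice a year.
t = np.arange(0, 5, 0.5)                      # years since seroconversion
sqrt_cd4 = np.sqrt(650) - 1.3 * t + rng.normal(0, 0.5, size=t.size)
cd4 = sqrt_cd4 ** 2

baseline, slope = patient_estimates(t, cd4)
print(f"estimated CD4 at seroconversion: {baseline:.0f} cells/uL, "
      f"sqrt-scale slope: {slope:.2f} per year")
```

The square-root transformation plays the same normalising role here as in the study's mixed models; the actual analysis pools all patients in one model rather than fitting each separately.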
Calendar time of seroconversion was allowed to affect initial marker concentrations and slopes through a four-knot natural cubic spline. This choice was based on visual and formal assessment of results from the exploratory analysis and the fit of similar models with varying number of knots (two to eight).
The list of potential confounders included sex, age at seroconversion, risk group, ethnic origin, time enrolled into the constituent CASCADE cohort, and method of seroconversion determination combined with the length of the interval between HIV tests (greater than 12 weeks vs 12 weeks or less). CD4 cell counts and plasma viral load set-point (mean 1 year after seroconversion) related predictions were derived from the fitted models. Details of the statistical methods are given in the appendix.
Similar models were fitted to the subgroup of white men, infected through sex between men (MSM), the subgroup of individuals with a midpoint method of seroconversion determination and a test interval less than 180 days, and data with artificial censoring at 3 years after seroconversion (to minimise the potential effects of censoring because of AIDS, death, or initiation of antiretroviral therapy). These subsets will be referred to as white MSM, short HIV test interval, and censored at 3 years, respectively.
Additional sensitivity analyses were done: the plasma viral load model was refitted with adjustment for the quantification assays, and again after transformation of measurements onto a common scale using published formulas; models were refitted after exclusion of individuals whose region of origin was recorded as a non-industrialised country; alternative statistical methods of lower and higher complexity were used; and the main analysis was repeated stratified by sex.
Role of the funding source
The funder had no role in study design, data collection, data analysis, data interpretation, or writing of the report. The corresponding author had full access to all the data in the study, and had final responsibility for the decision to submit for publication.
Of 25 629 individuals in CASCADE, 9754 were excluded from all analyses (79% because of a lack of the required CD4 cell count and plasma viral load measurements). Compared with those included in the analyses, excluded individuals were more likely to have been infected in earlier years or through injection drug use, and more likely to be women, non-white, and slightly younger.
Median overall follow-up of the 15 875 included individuals (table 1) was 31 months (IQR 14-62) and 1287 (8·1%) individuals dropped out of the study before initiating antiretroviral therapy or progressing to AIDS. The highest proportion of women was recorded in the early 1990s and fell thereafter. Median age at seroconversion was higher in recent seroconverters. Infection through injection drug use was less likely in recent years, whereas infection arising in MSM was more likely. Ethnic origin was unknown for a substantial proportion of patients, and trends could not be analysed. The proportion with laboratory evidence of acute seroconversion increased over time, whereas the length of the seroconversion test interval was stable.
The individuals included contributed a total of 110 168 CD4 cell count measurements (table 2). Exploratory data analyses showed a negative association between CD4 cell count and calendar year of seroconversion, whereas trends of CD4 cell count slopes were unclear. Estimated CD4 cell count at seroconversion steadily decreased from about 770 cells per μL (95% CI 750-800) in the early 1980s to a plateau of about 570 cells per μL (555-585) after 2002 for a typical individual (figure 1). Assuming a linear effect of calendar time on initial CD4 cell count for the whole study period, the estimated decrease per year was 0·162 (95% CI 0·144-0·179) on the square-root scale corresponding to between 7·5 cells per μL (6·8-8·3) and 9·1 cells per μL (8·0-10·1) lower baseline CD4 cell count for each subsequent calendar year (p<0·0001). Restriction of analyses to the white MSM, short HIV test interval, and censored at 3 years subsets yielded similar results (figure 1).
Rates of CD4 cell loss were stable up to 1996, became faster between 1996 and 2002, and returned to slower rates thereafter (figure 2). Assuming a linear effect of calendar time of seroconversion on the rate of CD4 cell loss, the estimated change in CD4 cell slope (square-root scale) per calendar year was -0·011 (cells per μL)1/2 per year (95% CI -0·016 to -0·007). This corresponds to the addition of about 6 CD4 cells per μL (3-8) to the yearly rate of CD4 cell loss for each subsequent decade of seroconversion (p<0·0001). Restriction of the analysis to the white MSM and short HIV test interval subsets gave similar results, although the white MSM subset showed no return to less marked decay rates after 2002 (figure 2). Results from the censored at 3 years subset (figure 2) were similar to those of the main analysis except for a downward shift of the estimated CD4 cell slopes, indicating faster CD4 cell loss across the whole study period.
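The square-root-scale figures above can be translated into cells per μL with simple arithmetic. In this sketch, the reference CD4 levels (roughly 770 and 570 cells per μL, the study's own baseline estimates at either end of the period, and a round 600 cells per μL as an intermediate value) are assumptions chosen to reproduce the ranges quoted in the text:

```python
import math

def yearly_loss(cd4_ref, sqrt_decline):
    """Cells/uL lost when sqrt(CD4) falls by sqrt_decline,
    evaluated at a reference CD4 level: 2*sqrt(cd4)*d - d**2."""
    s = math.sqrt(cd4_ref)
    return s ** 2 - (s - sqrt_decline) ** 2

# Baseline-CD4 trend: 0.162 lower on the sqrt scale per calendar year.
print(yearly_loss(770, 0.162))  # ~9 cells/uL lower per later calendar year
print(yearly_loss(570, 0.162))  # ~7.7 cells/uL

# Slope trend: -0.011 sqrt units per year of follow-up per calendar year,
# i.e. about -0.11 per decade of seroconversion; at ~600 cells/uL this
# adds roughly 5-6 cells/uL to the annual rate of loss.
print(yearly_loss(600, 0.11))
```

Because the conversion depends on the CD4 level at which it is evaluated, a single sqrt-scale slope maps to a range on the cells-per-μL scale, which is why the text reports a range (7·5 to 9·1 cells per μL).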
Similar trends of both CD4 cell counts at seroconversion and CD4 cell loss rate were recorded when the analysis was restricted to people originating from industrialised countries. Variations of the main statistical methods led to similar results. The stratified-by-sex analysis revealed that trends of CD4 cell count at seroconversion were comparable between men and women but changes of CD4 cell rate of reduction over time were more pronounced in women (appendix).
We analysed 88 205 HIV-RNA plasma viral load measurements (table 2). Initial exploratory analyses revealed a trend of increasing plasma viral load set-points among recent seroconverters with indications for a potential plateau effect after about 2002.
Estimated plasma viral load set-points for a typical individual (figure 3) increased from 4·05 log10 copies per mL (95% CI 3·98-4·12) in 1980 to 4·50 log10 copies per mL (4·45-4·54) in 2002. Set-point viral loads seemed stable between 2002 and 2006 but with a tendency for returning to lower loads thereafter. Assuming a linear effect of calendar time on the set-point viral loads, the estimated increase per subsequent decade of seroconversion was 0·16 log10 copies per mL (95% CI 0·13-0·19; p<0·0001). Restriction of the analysis to the white MSM, short HIV test interval, and censored at 3 years subsets led to similar results but with slightly less pronounced changes over time for the last two cases (figure 3).
Adjustment for differences in plasma viral load assays, fitting of the same model to individuals from industrialised countries, and use of variations of the main statistical methods did not change the main findings. Stratification of the analysis by sex showed that changes over time in set-point plasma viral load were similar to those found in the main analysis in both men and women (appendix).
During 1979-2008, CD4 cell counts after seroconversion decreased by almost 200 cells per μL and set-point plasma viral load increased by about 0·4 log10 copies per mL. Rate of CD4 cell loss also showed signs of increasing in that period. These results suggest that time needed to cross the 350 CD4 cells per μL threshold decreased by up to about 50% (ie, from a mean of 7·0 years for a person who seroconverted in 1980 to 3·4 years for a person with the same characteristics who seroconverted in 2004). The potentially halved time to AIDS indicates a change in the natural history of HIV infection from that in the pre-antiretroviral era and data from individuals infected in the 1980s and 1990s.20 Moreover, according to the estimates of Lingappa and colleagues,15 our estimated increase of 0·4 log10 copies per mL in set-point plasma viral load corresponds to a potential 44% increase in virus transmissibility.
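These two headline numbers can be checked with back-of-the-envelope arithmetic. Assuming a linear decline on the square-root scale and taking the paper's approximate baselines (770 cells per μL in 1980, 570 in 2004) with the reported times to reach 350 cells per μL, one can back out the implied decline rates; and assuming a relative transmission risk of about 2·5 per log10 copies per mL (a round figure in the range of Lingappa-type estimates — the exact value is an assumption here), the 0·4 log10 rise maps to roughly a 44% increase in transmissibility:

```python
import math

def sqrt_slope(cd4_start, cd4_threshold, years):
    """Sqrt-scale decline rate implied by reaching the threshold in `years`."""
    return (math.sqrt(cd4_start) - math.sqrt(cd4_threshold)) / years

# Implied sqrt-scale slopes for the 1980 and 2004 scenarios in the paper.
slope_1980 = sqrt_slope(770, 350, 7.0)   # ~1.29 per year
slope_2004 = sqrt_slope(570, 350, 3.4)   # ~1.52 per year
print(slope_1980, slope_2004)

# Transmissibility: assumed relative risk per log10 copies/mL.
rr_per_log10 = 2.5
increase = rr_per_log10 ** 0.4 - 1
print(f"{increase:.0%}")   # ~44%
```

The difference between the two implied slopes (about 0·23 sqrt units per year over 24 years) is consistent with the paper's estimated calendar-time change in slope of -0·011 sqrt units per year.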
Findings from previous studies5, 6, 7, 8, 9, 10, 11, 12, 13 have been discordant for several reasons (panel). Most studies estimate CD4 cell counts at seroconversion and set-point plasma viral load by averaging marker measurements within a specific time interval. These techniques might suffer from selection bias and reduced power introduced by the irregular timing of the marker measurements and their high variability within the first year after seroconversion. Many studies had no information about seroconversion dates, which obviously precludes an accurate estimation of set-point plasma viral load and baseline CD4 cell counts. Diversity in the time periods covered by these studies, the availability and handling of important confounders, and the use of different statistical methods could, at least partly, explain the variation in results. A recent meta-analysis21 examined patterns of baseline CD4 cell count (at seroconversion, if available, otherwise at HIV diagnosis) and set-point plasma viral load. Even though most of the included studies were of individuals with unknown dates of HIV seroconversion, pooled estimates from this meta-analysis, extrapolated over a 30 year period, were close to our own.
Research in context
We initially searched PubMed from Jan 1, 1990, to July 15, 2014, for English language publications with the following query: HIV AND ("set point" OR "CD4 count" OR "virulence") AND ("trends" OR "evolution" OR "temporal"). The search yielded 435 articles and we identified those that addressed changes in HIV-1 virulence over time either directly (ie, through survival or AIDS-free time)5, 6, 7, 8 or indirectly through post-seroconversion CD4 cell counts and slope or viral load set-point.9, 10, 11, 12, 13 The search was expanded by browsing related citations including those from virulence studies. A recent meta-analysis21 was also thoroughly checked for potential additions to the list of studies with objectives comparable with those of our work. Findings from these studies were not consistent because some suggested an increase11, 12, 13 and others a decrease in HIV-1 virulence over time.5, 6, 7 Some studies did not detect any significant temporal changes.8, 9, 10 Results from the meta-analysis of Herbeck and colleagues21 were consistent with increased virulence of HIV-1 over the course of the epidemic. Our results showed that the HIV viral set-point has increased by about 0·5 log10 copies per mL during the 1980s and 1990s and has become stable thereafter. During the same period, post-seroconversion CD4 cell counts have decreased by about 200 cells per μL.
In this study, the largest to investigate changes in HIV virulence, we show systematic changes in two of the most important prognostic markers of HIV infection in the last 30 years, consistent with increased HIV-1 virulence and transmission risk. Our results and those of other studies are in agreement with the hypothesis of adaptive evolution of HIV-1 in the human population.22, 23, 24 These findings emphasise the important need for early HIV testing and engagement with care given the implications for treatment and prevention.
To our knowledge, this study is the largest seroconverter study exploring this issue, has very wide calendar times of seroconversion coverage, and uses a unified approach to explore temporal trends in the proxies of interest. We used models that included adjustments for several factors but the available data do not include all cofactors that potentially modulate disease progression and thus, as in all observational studies, residual confounding cannot be excluded.
A potential source of bias in our study is the diversity of assays and platforms used to quantify plasma viral load and CD4 cell count over the course of the epidemic across the different collaborating cohorts. However, over time viral load assays have targeted lower thresholds of quantification, whereas in our study, most of the measured plasma viral loads were at much higher concentrations where the concordance between assays is high.25 Moreover, according to a study that compared older with modern, more sensitive assays,26 set-point plasma viral load measurements by the former were slightly higher, which if true, would mean that our estimates are conservative. Thus, a systematic shift over time towards overestimation of plasma viral load seems an unlikely explanation for our findings. Relevant sensitivity analyses yielded practically the same results as the main analysis. Similarly, a systematic bias towards underestimation of CD4 cell count in more recent years does not seem likely.
Deterioration of HIV-RNA in stored samples is another potential issue: HIV-RNA quantification has been introduced into the clinical routine since 1996, thus most of the plasma viral load measurements for the initial years of the epidemic were done retrospectively on frozen samples. However, the increasing trends in set-point plasma viral load seen in our study continue after 1996, and the estimates of the deterioration rate in frozen samples are low,27 and are not enough to fully explain changes before 1996.
Another issue is the high percentage of participants with missing information about HIV-1 subtype (72%) or ethnic origin (54%) in our data, because these factors can affect baseline CD4 cell count and plasma viral loads and how they change over time.18, 28 The increasing proportion of non-white individuals (mainly of African ancestry) infected with non-B HIV-1 subtypes, and the likely changing socioeconomic profile of our study population could have some effect on our results. However, when we refitted our models restricted to white MSM, a group most probably dominated by subtype B infection, or to individuals originating from industrialised countries, we found no major deviations from our main findings.
Our study included data only from seroincident cohorts because all participants have well estimated dates of seroconversion. Thus, biases that could result from individuals entering the cohorts at different stages of the disease are unlikely. However, there are differences in the methods used to determine the seroconversion date in our study with laboratory evidence of acute infections being more frequent in recent years. These differences could lead to an increased proportion of individuals entering the cohorts with very high plasma viral loads. This mechanism though cannot fully explain the observed trends because the models we used included adjustments for the method of seroconversion determination and the length of the interval between the last negative and first positive HIV test, which is a marker for acute infection.29 Additionally, the use of all available measurements from all study participants within models that take into account the longitudinal evolution of plasma viral load and CD4 cell count from seroconversion onwards should be protective against biases caused by differences in the accuracy of the seroconversion date determination or the entry time to the studies. Relevant sensitivity analyses gave results that were similar to those obtained by the main analysis. However, representativeness of patients included in the CASCADE cohorts cannot be fully assessed, so potential systematic changes in the sampled population cannot be excluded as a potential source of bias.
Finally, as in most prospective observational studies, there is the issue of censoring or premature termination of measurements. In the earlier years of the epidemic, AIDS and death were the primary sources of censoring, whereas in more recent years censoring occurs mainly because of the initiation of antiretroviral therapy. AIDS and death are most probably informative censoring events,19 whereas censoring due to initiation of antiretroviral therapy can probably be ignored when mixed models are used because it is based mainly on already observed marker values.30 Thus, the exclusion of measurements after the start of antiretroviral therapy should not, in theory, introduce bias in our results. However, antiretroviral therapy guidelines, provider practices, and patients' willingness to initiate antiretroviral therapy have changed during the past two decades as treatment regimens have become more potent and less toxic. Censoring mechanisms will probably have a minimal effect on estimates of baseline marker concentrations.19 A relevant sensitivity analysis showed that estimated temporal trends of baseline CD4 cell counts and set-point viral load remained practically the same as those obtained from the main analysis.
Most of the participants in our study had access to combination antiretroviral therapy immediately after seroconversion, so AIDS and death rates were very low. Direct assessment of HIV-1 virulence based on hard clinical events was not possible so we used a pragmatic approach with the use of marker-based proxies associated with the rate of disease progression. However, values of such proxies at the individual level are the result of complex interactions between viral, host, and environmental factors.
Non-viral factors have been well studied but the contribution of the virus genotype to HIV virulence has only recently been shown with results from several studies suggesting substantial heritability (ie, the proportion of variance in the viral load that is explained by viral genetic factors) of varying magnitude.31 HIV-1 might have evolved towards achieving a set-point viral load that maximises its overall transmission potential by reaching an optimum trade-off between infectiousness and host death.22 Our estimates of viral load for the past decade are close to the estimated optimum (4·52 log10 copies per mL) and are also indicative of a potential plateau effect. HIV-1 mutants selected under the pressure of antiretroviral therapy might be associated with higher plasma viral load and lower CD4 cell counts.32 Despite our efforts to control for non-viral factors, our findings do not provide direct evidence for attributing the observed trends to virus evolution but are consistent with an increased virulence because of adaptive evolution of HIV-1 in the human population or selective pressure from antiretroviral therapy.22, 23, 24, 32, 33
These findings have important public health implications that emphasise the need for increased awareness of one's HIV status through intensification of testing programmes because higher viraemia is associated with increased risk of transmission. Furthermore, the substantially shortened time between HIV-1 infection and the crucial CD4 cell count threshold of 350 cells per μL, at which there is universal agreement that combination antiretroviral therapy should be initiated, emphasises that individuals at risk of acquiring HIV cannot afford to delay testing for HIV. Continued monitoring of these trends and more studies in other populations and geographical regions are needed to detect further changes and to assess the consistency of our findings.