Conference Reports for NATAP
 
Conference on Retroviruses and Opportunistic Infections (CROI)
February 13-16, 2017, Seattle, WA
 
 
 
The Kidney at CROI 2017
 
 
Christina Wyatt, MD - Associate Professor, Medicine/Nephrology, Icahn School of Medicine at Mount Sinai, New York, NY
 
Gilead presented 144-week data on the safety of TAF versus TDF co-formulated with elvitegravir/cobicistat/emtricitabine (E/C/F) from studies 104 and 111 (Arribas et al, poster 453). Consistent with previously published findings at weeks 48 and 96, E/C/F/TAF had a more favorable impact on the urine protein:creatinine ratio (UPCR) and on the urine biomarkers retinol binding protein and beta2-microglobulin. There were also more investigator-reported renal adverse events leading to discontinuation before 144 weeks in the E/C/F/TDF arms (11 versus 0), including 4 cases of proximal tubulopathy. Discontinuation for loss of bone mineral density was also more common with E/C/F/TDF (6 versus 0). The majority of these clinical events occurred after week 48. Because renal adverse events were very rare in the pre-marketing trials of TDF, these findings raise the question of whether cumulative toxicity simply became more apparent with the longer follow-up in the recent trials, or whether the risk of TDF toxicity is higher in the setting of cobicistat boosting and the resulting increase in tenofovir exposure.
 
In a real-world cohort of ART-treated patients, co-administration of TDF with elvitegravir/cobicistat was associated with significantly higher plasma tenofovir concentrations than co-administration of TDF with other antiretroviral drug classes, including ritonavir-boosted PIs (Gervasoni et al, poster 411). Consistent with prior PK studies and with the proposed mechanism for the improved safety of TAF versus TDF, a small cross-over study demonstrated that plasma tenofovir concentrations decreased substantially in participants switched from TDF to TAF (both co-formulated with E/C/F), while intracellular levels of tenofovir diphosphate were higher after the switch (Podany et al, poster 408).
 
Drug-drug interactions between TDF and ledipasvir/sofosbuvir were evaluated in secondary analyses of two clinical trials, SWIFT-C and ION-4. In both trials, co-administration of ledipasvir/sofosbuvir with TDF was associated with increased tenofovir exposure. In the SWIFT-C trial, concomitant use of the direct-acting antiviral combination ledipasvir/sofosbuvir with TDF increased both the plasma concentration and the intracellular concentration of tenofovir in red blood cells (MacBrayne et al, poster 404). The observed increase in tenofovir concentration from baseline to week 8 was nearly 2-fold in plasma and 16-fold in red blood cells; the clinical significance of red cell loading is not known. While the prescribing information currently recommends avoiding the combination of TDF and a boosted PI with ledipasvir/sofosbuvir, these and other studies suggest that the risk of TDF toxicity may be substantially increased even in the absence of a PI. In a post hoc analysis of data from ION-4, levels of the proximal tubular biomarkers retinol binding protein and beta2-microglobulin correlated with tenofovir exposure as measured by area under the curve (AUC), and baseline levels of both biomarkers were predictive of incident proteinuria during ledipasvir/sofosbuvir treatment (Chan et al, poster 138). This analysis also demonstrated a high incidence of grade 1 proteinuria, suggesting the potential for clinically relevant toxicity in less selected "real world" populations.
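For readers less familiar with the metric, AUC summarizes total drug exposure as the integral of plasma concentration over the dosing interval, and it is commonly estimated from serial concentration measurements with the trapezoidal rule. The sketch below is purely illustrative, with invented timepoints and concentrations; it is not the PK method reported in these abstracts.

# Illustrative only: estimate plasma tenofovir AUC over a dosing interval
# using the linear trapezoidal rule. Timepoints and concentrations are
# invented; the SWIFT-C and ION-4 analyses' actual PK methods are not
# described here.

def auc_trapezoidal(times_h, concs_ng_ml):
    """Linear trapezoidal AUC (ng*h/mL) from paired time/concentration lists."""
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += dt * (concs_ng_ml[i] + concs_ng_ml[i - 1]) / 2.0
    return auc

# Hypothetical 24-hour plasma profile after a TDF dose (ng/mL)
times = [0, 1, 2, 4, 8, 12, 24]
concs = [40, 300, 250, 180, 110, 80, 45]
print(f"AUC0-24 ~ {auc_trapezoidal(times, concs):.0f} ng*h/mL")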
 
Prior studies of potential genetic susceptibility to TDF toxicity have largely focused on polymorphisms associated with clinical evidence of toxicity, such as proximal tubulopathy. A small pharmacogenomic study from the UK instead focused on polymorphisms that predict plasma tenofovir exposure, identifying a polymorphism in the ABCG2 gene that was significantly associated with lower tenofovir concentrations in both plasma and urine (Bracchi et al, poster 418). In this sample, co-administration of elvitegravir/cobicistat was also associated with lower urine tenofovir concentrations compared to co-administration of raltegravir; the authors did not detect a significant difference in plasma tenofovir concentrations between the two regimens, perhaps because of the small sample size.
 
Several observational cohorts evaluated the impact of long-term ART use on kidney function. In an analysis of data from nearly 17,000 participants followed in UK-CHIC, both TDF and atazanavir were associated with a negative eGFR slope. After adjusting for demographics, nearly 16% of participants on TDF and 22% of those on atazanavir experienced a rapid decline in eGFR, defined as an eGFR decline > 3 mL/min/1.73m2 per year (Hamzah et al, poster 650). Factors independently associated with rapid eGFR decline (> 3 or > 5 mL/min/1.73m2 per year) were evaluated in multivariate logistic regression models, with separate models for participants on TDF and for participants on atazanavir; 3,378 participants receiving both TDF and atazanavir were included in both models. Factors that were significantly and consistently associated with rapid eGFR decline in the adjusted analyses included black race and lower baseline eGFR, both traditional risk factors for CKD progression. Unfortunately, the analysis could not be adjusted for other traditional risk factors such as diabetes and hypertension, or for concomitant non-ART medications, because these data are not collected in UK-CHIC. In addition, the atazanavir group included participants on both boosted atazanavir/r and unboosted atazanavir, so ritonavir boosting could not be considered as a potential confounder. Among participants on TDF (n=16,172), co-administration of atazanavir, darunavir, or lopinavir/r was also associated with rapid eGFR decline, regardless of the definition used. Among participants on atazanavir, the majority received concomitant TDF (3,378 versus 784 on non-TDF regimens). Concomitant use of TDF was significantly associated with the risk of eGFR decline > 3 mL/min/1.73m2 per year among participants on atazanavir, but was not an independent predictor of the more rapid eGFR decline > 5 mL/min/1.73m2 per year.

The authors did not find an association between rapid eGFR decline and mortality in this study. There are several possible explanations for this unexpected finding, including methodological challenges inherent in this type of analysis. Because the link between low eGFR and mortality is largely driven by increased cardiovascular mortality, the young age of the cohort (mean age 37 years) may also have masked an effect. Alternatively, it is possible that drug-induced declines in eGFR do not have the same clinical significance with respect to non-renal outcomes as eGFR decline secondary to traditional CKD risk factors.
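As a point of reference, the "rapid decline" definitions used in this analysis are based on the annualized slope of serial eGFR measurements. The sketch below is a minimal illustration of how such a flag can be derived from longitudinal eGFR values; it is not the UK-CHIC analysis, and the patient values are invented.

# Minimal illustration (not the UK-CHIC analysis): estimate an annualized
# eGFR slope by ordinary least squares and flag "rapid decline" at the
# thresholds cited in the abstract (> 3 or > 5 mL/min/1.73m2 lost per year).

def egfr_slope_per_year(years, egfr_values):
    """Ordinary least-squares slope of eGFR (mL/min/1.73m2) versus time (years)."""
    n = len(years)
    mean_t = sum(years) / n
    mean_e = sum(egfr_values) / n
    num = sum((t - mean_t) * (e - mean_e) for t, e in zip(years, egfr_values))
    den = sum((t - mean_t) ** 2 for t in years)
    return num / den

# Hypothetical patient with eGFR measured yearly over 4 years of follow-up
years = [0, 1, 2, 3, 4]
egfr = [98, 95, 90, 86, 82]
slope = egfr_slope_per_year(years, egfr)
print(f"eGFR slope: {slope:.1f} mL/min/1.73m2 per year")
print("Rapid decline (> 3 per year):", slope < -3)
print("Rapid decline (> 5 per year):", slope < -5)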
 
[From Jules: and then there is this: Atazanavir & Creatinine - (02/26/17) ]
 
In contrast, a smaller retrospective cohort from Australia demonstrated no association between current or cumulative TDF use and the development of CKD, defined as a confirmed GFR < 60 mL/min/1.73m2 (Woolnough et al, poster 687). Interestingly, only 56% of the cohort was using TDF at baseline, which is lower than in many other studies and may suggest more judicious use of TDF in patients at higher risk of toxicity. Traditional risk factors for CKD progression, including diabetes, older age, lower baseline GFR, and pre-existing proteinuria, were strong predictors of a confirmed GFR < 60 mL/min/1.73m2. Previously published CKD risk scores derived in the D:A:D cohort and in the US Veterans population were also predictive of CKD risk in this population.
 
While several ritonavir-boosted protease inhibitors, including atazanavir, lopinavir, and indinavir, have been reproducibly linked to lower eGFR in multiple studies, the effect of darunavir has been less clear. An analysis from the D:A:D cohort demonstrated no increased risk of clinically significant eGFR decline associated with cumulative use of darunavir/r (Ryom et al, poster 653). The previously observed risk associated with atazanavir/r was confirmed. A smaller effect size in later years of the study period may reflect increasing awareness and more frequent discontinuation of atazanavir in individuals with, or at increased risk of, kidney function decline, as defined by the D:A:D CKD risk score.
 
Data on the renal safety of ART in adolescents and in pregnant women are scarce. In a small cohort of perinatally infected adolescents and young adults and HIV-uninfected controls followed prospectively at the NIH, urine protein:creatinine and urine albumin:creatinine ratios were significantly higher in HIV-infected participants than in controls (Mattingly et al, poster 649). While there was no significant decline in eGFR overall, decline in eGFR during follow-up was modestly correlated with cumulative use of TDF (r=-0.42, p=0.02). In a prospective cohort of 246 Malawian women newly diagnosed with HIV during pregnancy, baseline creatinine clearance was normal in nearly all women (Melhado et al, poster 686). Because creatinine clearance is known to substantially overestimate true kidney function in the setting of pregnancy, it is difficult to evaluate the clinical relevance of the observed decline in creatinine clearance over 11 months of TDF-containing ART. There were no clinical events requiring TDF discontinuation, but after 11 months more than 11% of the population had a mildly decreased creatinine clearance of < 90 mL/min. These data do not strongly support the authors' conclusion that clinical monitoring in the absence of laboratory testing is adequate to minimize the risk of TDF kidney toxicity; however, the individual and public health benefits of increasing access to ART likely outweigh any potential risk of kidney injury.
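For context, creatinine clearance in cohorts like this one is often estimated with the Cockcroft-Gault equation, although the abstract does not specify which method was used. If Cockcroft-Gault was applied, the formula itself suggests why pregnancy inflates the estimate: body weight (which rises during pregnancy) sits in the numerator, and serum creatinine (which falls with the physiologic increase in GFR and plasma volume) sits in the denominator.

\[
\mathrm{CrCl\ (mL/min)} = \frac{(140 - \text{age [years]}) \times \text{weight [kg]} \times 0.85_{\text{ if female}}}{72 \times \text{serum creatinine [mg/dL]}}
\]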
 
Two abstracts focused on kidney transplantation in HIV-positive individuals. In a small study comparing the outcomes of paired deceased-donor kidneys, where one kidney was transplanted into an HIV-positive recipient and the other into an HIV-negative recipient, acute allograft rejection was more common and allograft survival was worse in the HIV-positive recipients (Gathogo et al, poster 688). After adjusting for race/ethnicity and HLA matching, these differences were attenuated. Overall patient survival was similar between HIV-positive and HIV-negative recipients.
 
A survey study evaluated the readiness of US transplant centers to perform solid organ transplants using HIV-positive donors (Rasmussen et al, poster 689). Slightly more than half of US transplant centers responded that they plan to offer HIV-positive donor kidneys to HIV-positive recipients, and nearly 90% of centers supported the potential use of HIV-positive donors in this setting. Centers currently planning to accept HIV-positive donor organs tended to be higher-volume centers with a higher local prevalence of HIV infection.