#NephMadness 2018: Transplantation Region
Submit your picks! | NephMadness 2018 | #NephMadness | #TransplantRegion

Copyright: Kotin / Shutterstock
Selection Committee Member for the Transplantation Region:
Dorry Segev, MD
Dr. Segev is the Marjory K. and Thomas Pozefsky Professor of Surgery and Epidemiology and Associate Vice Chair of Surgery at Johns Hopkins University. With a graduate degree in biostatistics, he focuses on novel statistical and mathematical methods for simulation of medical data, analysis of large healthcare datasets, and outcomes research. Follow him @Dorry_Segev.
Competitors for the Transplantation Region
Pathogenic DSAs vs The Untransplantables
Kidney Donor Risk vs Virally Infected Kidneys
Pathogenic DSAs

Copyright: Promotive / Shutterstock
Pathogenic donor specific antibodies (DSAs) are produced when a patient is exposed to foreign cells containing polymorphic human leukocyte antigen (HLA) molecules, so-called “sensitizing events,” namely pregnancy, blood transfusion, or prior transplantation. Anti-HLA antibodies directed against the donor are a major barrier to transplantation. DSAs may also be directed against minor histocompatibility antigens, with major-histocompatibility-complex (MHC) class I–related chain A (MICA) being the best known example, or against non-HLA antigens expressed on endothelial cells or other proteins.
We will focus on the star of Team Pathogenic DSAs: anti-HLA antibodies. The advent of single-bead techniques has allowed transplant clinicians to identify the specific antigens against which an antibody is directed, as well as providing some quantitative data on the amount of antibody present. However, our ability to interpret the large amount of data now at our disposal has not kept pace with the technological developments in the tests themselves.
We know that preformed DSAs (from sensitizing events before the transplant) make finding a compatible donor more challenging and make antibody-mediated rejection (ABMR) more likely. We also know that de novo DSAs (arising after the transplant) are associated with rejection and allograft failure. In general, ABMR from preformed DSA occurs early, while that from de novo DSA occurs late. However, many patients with preformed DSAs never experience ABMR, and many who develop de novo DSAs have stable allograft function. How to characterize this further and classify risk more precisely is controversial, making it an ideal topic for a NephMadness team. Team Captain “Antibody Strength” is an imprecise player who may be interpreted in various ways.
The presence of a DSA with sufficient potency/pathogenicity to cause a positive crossmatch is most hazardous, with a complement-dependent cytotoxicity (CDC) positive crossmatch conferring a greater risk of ABMR than a flow-positive crossmatch (see image below, and data on 22 US centers here); most programs will not transplant across a positive CDC crossmatch. Crossmatch-negative transplants may still have a DSA detected by single-bead testing, and these may be deleterious, albeit generally less so than when the crossmatch is positive. An excellent primer on crossmatching for the general nephrologist may be found here.
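To make that hierarchy concrete, here is a minimal, purely illustrative sketch of how it might be encoded; the function and tier labels are assumptions for illustration, not a validated algorithm or any program's actual protocol.

```python
# Illustrative only: a toy encoding of the risk hierarchy described above
# (positive CDC crossmatch > positive flow crossmatch > DSA detected by
# single-bead assay with a negative crossmatch > no DSA). The tier labels
# are assumptions, not a clinical decision rule.

def dsa_risk_tier(cdc_positive: bool, flow_positive: bool, dsa_on_single_bead: bool) -> str:
    if cdc_positive:
        return "highest risk: most programs will not transplant across a positive CDC crossmatch"
    if flow_positive:
        return "high risk: ABMR more likely; desensitization usually required"
    if dsa_on_single_bead:
        return "intermediate risk: DSA present despite a negative crossmatch"
    return "standard immunologic risk: no DSA to this donor"

# Example: crossmatch-negative recipient with a DSA detected only by single-bead testing
print(dsa_risk_tier(cdc_positive=False, flow_positive=False, dsa_on_single_bead=True))
```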
The semi-quantitative readout provided by single-bead techniques is the mean fluorescence intensity (MFI). MFI cut-offs are arbitrary (for example, >1000 is often deemed significant) and defined by each lab separately, meaning comparison between centers is not straightforward. Moreover, MFI values vary with batch-to-batch variability and the amount of antigen expressed on a specific bead, as well as with antibodies that bind multiple different beads because of shared epitopes (the actual binding sites) across distinct HLA antigens (see NephMadness 2016, Team HLA Epitope Matching). It is also worth noting that these single-bead assays are not approved as quantitative tests. For a great overview of the issues surrounding DSA testing, see here.

3D representations of HLA-A*02 showing the polymorphic regions accessible for antibody binding. β2-microglobulin (red), HLA class I α-chain (grey) with epitopes highlighted in blue. Most polymorphic regions are focused around the peptide (pink). Image courtesy of Dr. Richard Battle, Histocompatibility & Immunogenetics Laboratory, Royal Infirmary of Edinburgh. Created using HLA Matchmaker.
While this semi-quantitative data is useful, given the issues identified above, absolute cut-offs cannot be rigorously applied for individual patients. The assigned MFI value does convey some information on the amount of antibody present, and various studies show an association between MFI and outcome, albeit a generally weak one.
Wehmeier et al demonstrated in a European cohort that peak individual MFI (or the cumulative MFI of all DSAs) was associated with the occurrence of ABMR, but not with allograft loss. Moreover, the MFI cut-off varied from 988 to 5371 depending on the rejection phenotype studied (subclinical, ABMR, any).
Another issue with MFI is that it describes the antibody at that time point only. Historic values should form part of the assessment. We’ve all had patients with current low-level or absent DSA who mount an aggressive memory response after re-sensitization post-transplant, and the Paris Transplant Group reported data showing that peak MFI is more predictive of ABMR than MFI at the time of transplant (see image below).

Lefaucheur et al, JASN
The study by Wehmeier also demonstrated nicely that it is DSA, rather than the broadness of sensitization, that confers risk. Therefore, when a high calculated panel reactive antibody (cPRA) (ie, having anti-HLA antibodies against much of the donor pool) is reported as a risk factor for adverse outcomes, the cPRA is acting as a surrogate for DSA.
In other words, if you have no DSA to a specific donor, despite being sensitized, you are not at high risk for that transplant. So we’re convinced that DSA is the problem, but apart from a crude assessment of antibody strength, we still don’t know how to risk-stratify these patients further.
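To illustrate why cPRA behaves as a surrogate for DSA, here is a minimal sketch of the underlying arithmetic: cPRA is simply the fraction of a reference donor panel carrying at least one antigen the candidate has antibodies against. The tiny "panel" and antigen names below are hypothetical; real cPRA calculators use large population panels of donor HLA typings.

```python
# Minimal sketch of the cPRA idea: the fraction of a reference donor panel
# carrying at least one unacceptable antigen. Panel and antigen names are
# made up for illustration.

def calculated_pra(unacceptable_antigens, donor_panel):
    """Return the fraction of reference donors carrying >= 1 unacceptable antigen."""
    unacceptable = set(unacceptable_antigens)
    incompatible = sum(1 for donor_hla in donor_panel if unacceptable & set(donor_hla))
    return incompatible / len(donor_panel)

# Hypothetical candidate with antibodies against A2 and DR15
panel = [
    {"A2", "B7", "DR15"},
    {"A1", "B8", "DR17"},
    {"A2", "B44", "DR4"},
    {"A3", "B7", "DR15"},
]
print(calculated_pra({"A2", "DR15"}, panel))  # 0.75 -> cPRA ~75%, yet donor 2 carries no targeted antigen
```

The point of the sketch: the 75% figure describes the pool, not the specific donor in front of you, which is why a highly sensitized patient can still be at standard risk with a donor against whom they have no DSA.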
Here’s who else makes up Team Pathogenic DSAs:
- HLA Class. Class I antigens are present on all nucleated cells, while Class II antigens are present on antigen-presenting cells. The presence of these antibodies was traditionally inferred by T- and B-cell crossmatching, as T cells express Class I HLA only while B cells (being antigen-presenting cells) express both Class I and II. This has been largely replaced by single-bead techniques allowing specific antigen detection with increased sensitivity. Any DSA may be deleterious, including non-classic antibodies (ie, against HLA-C or HLA-DP). Overall, however, preformed DSA class has not been found to predict ABMR, although the presence of combined Class I and II DSA may be associated with additional risk. De novo DSA frequently shows a predominance of Class II antibodies, commonly directed against DQ antigens, although class again does not appear helpful in predicting outcome.
- Complement-fixing status of DSA burst onto the scene in 2013 when the Paris Group published their paper describing the impact of C1q-binding DSA on allograft outcome post-transplantation. C1q+ DSA was associated with a higher risk of ABMR, C4d positivity in allograft biopsies, and allograft failure. The risk with C1q- DSA was similar to that of patients without DSA. A follow-up study from another group suggested that the C1q effect was largely due to antibody strength, and that by concentrating and diluting antibody samples, complement fixation could be induced and lost, respectively. While complement-fixing status may mirror MFI level to some degree, the relationship is not absolute. Moreover, it has been demonstrated that after treatment of ABMR, a persistently negative C1q status was more important than the drop in MFI in predicting response. The Paris Group has published an abstract describing the use of pre-transplant eculizumab to reduce ABMR in patients with preformed C1q+ DSA. They have also employed the molecular microscope (another NephMadness 2016 team), which showed an improved transcriptomic ABMR profile in patients receiving the drug. If the results hold up, this is an elegant example of combining modern antibody assays and omics techniques with individualized treatment for high-risk patients.
- Adding to the controversy is another paper from France showing that, in their cohort, C3d-fixing DSA at the time of ABMR was more discriminatory for allograft loss than C1q status. This debate will run on and on.
Loupy et al, NEJM
- IgG subclass. Single-antigen techniques measure total IgG, but as IgG1 and IgG3 have the strongest ability to fix complement, knowing the IgG subclass may be important. Some evidence suggests that IgG3 DSA is particularly detrimental to the kidney allograft. In a cohort of 635 consecutive transplants, 125 developed DSA, and in 28% the immunodominant DSA (the highest-MFI antibody) was of the IgG3 subclass. A multivariate model including HLA class, MFI level, C1q status, and IgG subclass showed that IgG3 antibodies were independently predictive of graft failure. However, identifying subclass is cumbersome and is currently performed mostly in a research setting.
The pathogenicity of DSA is a hot topic but continues to be an enigma. It is a team with many supporters and we certainly know a lot more about what makes it tick than we used to. Questions remain, however, and many filling out their bracket this year may think it will be a few years before this team comes of age.
The Untransplantables

Copyright: Leremy / Shutterstock
A natural enemy of Team Pathogenic DSAs, this young team contains a group of novel strategies, many untested in the big leagues, which have the potential to transform the competition as well as many people’s lives. As described above, preformed DSA due to prior sensitization may render a potential transplant pair HLA incompatible. ABO blood group incompatibility (ABOi), which affects about 1/3 of living donor pairs in the US and UK, is another barrier to transplantation. Progress in desensitization has allowed us to transplant across these barriers in many cases. Moreover, kidney sharing schemes, often in combination with desensitization, are another option for patients with an incompatible living donor.
ABO Incompatibility (ABOi)
ABOi transplantation, once a major barrier to living donor transplantation, was pioneered in Japan, where cultural factors have led to living donors being the predominant donor type. It is now flourishing in the US, Europe, and elsewhere due to excellent reported outcomes, particularly in the era of tacrolimus/mycophenolic acid based immunosuppression. The use of rituximab has been associated with low levels of ABMR and has replaced the need for splenectomy.
When ABMR does occur, it appears to be a different beast than ABMR after HLA-incompatible transplantation, with more favorable outcomes in ABOi ABMR despite similar treatment protocols. Interestingly, anti-A/B titers do not appear to correlate with the development of ABMR after ABOi transplantation. Excellent results, indeed comparable to ABO-compatible transplantation, have led to increasing provision of ABOi transplantation outside of Japan.
An improved understanding of the risks with ABOi, including the significance of anti-A/B titers, A1/A2 status in group A, and the coexistence of anti-HLA antibodies, has allowed the development of a tailored approach to desensitization and has prompted questions about whether rituximab is necessary at all for standard cases. The group at Johns Hopkins, where anti-CD20 therapy is not routinely administered, has demonstrated excellent results with an ABMR rate of 15%. Selective anti-A/B removal using immunoadsorption columns is available in Europe.
HLA Incompatibility
HLA-incompatible transplantation is usually not as straightforward as ABOi and is considered a more aggressive player. The difficulties with HLA sensitization are well described above for Team Pathogenic DSAs. Desensitization generally occurs in the context of an HLA-incompatible living donor, where donor cells are available for repeat crossmatching, and transplantation may proceed once the crossmatch has become negative. A variety of regimens have been described, including high-dose IVIg and plasmapheresis plus low-dose IVIg, with or without additional agents such as rituximab.
High-dose IVIg (generally 2 g/kg) was initially reported over 20 years ago in small numbers of patients with either a high PRA or an HLA-incompatible living donor. Monthly infusions were associated with a drop in PRA or a negative crossmatch facilitating transplantation, and IVIg was continued post-transplantation. One-year results appeared reasonable. It should be noted that in one report of 15 patients, 13 were transplanted using this approach and 1 had an early graft thrombosis. This may be coincidental but is worth bearing in mind given the pro-thrombotic tendency induced by IVIg.
The addition of rituximab to high-dose IVIg has also been reported in 20 sensitized patients with long waiting times. Mean PRA dropped from 77% to 44%, and 16 patients were transplanted (6 from deceased donors) after a mean of 5 months. All grafts were functioning at 1 year; however, the precise contributions of IVIg and rituximab are unclear from this study.
Low-dose IVIg plus plasmapheresis has become more popular. The IVIg dose is generally 100 mg/kg after each plasmapheresis, with human albumin used as replacement fluid (to prevent potential sensitization from fresh frozen plasma). Plasmapheresis may be continued post-transplant, especially if there is a rebound in DSA. Many regimens also include rituximab, making it difficult to determine the benefit of individual interventions.
Only one study has compared high-dose IVIg with low-dose IVIg plus plasmapheresis, and it has several limitations. Three sequential protocols at the Mayo Clinic were compared (n=13 high-dose IVIg; n=32 low-dose IVIg, rituximab, and plasmapheresis; n=16 low-dose IVIg, rituximab, and plasmapheresis plus ATG). Additionally, pre-transplant splenectomy was performed in 19 patients from group 2. Both plasmapheresis groups were significantly more likely to achieve a negative crossmatch and less likely to develop ABMR. If the effect was real, was it the rituximab or the plasmapheresis that added benefit?
The data are somewhat difficult to interpret due to the lack of randomization, small sample sizes, generally single-center studies, comparison with historical controls, the variety of treatments used, and the case mix. For example, some patients had a high PRA and some had a definite DSA with or without a positive crossmatch. If the crossmatch was positive, was it flow or CDC positive? Was the CDC crossmatch enhanced by anti-human globulin (common in the US)? The studies are uncontrolled and describe a mixed bag of phenotypes treated with a mixed bag of interventions.
Patients undergoing HLA desensitization still have a high risk of rejection, particularly if the crossmatch remains positive (even weakly). Moreover, they have a high risk of infectious complications given the intense immunosuppression administered, as demonstrated by a doubled risk of BK virus disease in the Hopkins cohort. Questions remain regarding the optimal regimen for specific patients. What should be done if the crossmatch remains positive after desensitization? How should we monitor patients post-transplant (DSA, protocol biopsies)?
The potential benefit associated with HLA desensitization was demonstrated in a study by the Hopkins group, which examined HLA-sensitized patients stratified by whether desensitization was attempted. There was a survival advantage for desensitized patients compared with those not desensitized who remained on dialysis. More importantly, they also showed a survival advantage for desensitized patients compared with all those not desensitized (ie, including those who remained on dialysis or were subsequently transplanted with an HLA-compatible organ). The effect was consistent whether sensitization was defined by the presence of DSA only, a positive flow crossmatch, or a positive CDC crossmatch.
A follow-up study extending to 22 North American centers confirmed the findings. It is notable, however, that a similar study performed in the UK population comparing HLA-incompatible living donor transplant recipients with wait-listed patients, who may or may not have been transplanted, did not confirm a survival advantage with HLA desensitization. The study was limited to crossmatch positive cases only; “less incompatible” pairs such as those with a DSA by single antigen testing only were not included. Moreover, dialysis survival in the USA appears to be worse than in the UK, which may account for the discordant findings.

Montgomery et al, NEJM, 2011 & Orandi et al, NEJM, 2016
These findings, as well as the added cost and morbidity burden associated with HLA-incompatible transplantation, have led to preferential use of paired exchange schemes (see image below) in the US and UK for incompatible pairs, with (at least in the UK) a corresponding decrease in the number of HLA-incompatible transplants performed. However, paired exchange programs, by their very nature, become enriched over time with highly sensitized pairs, so a combined approach, employing desensitization alongside paired exchange to achieve a “less incompatible” donor, may be successful.
IdeS: New Team Recruit
The new kid on the block for The Untransplantables is IgG endopeptidase, an IgG-degrading enzyme derived from Streptococcus pyogenes, which goes by the moniker of IdeS in the locker room. IdeS tears up the opposition by cleaving human IgG into F(ab′)2 and Fc fragments, which prevents complement-dependent cytotoxicity and antibody-dependent cellular cytotoxicity.
Two early trials using IdeS for desensitization were recently reported in NEJM. A combined 25 highly sensitized patients from Sweden and the US (22 of whom had an actual DSA at transplant) were administered IdeS prior to transplantation. The early results appear spectacular, with complete and immediate obliteration of recipient HLA antibodies (and total IgG) lasting 1-2 weeks. One allograft failed early, and 10 cases of ABMR occurred, which responded to treatment. This fascinating report raises as many questions as answers, and longer-term data will be needed to properly assess the need for and timing of additional immunosuppression, the risk of infection, and the rebound antibody response (and will this be prevented by IVIg or rituximab given after IdeS has acted?).
Kidney Donor Risk

Copyright: Ben Schonewille / Shutterstock
Living donor kidney transplantation provides the best outcomes for patients with ESRD and is considered safe for the donor when they are screened effectively. However, although we have data demonstrating low morbidity and mortality associated with living donor nephrectomy, longer term data are more difficult to collect. Individuals who are accepted as living donors have passed rigorous screening and are therefore a fit group, likely more so than matched general population controls. Therefore, finding controls that can act as appropriate comparators has been a challenge.
More recently, attempts have been made to select controls who would themselves have been deemed fit to donate a kidney, rather than using an unscreened “general population” cohort. Garg et al showed no increased cardiovascular risk in Canadian donors compared with a selected healthy cohort over a median 6.5 years of follow-up. Was the follow-up period long enough? Perhaps not, especially when you consider that the mechanism for increased cardiovascular risk is presumably a complex interplay between lower GFR, hyperfiltration injury, albuminuria, hypertension, CKD, and left ventricular hypertrophy, processes that take time to develop.
Next came a Norwegian study which included over 1,900 kidney donors from 1963 to 2007. Controls (n>32,000) were selected from a cohort with blood pressure <140/90 mm Hg (on no antihypertensives) and BMI <30 kg/m2 who rated their health as “good” or “excellent.” Median follow-up was 15 years for donors and 25 years for controls. Individuals were excluded if they had diabetes or cardiovascular disease, although there were no data on kidney function or albuminuria. Donors had a significantly increased long-term risk of ESRD (hazard ratio >11, which might seem frightening, but remember that the baseline risk in the control group is nearly 0, so 11 times a very small number is still a very small number), cardiovascular mortality (HR 1.40), and all-cause mortality (HR 1.30). Of note, all cases of ESRD occurred in living related donors and the etiology was immunological in nature, suggesting a genetic component to the donor kidney disease. It is worth noting that controls began observation between 1984 and 1988; this discrepancy in era may have influenced the result, potentially magnifying the apparent risk compared with donors from an older era.
Hot on the heels of the Norwegian study came a large US study looking at ESRD risk, including data on >96,000 donors between 1994 and 2011. Controls were drawn from the NHANES III cohort by excluding those with identified contraindications to kidney donation (9,364 of >20,000 qualified as eligible). Bootstrapping was used to expand control numbers and therefore events. Donors and controls were matched by age, sex, race, educational background, BMI, blood pressure, and smoking history. Over a median follow-up of 7.6 years, the donors had an increased risk of ESRD: 30.8/10,000 in donors versus 3.9/10,000 in the matched non-donor controls (P <0.001). The risk was higher in African American individuals (74.7/10,000 in donors vs 23.9/10,000 in non-donors). Interestingly, white donors (22.7/10,000) had a risk of ESRD similar to that of African American non-donors (see image), highlighting the very low risk of ESRD in white non-donors in this cohort. The lifetime risk of ESRD in donors was still significantly lower than in the unscreened general population, at 90/10,000 versus 326/10,000.

Muzaale et al, JAMA
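The arithmetic behind “a large relative risk can still be a small absolute risk” is worth making explicit. Here is a quick sketch using the rates quoted above from the US study; the calculation itself is the only addition.

```python
# Relative vs absolute risk using the rates quoted above from the US study
# (30.8 vs 3.9 ESRD cases per 10,000 over a median 7.6 years of follow-up).
donor_risk = 30.8 / 10_000
nondonor_risk = 3.9 / 10_000

relative_risk = donor_risk / nondonor_risk      # ~7.9-fold increase
risk_difference = donor_risk - nondonor_risk    # ~27 extra cases per 10,000 donors
number_needed_to_harm = 1 / risk_difference     # ~370 donors per attributable ESRD case

print(f"relative risk ~{relative_risk:.1f}x; "
      f"absolute increase ~{risk_difference * 10_000:.0f} per 10,000 donors; "
      f"~1 attributable case per {number_needed_to_harm:.0f} donors")
```

The same logic applies to the Norwegian hazard ratio of >11: multiplied by a baseline risk close to zero, the absolute risk remains very low.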
While these data are a marked improvement on what came before, the following issues, which highlight the difficulty of performing these comparison studies, should be noted:
- Controls were a very healthy group: they were excluded if they had any previous cancer or kidney stones (fair enough, perhaps), but also if they were hypertensive (there may have been some leeway here) or asthmatic (which appears strict).
- No cases of ESRD occurred in white non-donors. Again, was follow-up long enough?
- When attempting to apply these data to patients outside of the US, it must be kept in mind that rates of ESRD in the US are higher. White Europeans have about 1/3 the incidence of ESRD of white Americans (approximately 100 pmp versus 300 pmp).
An ESRD risk calculator has been developed based largely on the above data (publication; online calculator). This tool helps with the assessment and counselling of potential living donors.
While these data tell us about ESRD, we lack good data on what happens before that point. A longitudinal study of almost 4,000 white donors has highlighted the need for long follow-up to properly assess risk. In their cohort, 6.1% developed proteinuria (at a median of 16.6 years), 6.3% developed diabetes mellitus (at a median of 18.5 years), and 27% developed hypertension (at a median of 13.8 years). Obese donors were more likely to develop proteinuria and an eGFR <30 mL/min/1.73 m2. These data on weight have been expanded upon in US donors, in whom an increased risk of ESRD was demonstrated starting at a somewhat modest BMI of 27. The absolute risk was still low, but an 86% adjusted increase in risk was observed with a BMI >30.
Another helpful piece of data in living donors has been an analysis by Garg et al of donors who subsequently became pregnant, which demonstrated an increased risk of gestational hypertension and pre-eclampsia. Although the composite outcome was significant, the overall risk was low and feto-maternal outcomes were excellent. While the data may be seen as reassuring despite the small observed risk, they do add an extra layer of complexity to the assessment of female potential donors who plan future pregnancies.
Updated KDIGO guidelines
Recently, the KDIGO Living Kidney Donor Work Group published guidelines on the evaluation of living kidney donors. This was a significant piece of work and well worth reading. It includes original analyses to produce risk models supporting the recommendations of the group. A central tenet of the work was the following: “The guideline work group concluded that a comprehensive approach to risk assessment should replace decisions based on assessments of single risk factors in isolation.” The guidelines support the notion that a young, fit donor without co-morbidity may have a higher lifetime risk of ESRD than an older hypertensive donor.

Lentine et al, Transplantation
APOL1
APOL1, a 2-time NephMadness champion, has recently been drafted by Team Kidney Donor Risk. The APOL1 risk variants, named G1 and G2, are a well-known factor in the increased risk of ESRD in African Americans, and evidence suggests that receiving a kidney from a deceased donor carrying two risk alleles is associated with early allograft failure. This leads to obvious concerns about residual kidney function in donors who carry risk alleles, and APOL1 genotyping has been suggested as a tool to stratify risk in potential African American donors.
This concept was illustrated well, albeit tragically, in a case report of a young Afro-Caribbean monozygotic twin transplant pair. The recipient had FSGS and the donor had a normal workup. There was clinical and histological evidence of FSGS in the allograft at 30 months post-transplant, and early allograft failure followed. The donor subsequently developed proteinuria and kidney dysfunction, likely aggravated by his reduced nephron mass. The twins were later confirmed to carry both APOL1 risk variants.
An American Society of Transplantation group recently convened to discuss APOL1 genotyping of donors and their thoughts may be found here. The group did not recommend routine testing but did suggest that APOL1 status be considered, discussed with donors, and testing offered if deemed appropriate. This story will continue to evolve, particularly with the NIH-funded APOL1 Long-term Transplantation Outcomes Network (APOLLO).
Somewhat related to the APOL1 story is the observation that subclinical kidney disease present at donation contributes to subsequent allograft outcome. Recipients of allografts from donors who later developed ESRD had increased graft loss and mortality compared with recipients of allografts from donors who did not develop ESRD. This finding of risk following the kidney supports the view that pre-donation kidney disease is a major mechanism of CKD post-donation. Further support comes from a recent study of African American donors demonstrating that the APOL1 high-risk genotype is associated with lower pre- and post-donation eGFR and a faster decline in kidney function compared with the low-risk genotype. It was also notable that 2 of 19 donors carrying the high-risk genotype developed ESRD, at 10 and 18 years post-donation, compared with none of the 117 low-risk genotype donors.
With the deluge of data in recent years, we are now in a much better position to counsel our donors. For example, as discussed in the new KDIGO guidelines, older donors may often be considered to carry a lower lifetime risk: they have proved their robustness by surviving many years already and have relatively fewer years remaining in which to develop complications, especially as post-donation ESRD takes many years to develop. Age itself should not preclude donation, despite what some potential donors may be told. The current data demonstrate an increased risk of ESRD and mortality with kidney donation, which makes sense, yet the risk appears modest for most. However, the data must be interpreted carefully with consideration of their limitations. Moreover, more marginal donors are being assessed than in previous eras, so this risk will be greater for some, and risk modeling may help stratify this further. A thoughtful assessment of donor risk, considering all factors together rather than in isolation, and careful counseling of our potential donors are paramount.
Virally Infected Kidneys

Copyright: bluebay / Shutterstock
With growing waiting lists for kidney transplantation, the transplant community is exploring methods to expand the availability of donor organs. These include promotion of living donation with a focus on altruistic donors and paired exchange schemes, increasing use of donation after circulatory death (DCD) donors, and expanded criteria organs. Another novel method of expanding the donor pool is using virally infected organs for diseases where we now have good treatments to achieve viral clearance or control.
The 2 big-name players on this team are hepatitis C virus (HCV) and HIV.
HCV
Team captain HCV has undergone a transformation over recent seasons, with novel direct-acting antiviral drugs (DAAs) leading to sustained virological response (SVR) in >95% of carriers. We will hopefully see less HCV-related kidney disease moving forward, and the implications for kidney transplantation are also significant. HCV was traditionally associated with worse outcomes post-kidney transplantation, more post-transplant diabetes, and limited viral treatment options: ribavirin is particularly likely to cause hemolytic anemia at lower GFR, and aggressive acute rejection could be expected with interferon-based therapy. The past, present, and future of HCV and kidney transplantation are comprehensively discussed in a recent AJKD blog post.
The highlights of recent data in this realm are the following:
- The C-SURFER Trial demonstrated a high SVR rate using grazoprevir/elbasvir in patients with advanced CKD, including >3/4 on dialysis.
- There are RCTs and case series demonstrating high SVR rates post-transplantation. In the latter series, patients receiving an HCV+ kidney had equally impressive viral clearance compared to those receiving an HCV- organ. Crucially, they had a significantly shorter wait time, suggesting that delaying treatment until after transplant may be the optimal approach for many. Particular groups that may benefit include those with a predicted long wait time and patients with multiple comorbidities who have a window of opportunity for transplantation.
If SVR is so impressive, the natural next question is, “Why not transplant HCV+ organs into HCV- recipients?” This may appear to be a brave endeavor, but early data are available and have been very encouraging (THINKER trial). The visual abstract summarizes the exciting findings. Ten HCV- patients with expected long waits rapidly received high-quality but HCV+ kidneys. All were viremic on day 3, and all had SVR after 12 weeks of treatment, with excellent graft function.
Another study (EXPANDER-1) reported similar findings. In this study, genotyping was performed retrospectively and the treatment regimen modified depending on the results. In THINKER, rapid genotyping was performed pre-transplant (likely not available in most centers), with only genotype 1 donors being selected.
The genotype issue is important. Until recently, there were no FDA-approved DAAs for the treatment of patients with advanced CKD and non-1 or -4 genotypes. Sofosbuvir, which is included in many DAA regimens, is not considered safe with a GFR <30 mL/min/1.73 m2. However, there are treatment options for these strains, including this study in which 29% of patients had a non-1 or -4 genotype, all had advanced CKD (82% of the total cohort on dialysis), and an SVR of 98% was achieved with a glecaprevir/pibrentasvir combination. Glecaprevir and pibrentasvir undergo biliary (rather than renal) clearance, and based on this study, the combination has recently been approved by the FDA for the treatment of HCV genotypes 1-6 in patients with advanced kidney impairment, including those on dialysis.
A report of the accidental transmission of HCV to 5 recipients from a donor apparently at low risk and with negative anti-HCV IgG showed that all were cured by DAAs. This highlights another string in the bow of this exciting team.
Cost is an important point to consider and this team has traditionally been very expensive to put together. A 12-week course may still cost $60,000-$90,000 depending on the combination used. Harvoni (sofosbuvir/ledipasvir), commonly used for genotype 1 disease, may cost up to $94,000 for a 12-week course in the US but about £5,000 in the UK, down from £39,000 a year ago. Something needs to be done about this disparity in pricing.
In the UK, exploratory work is being done to consider using HCV+ organs for HCV- recipients. Up to 20 donors a year are declined because of HCV virology despite otherwise being of good quality; in the US this number will certainly be much larger. Even accounting for treatment with DAAs, the cost of which continues to fall, use of these organs would be cost-neutral in the UK analysis.
HIV
HIV infection was traditionally an absolute contraindication to kidney transplantation because of fears that immunosuppression would lead to an excess of opportunistic infection and acceleration of the disease. The advent of antiretroviral therapy (ART), now with widespread use in the developed world, has led to a dramatic improvement of prognosis, a near normal life expectancy, and a re-evaluation of the transplant option for ESRD.
This was bolstered by data showing excellent graft and patient outcomes in carefully selected HIV+ patients undergoing kidney transplantation, albeit with a higher than expected acute rejection (AR) rate (31% at 1 year). Reports from Europe also demonstrated good results, with some calcineurin inhibitor-ART interactions but an acceptable AR rate (15%). While HIV+ recipients have outcomes similar to HIV- patients, co-infection with HCV has been associated with worse allograft and patient survival. Presumably this may well change with the introduction of DAAs, as discussed above.
Other considerations for transplanting HIV+ patients are as follows:
- Induction therapy: an apparently higher risk of AR must be balanced against concerns regarding opportunistic infection, although with excellent HIV control this risk should be similar to that of the general kidney transplant population. As with induction therapy in general, there is a lack of consensus. For the small number of HIV+ patients we have transplanted, our program bases induction on traditional immunological risk.
- Pneumocystis prophylaxis is continued lifelong in some programs.
- Drug-drug interactions may be problematic (more here). We aim to perform a pre-transplant tacrolimus trial with area-under-the-curve (AUC) measurement to guide post-transplant dosing (see the sketch after this list).
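On the AUC point in the last bullet, here is a minimal sketch of how a tacrolimus AUC might be estimated from timed levels after a pre-transplant test dose using the trapezoidal rule; the sampling times, concentrations, and helper function are hypothetical illustrations, not our program’s actual protocol.

```python
# Minimal sketch: estimate tacrolimus AUC from timed drug levels using the
# trapezoidal rule. The times and concentrations below are hypothetical.

def auc_trapezoid(times_h, levels_ng_ml):
    """Approximate area under the concentration-time curve (ng*h/mL)."""
    points = list(zip(times_h, levels_ng_ml))
    return sum((c0 + c1) / 2 * (t1 - t0)
               for (t0, c0), (t1, c1) in zip(points, points[1:]))

times = [0, 1, 2, 4, 8, 12]               # hours after a test dose
levels = [2.0, 9.5, 12.0, 8.0, 5.0, 3.5]  # ng/mL
print(f"AUC(0-12h) ~ {auc_trapezoid(times, levels):.0f} ng*h/mL")
```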
Similar to its HCV teammate, the use of organs from HIV+ donors has been explored given the supply and demand disparity in organ transplantation. In the US, this was specifically forbidden by the National Organ Transplant Act (1984). Positive experience with kidney transplantation from HIV+ donors into HIV+ recipients was initially gathered in South Africa, a country with a high rate of infection. Carefully selected recipients, who had been stable on ART with good CD4 counts and undetectable viral loads, were paired with deceased donors without opportunistic infection and with normal kidney biopsies. ATG induction was employed, given concerns regarding a higher risk of AR, and outcomes, both infectious and transplant-related, were good. A follow-up study from the same authors reported favorable 3-5 year outcomes in 27 patients. AR was 8% at 1 year but 22% at 3 years, which is reasonable, although late AR, occurring when patients are being seen less frequently, can be problematic.

President Obama signs HOPE Act in Oval Office (Nov 2013)
Based on these encouraging reports, the US Congress passed the HOPE Act (HIV Organ Policy Equity) in 2013, permitting HIV+ to HIV+ transplants on an experimental basis. The first liver transplant was performed at Johns Hopkins in March 2016, with the first kidney implanted the next day by the same team. Thus far, no HIV+ living donor transplant has occurred. While there would be additional concerns for the donor in this setting, given the risk of HIV-associated kidney disease, the HOPE Act permits living donation under strict conditions including a donor kidney biopsy as part of the work-up. For deceased donors, there are no CD4 or viral load criteria, but donors must lack a history of opportunistic infection and a pre-implantation donor biopsy is a prerequisite. We await data from these initial studies but if they are favorable, we look forward to more widespread implementation of such transplants across other centers.
Pushing the boundaries of organ availability by using HCV+ and HIV+ kidneys for transplantation once seemed brave, and while far from mainstream, it is becoming accepted as a valid method to expand donor numbers. Like everything in transplant medicine, the risk of an endeavour must be balanced against the risk of the status quo, which for some is more perilous. This new team is built around two exciting freshmen, but is certainly one to watch.
– Post prepared by Paul Phelan. Follow him @paulphel.
How to Claim CME
US-based physicians can earn 1.0 CME credit for reading this region. Please register/log in at the NKF PERC portal. Click on “Continue,” click on the “Transplant Region,” then click on “Continue” to access the evaluation. You’ll need to click on “Continue” again to complete the evaluation, after which you can claim 1.0 credit and print your certificate. The CME activity will expire on June 15th, 2018.
Submit your picks! | NephMadness 2018 | #NephMadness | #TransplantRegion