Interferon-free regimen benefits HCV-infected liver transplant recipients
An oral, interferon-free drug regimen produced a 97% rate of sustained virologic response in liver transplant recipients who had recurrent hepatitis C viral infection – “an historically difficult-to-treat population” at high risk of death who have extremely limited treatment options, according to a study reported at the annual meeting of the American Association for the Study of Liver Diseases.
In an industry-sponsored, open-label phase II trial involving 34 adults with recurrent HCV infection following liver transplantation, 24 weeks of daily ombitasvir plus the ritonavir-boosted protease inhibitor ABT-450 (ABT-450/r), added to dasabuvir and ribavirin, eradicated HCV RNA in every patient within 4 months. Only one patient had a relapse during a further 24 weeks of follow-up, said Dr. Parvez Mantry of the Liver Institute at Methodist Dallas, who presented the data at the meeting.*
Results of the study, which was conducted at 10 transplant centers in the United States and Spain, were presented at the meeting and simultaneously published online Nov. 11 in the New England Journal of Medicine (N. Engl. J. Med. 2014 Nov. 11 [doi: 10.1056/NEJMoa1408921]).
The standard of care for treating recurrent HCV infection after liver transplantation has been 48 weeks of peginterferon with ribavirin, but response rates are relatively low (13%-43%) because of interferon’s toxic effects. Moreover, the agent is known to induce graft injury, reducing both graft and patient survival.
The investigators assessed the safety and efficacy of a tablet formulation combining ombitasvir, a potent NS5A inhibitor, with ABT-450/r, a ritonavir-boosted protease inhibitor; the ritonavir boosting increases peak, trough, and overall drug exposure and allows once-daily dosing. To this were added standard dasabuvir and ribavirin, with ribavirin dosing adjusted at the treating physician’s discretion to avert adverse hematologic effects in these immunosuppressed transplant recipients. Modified doses of standard calcineurin inhibitors (cyclosporine or tacrolimus) also were recommended for all patients, and low-dose glucocorticoids were permitted as needed.
The study participants were 18-70 years of age (mean age, 59.6 years) and had received liver transplants because of chronic HCV infection a minimum of 1 year previously. They had no or only mild liver fibrosis, were receiving stable cyclosporine- or tacrolimus-based immunosuppression, and were not coinfected with HIV or hepatitis B.
The primary efficacy endpoint was a sustained virologic response (SVR) 12 weeks after treatment was completed. All the study participants had undetectable HCV RNA by week 4 of treatment, and the response persisted in all of them until treatment was completed. One patient then relapsed, so the overall SVR rate was 97% (33 of 34 patients). This rate was sustained through final follow-up at post-treatment week 24.
In the patient who relapsed, HCV RNA showed resistance-associated variants that had not been present at baseline. This patient also had been unresponsive to previous peginterferon-ribavirin therapy.
Adverse events were common, although the majority were mild or moderate in severity. Fatigue, headache, and cough were the most frequent adverse events. Grade 2 elevations in total bilirubin developed in two patients (6%), with no jaundice or scleral icterus. Nine patients showed grade 2 decreases in hemoglobin; none required a blood transfusion, and five required erythropoietin. There were no deaths and no cases of graft rejection.
One patient discontinued the study drug at week 18 after developing moderate rash, memory impairment, and anxiety deemed to be possibly drug related. However, that patient’s HCV RNA was already undetectable at discontinuation, and the response was maintained as an SVR at final follow-up 12 weeks later.
The study was not large enough to allow adequate assessment of adverse event rates or comparison of them with rates for other treatments, the investigators noted.
The researchers also noted that these study participants were easier to treat than the general population of liver transplant recipients with recurrent HCV, because they did not have advanced fibrosis or comorbid infections. In addition, patients with early, aggressive forms of recurrent HCV, such as fibrosing cholestatic hepatitis, were excluded from this study, as were patients maintained on immunosuppressive agents other than cyclosporine or tacrolimus.
This trial was sponsored by AbbVie, whose employees also designed the study, gathered and analyzed the data, and wrote the report. Study investigator Dr. Paul Y. Kwo reported receiving personal fees and grants from, and serving on advisory boards for, AbbVie, Bristol-Myers Squibb, and other companies. His associates reported ties to numerous industry sources.
*Clarification, 11/11/14: A previous version of this story did not state that the data were presented by Dr. Mantry.
Key clinical point: An oral, interferon-free drug combination produced a 97% sustained virologic response rate in liver transplant recipients with recurrent HCV infection.
Major finding: The primary efficacy endpoint, an SVR 12 weeks after completion of treatment, was 97% (33 of 34 patients).
Data source: An industry-sponsored, multicenter, open-label phase II trial involving 34 adults with recurrent HCV infection following liver transplantation.
Disclosures: This trial was sponsored by AbbVie, whose employees also designed the study, gathered and analyzed the data, and wrote the report. Dr. Kwo reported receiving personal fees and grants from, and serving on advisory boards for, AbbVie, Bristol-Myers Squibb, and other companies. His associates reported ties to numerous industry sources.
FDA approves simeprevir-sofosbuvir combo for hepatitis C
The Food and Drug Administration has approved the use of sofosbuvir in combination with simeprevir for treatment of patients with chronic hepatitis C virus infection. The combination is a ribavirin- and interferon-free regimen.
The approval is reflected in changes to the label of simeprevir (Olysio), an HCV NS3/4A protease inhibitor, which was approved in 2013 for chronic hepatitis C genotype 1 infection “as a component of a combination antiviral treatment regimen.”
The new label summarizes the results of COSMOS, an open-label, randomized phase II trial of HCV genotype 1–infected patients: prior null responders with a METAVIR fibrosis score of F0-F2, and treatment-naive subjects or prior null responders with a METAVIR fibrosis score of F3-F4 and compensated liver disease. The sustained virologic response rate 12 weeks after the planned end of treatment was 93% among those treated with the combination for 12 weeks and 97% among those treated for 24 weeks. The study was published online (Lancet 2014 July [doi:10.1016/S0140-6736(14)61036-9]).
In COSMOS, the most common adverse reactions reported during 12 weeks of combination treatment were fatigue (25%), headache (21%), nausea (21%), insomnia (14%), pruritus (11%), rash (11%), and photosensitivity (7%). Among those treated for 24 weeks, dizziness (16%) and diarrhea (16%) also were reported, according to the revised label.
When simeprevir is combined with sofosbuvir, 12 weeks of treatment is recommended for patients without cirrhosis and 24 weeks for patients with cirrhosis, whether they are treatment naive or treatment experienced.
Simeprevir is marketed by Janssen Therapeutics. Sofosbuvir, an HCV nucleotide analog NS5B polymerase inhibitor approved in December 2013 for treatment of chronic hepatitis C as a component of a combination antiviral treatment regimen, is marketed as Sovaldi by Gilead Sciences. The combination was approved Nov. 5.
Lungs donated after cardiac arrest, brain death yield similar survival rates
AUSTIN, TEX. – The risk of death at 1 year after lung transplantation with organs donated either after cardiac arrest or after brain death was virtually the same, an analysis of the literature has shown.
“Donation after cardiac death appears to be a safe and effective method to expand the donor pool,” said Dr. Dustin Krutsinger of the University of Iowa, Iowa City, who presented the findings during the Hot Topics in Pulmonary Critical Care session at the annual meeting of the American College of Chest Physicians.
Over the years, the demand for donated organs among lung transplant candidates has steadily increased while the number of available organs has remained static. This is due, in part, to physicians’ concern about injury to the organs during the ischemic period, which can stretch to as much as an hour between withdrawal of life support and organ procurement. However, Dr. Krutsinger said the similar outcomes in the two cohorts could reflect the fact that, before procurement, systemic circulation allows the lungs to remain oxygenated by perfusion, lessening the impact of the ischemic period.
“There is also a thought that the ischemic period might actually protect the lungs and the liver from reperfusion injury. And we’re avoiding brain death, which is not a completely benign state,” he told the audience.
In an extensive review of the literature on 1-year survival rates after lung transplantation, the investigators identified 519 unique citations, of which 58 were selected for full-text review; 10 observational cohort studies were included in the systematic review and 5 in the meta-analysis.
Dr. Krutsinger and his colleagues found no significant difference in 1-year survival rates between the donation-after-cardiac-death and donation-after-brain-death cohorts (P = .658). In a pooled analysis of the five studies, no significant difference in risk of death was found at 1 year after either transplantation procedure (relative risk, 0.66; 95% confidence interval, 0.38-1.15; P = .15). Although he acknowledged shortcomings in the data, such as the retrospective analysis of unmatched cohorts and the short follow-up period, Dr. Krutsinger said in an interview that he thought the data were compelling enough for institutions to begin rethinking organ procurement and transplantation protocols. In addition to his own study, he cited a 2013 study that he said indicated the pool of available organs would increase by as much as 50% if lungs donated after cardiac arrest were included (Ann. Am. Thorac. Soc. 2013;10:73-80).
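For readers who want to see the arithmetic behind a pooled estimate like the one above, the sketch below shows a standard inverse-variance (fixed-effect) meta-analysis of study-level relative risks in Python. The per-study counts were not reported in this article, so the numbers here are hypothetical placeholders; only the pooling method is meant to be illustrative, and it is not necessarily the model the investigators used.

import math

# Hypothetical (deaths, total) counts for recipients of organs donated after
# cardiac death vs. after brain death; the real study-level data are not given here.
studies = [
    (3, 30, 15, 100),
    (2, 25, 12, 90),
    (4, 40, 20, 150),
    (1, 20, 10, 80),
    (5, 50, 25, 200),
]

log_rrs, weights = [], []
for a, n1, c, n2 in studies:
    rr = (a / n1) / (c / n2)                 # per-study relative risk
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)  # standard error of log(RR)
    log_rrs.append(math.log(rr))
    weights.append(1 / se**2)                # inverse-variance weight

pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
rr = math.exp(pooled)
lo, hi = math.exp(pooled - 1.96 * pooled_se), math.exp(pooled + 1.96 * pooled_se)
print(f"Pooled RR {rr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")

With real study counts substituted for the placeholders, the same few lines produce the kind of summary relative risk and confidence interval quoted above.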
But challenges remain.
“There are some things you can do to the potential donors that are questionable ethicswise, such as administering heparin premortem, which would be beneficial to the actual recipients. But, up until they are pronounced dead, they are still a patient. You don’t really have that complication with a donation after brain death, since once brain death is determined, the person is officially dead. Things you then do to them to benefit the eventual recipients aren’t being done to a ‘patient,’” he said.
Still, Dr. Krutsinger said that if organs procured after cardiac arrest were to become more common than after brain death, he would be “disappointed” since the data showed “the outcomes are similar, not inferior.”
Dr. Krutsinger said he had no relevant disclosures.
On Twitter @whitneymcknight
AT CHEST 2014
Key clinical point: Expansion of organ donation programs to include organs donated after cardiac death could help meet a growing demand for donated lungs.
Major finding: No significant difference was seen in lung transplantation 1-year survival rates between donation after cardiac arrest and donation after brain death.
Data source: A systematic review of 10 observational cohort studies and a meta-analysis of 5 studies, chosen from more than 500 citations that included 1-year survival data for lung transplantation with organs donated after either cardiac arrest or brain death.
Disclosures: Dr. Krutsinger said he had no relevant disclosures.
Lungs donated after asphyxiation, drowning found suitable for transplant
Patients who received lung transplants from donors who died of asphyxiation or drowning had survival rates and clinical outcomes similar to those of patients whose donors died of other causes, according to a large registry analysis in the October issue of The Annals of Thoracic Surgery.
“Asphyxiation or drowning as a donor cause of death should not automatically exclude the organ from transplant consideration,” said Dr. Bryan A. Whitson of Ohio State University, Columbus, and his associates. Donor death from asphyxiation or drowning did not significantly affect rates of airway dehiscence, transplant rejection, posttransplant stroke or dialysis, or long-term survival.
Lungs donated after asphyxiation or drowning should be carefully evaluated for parenchymal injury, microbial contamination, and the possibility of primary graft dysfunction, the researchers cautioned. For example, asphyxiation and drowning can alter lung surfactant levels (Ann. Thorac. Surg. 2014;98:1145-51).
The analysis included 18,205 U.S. adults who underwent lung transplantation between 1987 and 2010, including 309 patients whose donors had reportedly died from drowning or asphyxiation. Patients were identified from the UNOS/OPTN STAR (United Network for Organ Sharing/Organ Procurement and Transplantation Network Standard Transplant Analysis and Research) database, which is overseen by the U.S. Department of Health & Human Services.
Ten-year survival curves did not vary based on donor cause of death, either when analyzed individually or when asphyxiation or drowning was compared with all other causes (P = .52), the researchers said. In fact, pulmonary deaths were significantly less common (5.8%) among recipients whose donors had died of asphyxiation or drowning compared with other causes (9.5%; P = .02).
Donor death from drowning and asphyxiation also did not significantly affect rates of treatment for transplant rejection within the first year after surgery (50.8% vs. 47.4% for all other causes of donor death), or posttransplant rates of stroke (0.7% vs. 2.1%) or dialysis (5.4% vs. 5.2%), the investigators said. However, hospital length of stay averaged 0.8 days longer when donors had died of asphyxiation or drowning compared with other causes (27.3 vs. 26.5 days; P < .001).
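The paper reports these comparisons as percentages with P values but does not specify the test behind them. As a hedged illustration, the Python sketch below back-calculates approximate event counts from the published pulmonary-death rates (5.8% of the 309 recipients whose donors asphyxiated or drowned vs. 9.5% of the remaining roughly 17,896) and applies a standard two-proportion z-test, one common way such registry rate differences are checked; the reconstructed counts are assumptions, not the authors’ data.

import math

def two_proportion_z_test(x1, n1, x2, n2):
    # Two-sided z-test for the difference between two independent proportions.
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Event counts reconstructed (approximately) from the reported percentages.
z, p = two_proportion_z_test(x1=round(0.058 * 309), n1=309,
                             x2=round(0.095 * 17896), n2=17896)
print(f"z = {z:.2f}, two-sided P = {p:.3f}")  # in the neighborhood of the reported P = .02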
Dr. Jacques-Pierre Fontaine comments: The shortage of suitable donor lungs remains an important problem. Fewer than 20% of lungs offered for donation are used. The notion that asphyxiation or drowning excludes a patient from being a potential donor persists among some clinicians.
This extensive retrospective review of the robust UNOS database demonstrates that recipients of lungs from donors who died from asphyxiation or drowning have similar 10-year survival and posttransplant complication rates. In carefully selected donors, these lungs may be used successfully. Furthermore, "optimization" of marginal donor lungs may become more prevalent as ex vivo lung perfusion technology evolves.
Dr. Fontaine specializes in thoracic surgery at the Moffitt Cancer Center in Tampa, Florida.
Key clinical point: Lung transplant recipients had good outcomes and long-term survival in cases involving donors who died of asphyxiation or drowning.
Major finding: Pulmonary deaths were significantly less common (5.8%) among recipients whose donors had died of asphyxiation or drowning compared with other causes (9.5%; P = .02).
Data source: Retrospective registry analysis of 18,205 lung transplant recipients.
Disclosures: The authors did not report funding sources or conflicts of interest.
Later transplant for renal failure in lupus nephritis may raise graft failure risk
Delaying kidney transplantation to allow for quiescence of systemic lupus erythematosus–related immune activity in patients with lupus nephritis and end-stage renal disease does not appear to improve graft outcomes, according to an analysis of national surveillance data.
Of 4,743 transplant recipients with lupus nephritis and end-stage renal disease (LN-ESRD), 1,239 experienced graft failure. Overall, wait times of 3-12 months and 12-24 months were associated with 25% and 37% increased risk of graft failure, respectively, compared with wait times of less than 3 months, after adjustment for age, race, insurance, hemoglobin, and donor type.
A similar pattern was seen in white patients, except that wait times of more than 36 months in white patients were associated with a near doubling of graft failure risk (hazard ratio, 1.98), Laura C. Plantinga of Emory University, Atlanta, and her colleagues reported (Arthritis Care Res. 2014 Sept. 23 [doi:10.1002/acr.22482]).
Among black patients, longer wait times were not associated with graft failure in the adjusted analysis; in fact, there was a statistically nonsignificant suggestion of a protective effect for wait times of 2 years or more. This finding may reflect unexplained differences in disease pathology between white and black LN-ESRD patients, the investigators said, adding that there was no increased risk of graft failure in black patients who were transplanted early.
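The adjusted risk estimates reported above are the kind of output produced by a Cox proportional hazards model with wait-time categories and covariates such as age and hemoglobin, in which a hazard ratio of 1.25 corresponds to a 25% higher risk relative to the reference group. The sketch below is a minimal, hypothetical illustration of that approach in Python, using simulated data and the lifelines package; the variable names and data are assumptions for illustration, not the authors’ analysis.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000

# Simulated, purely illustrative cohort; the reference wait-time group is <3 months.
wait = rng.choice(["lt3mo", "3to12mo", "12to24mo"], size=n, p=[0.4, 0.35, 0.25])
df = pd.DataFrame({
    "wait_3to12mo": (wait == "3to12mo").astype(int),
    "wait_12to24mo": (wait == "12to24mo").astype(int),
    "age": rng.normal(40, 12, n),
    "hemoglobin": rng.normal(11, 1.5, n),
    "living_donor": rng.integers(0, 2, n),
    "years_followed": rng.exponential(8, n),   # time to graft failure or censoring
    "graft_failure": rng.integers(0, 2, n),    # 1 = graft failed, 0 = censored
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="graft_failure")
cph.print_summary()  # the exp(coef) column holds the adjusted hazard ratios

Leaving the shortest wait-time category out of the model is what makes the other categories read as hazard ratios "compared with wait times of less than 3 months," mirroring how the results above are framed.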
“Our results suggest U.S. recommendations for transplantation in LN-ESRD may not align with evidence from the target population,” they said, noting that the results should be considered hypothesis-generating because of the limitations of the study and that additional work is needed to examine the potential confounding effects of clinically recognized SLE activity on the observed associations.
Some of the investigators were supported through grants from the National Institutes of Health.
Key clinical point: Delaying transplantation in LN-ESRD patients may do more harm than good, although future studies should determine if longer wait times remain associated with increased risk of graft failure, independent of clinically recognized SLE activity.
Major finding: Overall risk of graft failure was increased by 25% and 37% with wait times of 3-12 months and 12-24 months, respectively (vs. less than 3 months).
Data source: National ESRD surveillance data (U.S. Renal Data System) for 4,743 LN-ESRD transplant recipients.
Disclosures: Some of the investigators were supported through grants from the National Institutes of Health.
Right-sided living donor kidney transplant found safe
SAN FRANCISCO – The practice of preferentially using left instead of right kidneys in living donor kidney transplantation may no longer be justified in the era of contemporary laparoscopic surgery, suggests a national study reported at the 2014 World Transplant Congress.
"The current approach in many centers is to prefer left living donor nephrectomy due to longer vessel length...Right donor nephrectomy, at least in our center and I think in most centers, has generally been reserved for cases of multiple or complex vessels on the left or incidental anatomical abnormalities on the right like cysts or stones," commented presenting author Dr. Tim E. Taber of Indiana University in Indianapolis.
Only one in seven of the roughly 59,000 living donor kidney transplants studied was performed using a right kidney. However, most short- and long-term outcomes were statistically indistinguishable between recipients of left and right kidneys, and the differences that were significant were small, he reported at the congress sponsored by the American Society of Transplant Surgeons.
"Our [study] is the largest national analysis or most recent large data analysis done on this subject in today’s surgical era of established laparoscopic living donor nephrectomies. There may be a minor risk for slightly inferior outcomes with right versus left kidneys," Dr. Taber concluded.
"Right-donor nephrectomy continues to be performed with great reluctance," he added. Yet, "under the accepted principles of live-donor nephrectomy, with enough surgical expertise, right-donor nephrectomy can be performed successfully. Right kidneys seem to have a very small difference, if any, in outcomes as compared to left kidneys. Surgical expertise and experience should be tailored toward this aspect."
A session attendee from Brazil commented, "We [prefer] to choose the right kidney in situations where we have one artery on the right side and multiple arteries on the left side." In these cases, his group uses an approach to the vasculature adopted from pancreas transplantation. "We have identical results with the right and left side," he reported.
Dr. Lloyd E. Ratner, director of renal and pancreatic transplantation at Columbia University Medical Center in New York, who also attended the session, said, "I feel somewhat responsible for causing this problem with the right kidney because we were the ones that originally described the higher thrombosis rate with the right kidney with the laparoscopic donor nephrectomies. And I think it scared everyone off from this topic."
As several attendees noted, "there are surgical ways of getting around this," he agreed, offering two more options. "The first is that if we get a short vein, we’re not reluctant at all to put a piece of Dacron onto it, so you don’t even need to dig out the saphenous and cause additional time or morbidity to the patient. And the nice thing about the Dacron grafts is that they are corrugated and they don’t collapse. They also stretch, so you don’t need to cut them exactly precisely," he said.
"And number two is when you are stapling ... it’s often useful to be able to staple onto the cava and not get the vein in one staple byte." By using two passes in the appropriate configuration, "you actually get a cuff of cava, then you have plenty of vein," he explained.
In the study, Dr. Taber and colleagues retrospectively analyzed data from 58,599 adult living donor kidney transplants performed during 2000-2009 and captured in the United Network for Organ Sharing (UNOS) database. In 86% of cases, surgeons used the donor’s left kidney.
Recipients of left and right kidneys were statistically indistinguishable with respect to hospital length of stay, treatment for acute rejection within 6 months, acute rejection as a cause of graft failure, inadequate urine production in the first 24 hours, primary graft failure, graft thrombosis or surgical complication as a contributory cause of graft failure, and 1-year graft survival.
Those receiving a right kidney did have significant but small increases in rates of delayed graft function, as defined by the need for dialysis within 7 days of transplantation (5.7% vs. 4.2%), lack of decline in serum creatinine in the first 24 hours (19.7% vs. 16.4%), treatment for acute rejection within 1 year (12.7% vs. 11.8%), and graft thrombosis as the cause of graft failure (1.1% vs. 0.8%).
The Kaplan-Meier cumulative rate of graft survival was better for left kidneys than for right kidneys (P = .006), but "these are essentially superimposed numbers," said Dr. Taber, who disclosed no conflicts of interest related to the research.
The study had limitations, such as its retrospective design, lack of more detailed information about donor and recipient outcomes, and reliance on data as reported by centers, he acknowledged. Also, such large studies may pick up small differences that are not clinically meaningful.
"With ever-increasing demands for living donor transplantation, right-donor nephrectomies are being considered more often. Every effort should be made to leave the donor with the higher-functioning kidney, but at the same time maximizing the living donor pool," Dr. Taber concluded.
Dr. Taber disclosed no relevant conflicts of interest.
FROM THE 2014 WORLD TRANSPLANT CONGRESS
Key clinical point: The side of the donor kidney (left vs. right) need not be a deciding factor in living donor kidney transplantation.
Major finding: Recipients of left and right kidneys were statistically indistinguishable with respect to hospital length of stay, treatment for acute rejection within 6 months, acute rejection as a cause of graft failure, inadequate urine production in the first 24 hours, and primary graft failure.
Data source: A national retrospective cohort study of 58,599 adult living donor kidney transplants performed during 2000-2009.
Disclosures: Dr. Taber disclosed no relevant conflicts of interest.
Liver grafts donated after circulatory death increase early risk of diabetes
SAN FRANCISCO – The type of liver graft used in transplantation plays a large role in early development of new-onset diabetes, according to a retrospective study of 430 patients from the United Kingdom.
A team led by Dr. Hermien Hartog, an honorary clinical fellow in the Liver Unit, Queen Elizabeth Hospital, Birmingham, England, studied patients undergoing primary liver transplant between 2008 and 2012. Patients were excluded from the study if they had preexisting diabetes, had died, or had undergone retransplantation within 90 days.
The investigators assessed both the development of new-onset diabetes after transplant (NODAT), using criteria adapted from a published article (Transplantation 2013;96:58-64), and its resolution, defined as the date of cessation of antihyperglycemic therapy or the last episode of hyperglycemia.
Seventy-nine percent of the patients received grafts donated after brain death (DBD), Dr. Hartog reported at the 2014 World Transplant Congress. Among the recipients of grafts donated after circulatory death (DCD), the mean warm ischemic time was 21 minutes.
With a median follow-up of 2.5 years, the cumulative 1-year incidence of NODAT was 19% in the entire cohort, with a median time to onset of 30 days. In the 44% of affected patients whose NODAT resolved, the median time to resolution was 150 days post transplantation, Dr. Hartog reported at the congress, which was sponsored by the American Society of Transplant Surgeons.
The cumulative 1-year incidence of NODAT was 23% in DCD graft recipients and 18% in DBD graft recipients, a nonsignificant difference. But when patients were stratified by graft type, "we saw an early occurrence and high peak incidence of NODAT in DCD graft recipients. Also, a larger proportion of these patients resolved their NODAT over time," she commented.
The overall temporal pattern suggested that "the effect that we see of graft type seems to be temporary and [lessens] over time when multifactorial factors come into play," according to Dr. Hartog.
In multivariate analyses, the risk of NODAT within 90 days of transplantation was higher for patients who received a DCD graft (hazard ratio, 1.8). More detailed analysis showed that the elevation of risk was greatest within the first 15 days.
"Our study confirms known associations with NODAT after liver transplantation but identifies DCD graft as a novel risk factor. This causes a temporary effect in the early post-transplant period that is independent from known risk factors," Dr. Hartog commented.
"Based on our observations, we hypothesize that hyperglycemia may be related to liver graft function through ischemia-reperfusion–induced hepatic insulin resistance," she added. "We are currently trying to confirm our data in an independent data set, which will also include postreperfusion glucose levels and correlation with the insulin receptor pathway in time-zero liver biopsies."
"The clinical relevance of our findings is as yet unknown," she acknowledged. However, they may help inform new approaches for graft optimization and selection.
Session cochair Dr. Darius Mirza, also of the University of Birmingham, asked, "Why does the pattern of recovery seem to be different in the DCDs versus the DBDs? Also, why are the cumulative incidence and the time frame so different?"
"Actually, in the literature, I have not seen any reports looking at the early post-transplant period. So most reports look at one time point, normally 1 year," Dr. Hartog replied. "What I think is that there is an early peak caused by DCD grafts that would explain why there is an early peak, but also why those patients recover later on. I think this peak is a bit obscure because there are also other factors that come into play, maybe after a while, that will obscure that first peak. If you would take those other factors out of the equation, I think you would just see a peak in the early period."
Dr. Mirza also wondered about the role of using DCD grafts that are accepted under extended criteria. "So you start off using mainly young, fit DCD livers. Now, the vast majority are extended-criteria DCD livers. Do you think that plays a role, or is it too early to say?"
"Yes, I think so," Dr. Hartog said, while adding that this phenomenon is likely not restricted to DCD grafts. "From earlier literature, there is a clear difference between a living donated graft and deceased donation. And it might also be that the extended grafts or the more steatotic grafts may exhibit this effect more than the better grafts."
Dr. Hartog disclosed no conflicts of interest relevant to the research.
AT THE 2014 WORLD TRANSPLANT CONGRESS
Key clinical point: Recipients of liver grafts donated after circulatory death are at a slightly higher risk for post-transplant new-onset diabetes.
Major finding: The risk of new-onset diabetes within 90 days of transplantation was 1.8-fold higher for patients who received a DCD graft than for peers who received a DBD graft.
Data source: A retrospective cohort study of 430 primary liver transplant recipients.
Disclosures: Dr. Hartog disclosed no relevant conflicts of interest.
VEGF-A value may stratify risk in pediatric heart transplant recipients
SAN FRANCISCO – Monitoring plasma vascular endothelial growth factor A (VEGF-A) may help identify pediatric heart transplant patients who are at increased risk for poor outcomes, according to a study reported at the 2014 World Transplant Congress.
"Cardiac allograft vasculopathy [CAV] remains the leading cause of chronic allograft failure after heart transplantation. ... Therefore, it’s important for us to be able to anticipate the development of CAV and open up a therapeutic window," said Dr. Kevin P. Daly of Harvard Medical School and Boston Children’s Hospital.
"Our pilot data suggest that plasma VEGF-A levels below 90 pg/mL identify a low-risk patient population in whom a decreased frequency of coronary angiography can be considered. Future studies are needed to determine if using plasma VEGF-A levels to modify CAV screening frequency results in equivalent patient outcomes, with decreased resource utilization and improved quality of life," Dr. Daly commented at the congress, which was sponsored by the American Society of Transplant Surgeons.
As the vascular endothelium is the primary target of the immune response in CAV, the researchers hypothesized that VEGF-A likely contributes to an inflammatory cycle that leads to vascular damage and occlusion in the graft.
Participants in the single-center prospective cohort study were 44 consecutive children aged 2 years or older who were at least 18 months (median, 6 years) out from heart transplantation. They were scheduled for routine annual screening coronary angiography during 2009, and had no or mild CAV.
Moderate or severe CAV developed in 32% of patients who had VEGF-A values above the median value at baseline (90 pg/mL), compared with 5% of patients who had VEGF-A values below the median level (P = .02). Patients who developed this vasculopathy were more likely to die (38% vs. 0%), undergo retransplantation (38% vs. 0%), experience a myocardial infarction (12% vs. 0%), and be listed for retransplantation (12% vs. 0%).
"While this is a biomarker and we have shown it is associated with CAV, we have not shown that it is causal," Dr. Daly cautioned. Any treatment directed against VEGF would have to be conducted in the context of a clinical trial to assess its impact.
A subset of patients becomes nonadherent to therapy; a subset that is highly sensitized before transplant may have donor-specific antibody, Dr. Daly said. So "we don’t think we fully understand the inciting event, ... [but] VEGF-A has been shown before to be elevated in antibody-mediated rejection, so it’s not surprising to see this association."
"We didn’t have these data available clinically because it was all a research study, so we didn’t intervene on any of the patients in this cohort. But we have started to think about whether or not we could use VEGF-A levels at least in our ... patients who might not have arterial access, and it might be difficult to survey them for CAV. I think in order to really understand the appropriate way to use it, we would need a larger study," he remarked.
Dr. Daly disclosed that he had no conflicts of interest relevant to the study.
This was a very nice preliminary study but extremely limited in scope, as it had few patients and limited mechanistic studies. Most importantly, there was no validation cohort, as is required to have confidence that a biomarker is predictive. A lot more work will be necessary before significance can be assigned to the use of VEGF-A as a potential biomarker.
Dr. Daniel R. Salomon is a professor and program medical director at the Scripps Center for Organ Transplantation, Scripps Research Institute, La Jolla, Calif. He was the cochair at the session where the research was presented, and made his remarks in an interview. He had no relevant conflicts of interest.
AT THE 2014 WORLD TRANSPLANT CONGRESS
Key clinical point: Plasma VEGF-A levels may be a biomarker of risk for pediatric heart transplant patients.
Major finding: Patients with plasma VEGF-A levels above the median value of 90 pg/mL had a 32% rate of moderate or severe cardiac allograft vasculopathy within 5 years.
Data source: A prospective cohort study of 44 consecutive children who had undergone heart transplantation.
Disclosures: Dr. Daly disclosed no relevant conflicts of interest.
Study outlines risk factors for solid organ cancers after liver transplantation
SAN FRANCISCO – The indication for liver transplant, the selection of immunosuppression therapy, and smoking status influence the long-term risk of new solid organ malignancies after liver transplantation, Dr. Sebastian Rademacher reported at the 2014 World Transplant Congress.
Multivariate analysis showed that recipients’ risk of a new solid organ cancer was elevated for those who had a history of smoking (1.89). Risk was reduced for recipients who received tacrolimus, compared with cyclosporine A (0.56), and for patients who had primary biliary cirrhosis/primary sclerosing cholangitis (0.47) or hepatitis C infection (0.21) as the indication for transplantation.
"I think we have to reoptimize and reevaluate the currently used immunosuppressive regimens," Dr. Rademacher said. "We have to adapt cancer surveillance programs for high-risk patients. Further studies into surveillance protocols and surrogate markers and long-term outcomes are recommended."
Researchers led by Dr. Rademacher, a surgeon at the Campus Virchow Clinic, Charité, Berlin, retrospectively studied 1,179 consecutive adults who underwent liver transplantation between 1988 and 2002 and had follow-up evaluations until 2013. Patients were 47 years old, on average, at the time of transplantation, and the median follow-up was 13.3 years.
Their 20-year cumulative incidence of solid organ cancers was 14%, he reported at the congress, which was sponsored by the American Society of Transplant Surgeons. The mean age at cancer diagnosis was 56 years.
The researchers used age- and sex-matched individuals from the German general population for comparison. The standardized incidence ratio (the number of cancers observed in transplant recipients divided by the number expected in the matched general population) was 1.2 for breast cancer, 9.4 for cancer of the oropharynx and larynx, 1.7 for cancers of the colon and rectum, 3.0 for lung cancer, 3.9 for esophageal and stomach cancers, 4.5 for kidney and bladder cancers, and 4.6 for cancers of the female genitourinary system.
"We tried to evaluate the different immunosuppressive regimens and, over time, we had, I think, 27 different primary regimens," Dr. Rademacher said. Steroid-free regimens and low-dose steroid were part of that consideration, "but we segregated them out. For the five most frequent regimens, there was no significance. We assessed immunosuppressive regimens given over at least 2 years, but there was no difference between the regimens. Also, the trough levels of tacrolimus did not have any significant influence," he said.
The investigators did not have data on cumulative immunosuppression or mTOR [mammalian target of rapamycin] inhibitors, which were introduced late in the study period, according to Dr. Rademacher, who disclosed no relevant conflicts of interest. A surrogate marker of immunosuppression, rejection frequency, did not significantly predict the development of solid organ malignancies.
The devil is in the details of this study. The incidence of solid organ tumors being high in the immunosuppressed population is well known, well documented. The difficulty is getting at what is driving that risk.
Lots of things have changed in immunosuppressive therapy over the last 20-25 years. The authors give us a snapshot, but they weren’t able to tell us whether the changes in immunosuppression had any impact on cancer risk, especially in regard to specific types of cancers.
Dr. Darius Mirza of the University of Birmingham, England, was the session cochair at the meeting. He made his remarks in an interview after the session and declared having no relevant conflicts of interest.
AT THE 2014 WORLD TRANSPLANT CONGRESS
Key clinical point: Immunosuppression regimen selection influences risk for solid cancers after liver transplantation.
Major finding: The risk of a new solid organ cancer was reduced for liver transplant recipients who received tacrolimus rather than cyclosporine A for immunosuppression (0.56).
Data source: A retrospective cohort study of 1,179 adults who underwent liver transplantation between 1988 and 2002.
Disclosures: Dr. Rademacher disclosed no relevant conflicts of interest.
Early elimination of cyclosporine after heart transplant has renal benefit
SAN FRANCISCO – Use of an everolimus-containing regimen with early stopping of cyclosporine after de novo heart transplantation improves renal function and reduces cardiac allograft vasculopathy, without compromising graft outcomes, new data suggest.
This was among the key findings of the randomized, open-label SCHEDULE trial (Scandinavian Heart Transplant Everolimus De Novo Study with Early Calcineurin Inhibitor Avoidance), reported at the 2014 World Transplant Congress.
"Renal dysfunction and cardiac allograft vasculopathy are markers for increased morbidity and mortality after heart transplantation," lead author Dr. Vilborg Sigurdardottir commented when introducing the study.
Patients in the trial were randomized evenly to a three-drug regimen containing the calcineurin inhibitor cyclosporine (Sandimmune) or to a four-drug regimen also containing the mTOR inhibitor everolimus (Zortress) with discontinuation of cyclosporine at week 7-11. Everolimus is currently approved by the Food and Drug Administration to prevent graft rejection in kidney and liver transplant recipients and, under another brand name, to treat some cancers.
Measured glomerular filtration rate (GFR) at 12 months, the trial’s primary outcome, was 30% better in the everolimus group than in the cyclosporine group (79.8 vs. 61.5 mL/min per 1.73 m²; P less than .001), according to results presented at the congress and recently published (Am. J. Transplant. 2014;14:1828-38).
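For readers who want to check that figure, the roughly 30% relative advantage follows from the reported group means; this is a back-of-the-envelope calculation based on the published values, not an analysis reported by the trial itself:

$$\frac{79.8 - 61.5}{61.5} \approx 0.30$$

That is, measured GFR was about 30% higher, on average, in the everolimus group than in the cyclosporine group.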
The urinary albumin-creatinine ratio was higher in the everolimus group, but none of the patients had nephrotic levels of proteinuria.
Rates of adverse events were similar, with the exception that the everolimus group had a lower rate of cytomegalovirus infection (5% vs. 30%) and a higher rate of pneumonia (12% vs. 3%), Dr. Sigurdardottir reported at the congress, which was sponsored by the American Society of Transplant Surgeons.
The incidence of biopsy-proven acute rejection of at least grade 2R was greater with everolimus (40% vs. 18%, P = .01). However, at 12 months, the groups did not differ with respect to left ventricular function as assessed by echocardiography and biomarkers, and, in a cardiac reserve substudy, with respect to cardiac output and pulmonary capillary wedge pressure.
The incidence of cardiac allograft vasculopathy, defined as a mean media-intima thickness of at least 0.5 mm on intravascular ultrasound (IVUS), was lower in the everolimus group (51% vs. 65%, P less than .01), and progression assessed as the change in percent atheroma volume was slower in that group.
"Everolimus initiation and early cyclosporine elimination in de novo heart transplant recipients showed a highly significant improvement of renal function in terms of measured GFR, a reduced incidence of cytomegalovirus [a confirmatory result of previous large-scale studies], similar numbers of adverse and serious adverse events, and an increased incidence of treated acute rejection, however, without hemodynamic compromise and with preserved cardiac function and preserved cardiac reserve," concluded Dr. Sigurdardottir, who is medical director of heart transplantation at the Transplant Institute, Sahlgrenska University Hospital, Gothenburg, Sweden. "We saw also favorable coronary remodeling and less graft vasculopathy, as previously shown."
Among patients whose donor hearts had such disease, the increase in media-intima thickness and percent atheroma volume was less with everolimus than with cyclosporine, Dr. Sigurdardottir said. "Interestingly, we saw here that the total atheroma volume decreased between baseline and 12 months in the everolimus group in the patients who had preexisting donor disease."
An attendee from Norway said, "I am a nephrologist, and if I were to get a new heart, I’d rather have a GFR of 61 and no rejection than a GFR of 73 with rejection. Have you looked at the development of donor-specific antibodies in the ones who had rejection, because I’d like to live for more than a year – I’d like to live 3 years or 5 years or 10 years."
"You are absolutely right. At the time of transplantation, we would be looking at the acute problems, and we often see the kidney dysfunction, so we want to do something about that. But of course these studies need to tell us how patients fare longer term," Dr. Sigurdardottir agreed. None of the patients were found to have donor-specific antibodies, but the trial protocol did not mandate routine measurement, she said.
An attendee from Los Angeles commented, "We tried to do CNI [calcineurin inhibitor] weaning in 2006 and had hemodynamically compromised rejection. Now, I congratulate you on being innovative and having quadruple therapy from the get-go and then taking off the CNI. But the issue of increased rejection is important because ISHLT [International Society for Heart and Lung Transplantation] data show that that does lead to poorer outcome. It is countered by your improvement in renal function, but also your IVUS result, I think, is very important as well."
"Rejection is an important issue, but it is a common issue after transplantation. It was usually manageable. Since we didn’t see any hemodynamic compromise, it was up to each investigator to evaluate what to do. There were nine patients who converted to combination therapy," Dr. Sigurdardottir reported. "The future needs to tell us what the relevance of this rejection is, and we will do a follow-up at 3 and 5 years."
Dr. Sigurdardottir disclosed no relevant conflicts of interest. The trial was sponsored by Novartis, manufacturer of everolimus.
SAN FRANCISCO – Use of an everolimus-containing regimen with early stopping of cyclosporine after de novo heart transplantation improves renal function and reduces cardiac allograft vasculopathy, without compromising graft outcomes, new data suggest.
These were among the key findings of the randomized, open-label SCHEDULE trial (Scandinavian Heart Transplant Everolimus De Novo Study with Early Calcineurin Inhibitor Avoidance) reported at the 2014 World Transplant Congress.
"Renal dysfunction and cardiac allograft vasculopathy are markers for increased morbidity and mortality after heart transplantation," lead author Dr. Vilborg Sigurdardottir commented when introducing the study.
Patients in the trial were randomized evenly to a three-drug regimen containing the calcineurin inhibitor cyclosporine (Sandimmune) or to a four-drug regimen that also contained the mTOR inhibitor everolimus (Zortress), with discontinuation of cyclosporine at weeks 7-11. Everolimus is currently approved by the Food and Drug Administration to prevent graft rejection in kidney and liver transplant recipients and, under another brand name, to treat some cancers.
Measured glomerular filtration rate (GFR) at 12 months, the trial’s primary outcome, was 30% better in the everolimus group than in the cyclosporine group (79.8 vs. 61.5 mL/min per 1.73 m2; P less than .001), according to results presented at the congress and recently published (Am. J. Transplant. 2014;14:1828-38).
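The roughly 30% figure can be verified directly from the two reported group means. The short sketch below is purely illustrative (the variable names are ours, not part of the trial's analysis) and simply recomputes the relative difference.

```python
# Minimal sketch: relative improvement in measured GFR at 12 months,
# recomputed from the two group means reported above (mL/min per 1.73 m2).
# Variable names are illustrative, not from the trial.
gfr_everolimus = 79.8
gfr_cyclosporine = 61.5

relative_improvement = (gfr_everolimus - gfr_cyclosporine) / gfr_cyclosporine
print(f"Relative improvement: {relative_improvement:.1%}")  # ~29.8%, i.e. about 30%
```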
The urinary albumin-creatinine ratio was higher in the everolimus group, but none of the patients had nephrotic levels of proteinuria.
Rates of adverse events were similar, with the exception that the everolimus group had a lower rate of cytomegalovirus infection (5% vs. 30%) and a higher rate of pneumonia (12% vs. 3%), Dr. Sigurdardottir reported at the congress, which was sponsored by the American Society of Transplant Surgeons.
The incidence of biopsy-proven acute rejection of at least grade 2R was greater with everolimus (40% vs. 18%, P = .01). However, at 12 months, the groups did not differ with respect to left ventricular function as assessed by echocardiography and biomarkers, and, in a cardiac reserve substudy, with respect to cardiac output and pulmonary capillary wedge pressure.
The incidence of cardiac allograft vasculopathy, defined as a mean media-intima thickness of at least 0.5 mm on intravascular ultrasound (IVUS), was lower in the everolimus group (51% vs. 65%, P less than .01), and progression assessed as the change in percent atheroma volume was slower in that group.
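Purely as an illustration of the IVUS criterion described above (the function and names below are hypothetical, not taken from the study protocol), the vasculopathy definition reduces to a simple threshold check on the measured thickness.

```python
# Illustrative sketch of the IVUS criterion described above: cardiac
# allograft vasculopathy is counted when mean media-intima thickness
# is at least 0.5 mm. Names are hypothetical, not from the protocol.
CAV_THRESHOLD_MM = 0.5

def meets_cav_criterion(mean_media_intima_thickness_mm: float) -> bool:
    """Return True if the IVUS measurement meets the vasculopathy threshold."""
    return mean_media_intima_thickness_mm >= CAV_THRESHOLD_MM

print(meets_cav_criterion(0.45))  # False
print(meets_cav_criterion(0.62))  # True
```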
"Everolimus initiation and early cyclosporine elimination in de novo heart transplant recipients showed a highly significant improvement of renal function in terms of measured GFR, a reduced incidence of cytomegalovirus [a confirmatory result of previous large-scale studies], similar numbers of adverse and serious adverse events, and an increased incidence of treated acute rejection, however, without hemodynamic compromise and with preserved cardiac function and preserved cardiac reserve," concluded Dr. Sigurdardottir, who is medical director of heart transplantation at the Transplant Institute, Sahlgrenska University Hospital, Gothenburg, Sweden. "We saw also favorable coronary remodeling and less graft vasculopathy, as previously shown."
Among patients whose donor hearts had preexisting coronary disease, the increase in media-intima thickness and percent atheroma volume was smaller with everolimus than with cyclosporine, Dr. Sigurdardottir said. "Interestingly, we saw here that the total atheroma volume decreased between baseline and 12 months in the everolimus group in the patients who had preexisting donor disease."
An attendee from Norway said, "I am a nephrologist, and if I were to get a new heart, I’d rather have a GFR of 61 and no rejection than a GFR of 73 with rejection. Have you looked at the development of donor-specific antibodies in the ones who had rejection, because I’d like to live for more than a year – I’d like to live 3 years or 5 years or 10 years."
"You are absolutely right. At the time of transplantation, we would be looking at the acute problems, and we often see the kidney dysfunction, so we want to do something about that. But of course these studies need to tell us how patients fare longer term," Dr. Sigurdardottir agreed. None of the patients were found to have donor-specific antibodies, but the trial protocol did not mandate routine measurement, she said.
An attendee from Los Angeles commented, "We tried to do CNI [calcineurin inhibitor] weaning in 2006 and had hemodynamically compromised rejection. Now, I congratulate you on being innovative and having quadruple therapy from the get-go and then taking off the CNI. But the issue of increased rejection is important because ISHLT [International Society for Heart and Lung Transplantation] data show that that does lead to poorer outcome. It is countered by your improvement in renal function, but also your IVUS result, I think, is very important as well."
"Rejection is an important issue, but it is a common issue after transplantation. It was usually manageable. Since we didn’t see any hemodynamic compromise, it was up to each investigator to evaluate what to do. There were nine patients who converted to combination therapy," Dr. Sigurdardottir reported. "The future needs to tell us what the relevance of this rejection is, and we will do a follow-up at 3 and 5 years."
Dr. Sigurdardottir disclosed no relevant conflicts of interest. The trial was sponsored by Novartis, manufacturer of everolimus.
FROM THE 2014 WORLD TRANSPLANT CONGRESS
Key clinical point: For post–heart transplant patients, early cessation of cyclosporine from an everolimus-containing regimen appears safe and does not compromise graft outcomes.
Major finding: Compared with patients continued on cyclosporine, patients taken off this agent at 7-11 weeks had a 30% better measured glomerular filtration rate at 12 months.
Data source: A randomized, open-label trial of 115 patients undergoing de novo heart transplantation.
Disclosures: Dr. Sigurdardottir disclosed no relevant conflicts of interest. The trial was sponsored by Novartis, manufacturer of everolimus.