Drugs produce comparable results in CP-CML

Article Type
Changed
Wed, 07/06/2016 - 05:00
Display Headline
Drugs produce comparable results in CP-CML

Imatinib tablet

Long-term results from the DASISION trial suggest that dasatinib and imatinib produce similar outcomes in patients with newly diagnosed chronic phase chronic myeloid leukemia (CP-CML).

Although patients who received dasatinib experienced faster and deeper molecular responses than patients who received imatinib, the overall survival and progression-free survival rates were similar between the treatment arms.

Overall, adverse events (AEs) were similar between the arms as well.

Researchers said these results suggest that dasatinib should continue to be considered an option for patients with newly diagnosed CP-CML.

The team reported the results of this study in the Journal of Clinical Oncology. The research was sponsored by Bristol-Myers Squibb.

The trial enrolled 519 patients with newly diagnosed CP-CML. They were randomized to receive dasatinib at 100 mg once daily (n=259) or imatinib at 400 mg once daily (n=260). Baseline characteristics were well-balanced between the arms.

At 5 years of follow-up, 61% of patients in the dasatinib arm and 63% of patients in the imatinib arm remained on treatment.

Response and survival

The cumulative 5-year rate of major molecular response was 76% in the dasatinib arm and 64% in the imatinib arm (P=0.0022). The rates of MR4.5 were 42% and 33%, respectively (P=0.0251).

The estimated 5-year overall survival was 91% in the dasatinib arm and 90% in the imatinib arm (hazard ratio=1.01; 95% CI, 0.58 to 1.73).

The estimated 5-year progression-free survival was 85% and 86%, respectively (hazard ratio=1.06; 95% CI, 0.68 to 1.66).

Safety

In both treatment arms, most AEs were grade 1 or 2. Grade 3/4 AEs occurred in 15% of patients in the dasatinib arm and 11% of patients in the imatinib arm.

Rates of grade 3/4 hematologic AEs tended to be higher in the dasatinib arm than in the imatinib arm.

However, rates of most drug-related, nonhematologic AEs were lower in the dasatinib arm than in the imatinib arm or were comparable between the arms.

The exception was drug-related pleural effusion, which was more common with dasatinib (28%) than with imatinib (0.8%).

Drug-related AEs were largely manageable, although they led to treatment discontinuation in 16% of dasatinib-treated patients and 7% of imatinib-treated patients.

By 5 years, 26 patients (10%) in each treatment arm had died. Nine patients in the dasatinib arm died of disease progression, as did 17 patients in the imatinib arm.


EC expands approved use of carfilzomib

Article Type
Changed
Wed, 07/06/2016 - 05:00
Display Headline
EC expands approved use of carfilzomib

Carfilzomib (Kyprolis)

Photo from Amgen

The European Commission (EC) has expanded the approved use of the proteasome inhibitor carfilzomib (Kyprolis).

The drug is now approved for use in combination with dexamethasone to treat adults with multiple myeloma (MM) who have received at least 1 prior therapy.

Carfilzomib was previously approved by the EC for use in combination with lenalidomide and dexamethasone to treat adult MM patients who have received at least 1 prior therapy.

The EC approved the extended indication for carfilzomib based on data from the phase 3 ENDEAVOR trial.

The trial included 929 MM patients whose disease had relapsed after 1 to 3 prior therapeutic regimens.

The patients received either carfilzomib plus dexamethasone (n=464) or bortezomib plus dexamethasone (n=465) until disease progression.

The primary endpoint was progression-free survival. The median progression-free survival was 18.7 months in the carfilzomib arm and 9.4 months in the bortezomib arm. The hazard ratio was 0.53 (P<0.0001).

Overall survival data were not yet mature at last follow-up.

Treatment discontinuation due to adverse events and on-study deaths were comparable between the 2 treatment arms.

However, a number of known adverse events were reported at a higher rate in the carfilzomib arm than the bortezomib arm, including dyspnea (28% vs 13%), hypertension (25% vs 3%), pyrexia (27% vs 14%), cough (25% vs 15%), cardiac failure (8% vs 3%), and acute renal failure (8% vs 5%).

Carfilzomib is marketed as Kyprolis by Onyx Pharmaceuticals, Inc., a subsidiary of Amgen that holds development and commercialization rights to the drug globally, with the exception of Japan.


How multiple infections make malaria worse

Article Type
Changed
Wed, 07/06/2016 - 05:00
Display Headline
How multiple infections make malaria worse

Plasmodium sporozoite

Image by Ute Frevert and Margaret Shear

New research suggests that infections with 2 types of malaria parasite lead to greater health risks because 1 species helps the other thrive.

Investigators sought to understand what happens when the 2 most common malaria parasites cause infection at the same time, as they are known to attack the body in different ways.

The team found the first parasite helps provide the second with more of the resources it needs to prosper.

“Immune responses are assumed to determine the outcome of interactions between parasite species, but our study clearly shows that resources can be more important,” said Sarah Reece, of the University of Edinburgh in Scotland.

“Our findings also challenge ideas that 1 species will outcompete the other, which explains why infections involving 2 parasite species can pose a greater health risk to patients.”

Dr Reece and her colleagues recounted these findings in Ecology Letters.

In humans, the malaria parasite Plasmodium falciparum infects red blood cells of all ages, while the Plasmodium vivax parasite attacks only young red blood cells.

The current study, conducted in mice with equivalent malaria parasites (P chabaudi and P yoelii), showed that the body’s response to the first infection produces more of the type of red blood cell the second parasite needs.

In response to the first infection, millions of red blood cells are destroyed. The body responds by replenishing these cells.

The fresh cells then become infected by the second type of parasite, making the infection worse.

The investigators said these results appear to explain why infections with both P falciparum and P vivax often have worse outcomes for patients than infections with a single malaria parasite.


Study explains link between malignant hyperthermia and bleeding abnormalities

Article Type
Changed
Wed, 07/06/2016 - 05:00
Display Headline
Study explains link between malignant hyperthermia and bleeding abnormalities

Lab mouse

A new study helps explain why some patients with malignant hyperthermia may suffer from excessive bleeding.

The findings suggest a mutation that causes malignant hyperthermia can disrupt calcium signaling in vascular smooth muscle cells, leading to bleeding abnormalities.

What’s more, researchers found that a drug clinically approved to treat muscle-related symptoms in malignant hyperthermia helped stop bleeding.

Rubén Lopez, of Basel University Hospital in Switzerland, and his colleagues conducted this research and reported their findings in Science Signaling.

Patients with malignant hyperthermia experience dangerously high fever and severe muscle contractions when exposed to general anesthesia.

Malignant hyperthermia is often caused by mutations in the RYR1 gene, which encodes a calcium channel in skeletal muscle called ryanodine receptor type 1 (RyR1).

For some patients with these mutations, malignant hyperthermia is accompanied by a mild bleeding disorder, but whether the 2 conditions are connected has not been clear.

Working in a mouse model of malignant hyperthermia, researchers found that vascular smooth muscle cells with mutated RyR1 displayed frequent spikes in calcium levels, known as calcium sparks. These sparks led to excessive vasodilation and prolonged bleeding.

Blocking the receptor with dantrolene, a drug used to treat malignant hyperthermia, helped reduce bleeding in the mice and in a single human patient, pointing to an unexpected benefit from the drug.

The findings suggest that mutations in RyR1, which is also found in other types of smooth muscle cells such as those in the bladder and uterus, may have a wider range of effects than previously thought.


David Henry's JCSO podcast, July 2016

Article Type
Changed
Fri, 01/04/2019 - 11:13
Display Headline
David Henry's JCSO podcast, July 2016

In the July podcast for The Journal of Community and Supportive Oncology, Dr David Henry discusses an editorial by Dr Linda Bosserman, in which she presents the case for pathways and the importance of processes and teamwork in paving the way for value-based care. Survivor care is the focus of 2 Original Reports, in which investigators report on adolescent and young adult perceptions of cancer survivor care and supportive programming and on the symptoms, unmet need, and quality of life among recent breast cancer survivors. Also in the Original Report section are reports on the impact of loss of income and medicine costs on the financial burden for cancer patients in Australia and on the use of a gene panel to test for hereditary ovarian cancer. Dr Henry also looks at 2 Case Reports, one in which a patient undergoes multivisceral resection for growing teratoma syndrome and another in which a patient presents with aleukemic acute lymphoblastic leukemia with unusual clinical features. Diabetes management in cancer patients is the topic of a lengthy and informative interview between Dr Henry and Dr Todd Brown.

 

Listen to the podcast below.

 


Absorb bioresorbable vascular scaffold wins FDA approval

Article Type
Changed
Tue, 07/21/2020 - 14:18
Display Headline
Absorb bioresorbable vascular scaffold wins FDA approval

The Food and Drug Administration approved the first fully absorbable vascular scaffold designed for use in coronary arteries, the Absorb GT1 bioresorbable vascular scaffold system, made by Abbott.

Concurrent with the FDA’s announcement on July 5, the company said that it plans to start immediate commercial rollout of the Absorb bioresorbable vascular scaffold (BVS). Initial availability will be limited to the roughly 100 most active sites that participated in the ABSORB III trial, the pivotal study that established noninferiority of the BVS, compared with a state-of-the-art metallic coronary stent during 1-year follow-up, according to a company spokesman.

Dr. Hiram G. Bezerra

However, the ABSORB III results, reported in October 2015, did not demonstrate superiority of the BVS over a metallic stent. The advantages of a BVS remain unproven for now. They rest on the prospect that devices used in percutaneous coronary interventions will slowly degrade away, eliminating the residual metallic structure in a patient’s coronaries and the long-term threat it could pose of thrombosis or interference with subsequent coronary procedures.

“All the potential advantages are hypothetical at this point,” said Hiram G. Bezerra, MD, an investigator in the ABSORB III trial and director of the cardiac catheterization laboratory at University Hospitals Case Medical Center in Cleveland. However, “if you have a metallic stent it lasts a lifetime, creating a metallic cage” that could interfere with a possible later coronary procedure or be the site for thrombus formation. Disappearance of the BVS also creates the possibility for eventual restoration of more normal vasomotion in the coronary wall, said Dr. Bezerra, a self-professed “enthusiast” for the BVS alternative.

A major limiting factor for BVS use today is coronary diameter because the Absorb BVS is bulkier than metallic stents. The ABSORB III trial limited use of the BVS to coronary vessels with a reference-vessel diameter by visual assessment of at least 2.5 mm, with an upper limit of 3.75 mm. Other limiting factors can be coronary calcification and tortuosity, although Dr. Bezerra said that these obstacles are usually overcome with a more time-consuming procedure if the operator is committed to placing a BVS.

Another variable will be the cost of the BVS. According to the Abbott spokesman, the device “will be priced so that it will be broadly accessible to hospitals.” Also, the Absorb BVS will receive payer reimbursement comparable to a drug-eluting stent using existing reimbursement codes, the spokesman said. Abbott will require inexperienced operators to take a training course to learn proper placement technique.

Dr. Bezerra admitted that he is probably an outlier in his plan to quickly make the BVS a mainstay of his practice. “I think adoption will be slow in the beginning” for most U.S. operators, he predicted. One of his Cleveland colleagues, speaking about the near-term prospects of BVS use last October when the ABSORB III results came out, predicted that the device might initially be used in about 10%-15% of patients undergoing percutaneous coronary interventions, similar to the usage level in Europe, where this BVS has been available for several years.

Dr. Bezerra has been a consultant to Abbott and St. Jude. He was an investigator on the ABSORB III trial.

[email protected]

On Twitter @mitchelzoler

References

Author and Disclosure Information

Publications
Topics
Legacy Keywords
absorb, bioresorbable vascular scaffold, Abbott, FDA, Bezerra, percutaneous coronary intervention
Author and Disclosure Information

Author and Disclosure Information

Related Articles

The Food and Drug Administration approved the first fully absorbable vascular scaffold designed for use in coronary arteries, the Absorb GT1 bioresorbable vascular scaffold system, made by Abbott.

Concurrent with the FDA’s announcement on July 5, the company said that it plans to start immediate commercial rollout of the Absorb bioresorbable vascular scaffold (BVS). Initial availability will be limited to the roughly 100 most active sites that participated in the ABSORB III trial, the pivotal study that established noninferiority of the BVS, compared with a state-of-the-art metallic coronary stent during 1-year follow-up, according to a company spokesman.

Dr. Hiram G. Bezerra

However, the ABSORB III results, reported in October 2015, failed to document any superiority of the BVS, compared with a metallic stent. The potential advantages of a BVS remain unproven for now. They rest on the long-term appeal of percutaneous coronary intervention devices that slowly degrade away, eliminating the residual metallic structure in a patient’s coronaries and the long-term threat it could pose of thrombosis or interference with subsequent coronary procedures.

“All the potential advantages are hypothetical at this point,” said Hiram G. Bezerra, MD, an investigator in the ABSORB III trial and director of the cardiac catheterization laboratory at University Hospitals Case Medical Center in Cleveland. However, “if you have a metallic stent it lasts a lifetime, creating a metallic cage” that could interfere with a possible later coronary procedure or be the site for thrombus formation. Disappearance of the BVS also creates the possibility for eventual restoration of more normal vasomotion in the coronary wall, said Dr. Bezerra, a self-professed “enthusiast” for the BVS alternative.

A major limiting factor for BVS use today is coronary diameter because the Absorb BVS is bulkier than metallic stents. The ABSORB III trial limited use of the BVS to coronary vessels with a reference-vessel diameter by visual assessment of at least 2.5 mm, with an upper limit of 3.75 mm. Other limiting factors can be coronary calcification and tortuosity, although Dr. Bezerra said that these obstacles are usually overcome with a more time-consuming procedure if the operator is committed to placing a BVS.

Another variable will be the cost of the BVS. According to the Abbott spokesman, the device “will be priced so that it will be broadly accessible to hospitals.” Also, the Absorb BVS will receive payer reimbursement comparable to a drug-eluting stent using existing reimbursement codes, the spokesman said. Abbott will require inexperienced operators to take a training course to learn proper placement technique.

Dr. Bezerra admitted that he is probably an outlier in his plan to quickly make the BVS a mainstay of his practice. “I think adoption will be slow in the beginning” for most U.S. operators, he predicted. One of his Cleveland colleagues who spoke about the near-term prospects of BVS use last October, when the ABSORB III results came out, predicted that immediate use might occur in about 10%-15% of patients undergoing percutaneous coronary interventions, similar to the usage level in Europe, where this BVS has been available for several years.

Dr. Bezerra has been a consultant to Abbott and St. Jude. He was an investigator on the ABSORB III trial.

[email protected]

On Twitter @mitchelzoler


Display Headline
Absorb bioresorbable vascular scaffold wins FDA approval

HCQ eye toxicity needs experience to assess

Article Type
Changed
Sat, 12/08/2018 - 02:48
Display Headline
HCQ eye toxicity needs experience to assess

LONDON – Retinopathy in patients taking long-term hydroxychloroquine for rheumatic conditions requires assessment by those experienced with specialized ophthalmic imaging, according to study findings presented at the European Congress of Rheumatology.

Nonspecific abnormalities, which often are unrelated to hydroxychloroquine (HCQ), can be seen with many of the tests recommended by current ophthalmology guidelines. These changes need “careful interpretation by retina specialists,” the study’s investigators wrote in a poster presentation.

HCQ is used widely for the treatment of systemic lupus erythematosus (SLE), rheumatoid arthritis, and many other inflammatory or autoimmune conditions, but it can cause irreversible eye damage, typically associated with prolonged (greater than 5 years) use. Specifically, it can cause a type of end-stage retinopathy called bull’s-eye maculopathy, in which the fovea becomes hyperpigmented, much like the bull’s-eye on a dartboard. If not caught early, this can lead to substantial vision loss (blind spots).

Dr. Syed Mahmood Ali Shah

Although it is relatively rare to develop end-stage retinopathy, there is currently no treatment for HCQ-induced retinopathy. Stopping the drug may not necessarily halt the retinal damage, and drug withdrawal may not be an option for many patients, given the lack of alternatives for treating the symptoms of SLE, study author and ophthalmologist Syed Mahmood Ali Shah, MBBS, MD, said in an interview.

Dr. Shah and his associates at Johns Hopkins University in Baltimore reported on applying the 2011 American Academy of Ophthalmology (AAO) guidelines on screening for HCQ retinopathy (Ophthalmology. 2011;118:415-22) to an academic practice. They also estimated the prevalence of HCQ retinopathy among 135 consecutively treated patients with SLE using recommended tests. The mean duration of HCQ use was 12.5 years.

The 2011 AAO guidelines – which in March 2016 were updated (Ophthalmology 2016 Jun;123:1386-94) – recommended the use of three “ancillary” tests in addition to the usual clinical ophthalmic examination and assessment of visual fields: optical coherence tomography (OCT), fundus autofluorescence (FAF), and multifocal electroretinography (mfERG). Dr. Shah and his colleagues used these three tests together with eye-tracking microperimetry (MP) as a substitute for Humphrey Visual Fields (HVF), which is a common visual field test used in the United States.

Courtesy Syed Mahmood Ali Shah, MBBS, MD
Early HCQ-induced eye damage as seen on microperimetry.

One difference between the 2011 guidelines and the 2016 revision is that “the baseline exam can now be performed relying [only] on the fundus exam, with additional imaging required only for abnormal patients,” Dr. Shah said. “Overall, the guidelines have not changed on how often and how much you follow up,” he added. “The change is that there is no need to do these tests at baseline unless changes of the fundus are present.” However, OCT has become more widely used in many offices, has been recognized as the most useful objective test, and should be performed if there are any abnormal findings of the fundus.

A total of 266 eyes were examined using these imaging methods and interpreted by experienced retina specialists. Overall, HCQ-related abnormalities were noted in 14 eyes (5%) using OCT, 18 (7%) using FAF, 27 (10%) using mfERG, and 20 (7%) using MP.

MP had the lowest discrepancy between the overall number of eyes with abnormalities detected (72 [27%] of 266) and the number of those abnormalities related to HCQ (20 [28%] of 72), followed by OCT (21% and 25%, respectively), FAF (19% and 35%), and mfERG (37% and 28%). Only four patients (3%) showed changes in all four tests suggestive of HCQ retinopathy.
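The rounded percentages can be cross-checked with a quick back-of-the-envelope calculation. In the sketch below, the total abnormal-eye counts for OCT, FAF, and mfERG are back-calculated from the rounded rates in the text (only MP’s 72 abnormal eyes and the HCQ-related counts are stated outright), so they are approximations rather than figures from the poster:

```python
# Sanity check of per-test rates among the 266 examined eyes.
# Abnormal-eye counts for OCT, FAF, and mfERG are inferred from the
# rounded percentages reported; MP's 72 is stated in the text.
total_eyes = 266

# test -> (eyes with any abnormality, eyes whose abnormality was HCQ-related)
tests = {
    "OCT":   (56, 14),
    "FAF":   (51, 18),
    "mfERG": (98, 27),
    "MP":    (72, 20),
}

for name, (abnormal, hcq_related) in tests.items():
    any_rate = round(100 * abnormal / total_eyes)    # % of all eyes flagged
    hcq_share = round(100 * hcq_related / abnormal)  # % of flags due to HCQ
    print(f"{name}: {any_rate}% flagged, {hcq_share}% of flags HCQ-related")
```

Run as-is, the MP line reproduces the 27% and 28% figures quoted above, and the other three tests reproduce their respective percentage pairs.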

Courtesy Syed Mahmood Ali Shah, MBBS, MD
Late HCQ-induced eye damage as seen on microperimetry of the same eye of the same patient.

In the absence of baseline data from the AAO recommended ancillary tests before the use of HCQ, “it may be difficult to interpret changes seen on these tests since most of the screenings are done by regular ophthalmologists who lack the equipment and experience with specialized testing such as mfERG, FAF, and OCT,” Dr. Shah and his coauthors noted. “We found a substantial number of cases with abnormalities unrelated to HCQ.”

Giving some practical advice, Dr. Shah noted that “before a patient starts treatment with HCQ, they should undergo a baseline ophthalmic assessment. Then if the patient complains of any vision changes, even if they have been taking the drug for less than 5 years, they should be reassessed.”

While repeat follow-up is, of course, necessary, he indicated that the frequency of screening for drug-related damage must strike a balance between risk and cost. “The American Academy of Ophthalmology currently recommends that a baseline fundus exam be performed shortly after starting HCQ. Ancillary OCT and visual fields shall only be performed if the fundus is abnormal at this baseline exam. However, since most retina specialists get OCT and visual field testing anyway it is wise to look at these as well,” he suggested. After 5 years on the drug, patients should be screened more regularly; at that point, ophthalmologists can decide whether this should be every 6 months or annually, with the latter recommended by the AAO guidelines for patients with no additional risk factors.


The study was supported by noncommercial grants. Dr. Shah had no conflicts of interest to disclose.


Article Source

AT THE EULAR 2016 CONGRESS


Vitals

Key clinical point: Several eye abnormalities can be mistaken for hydroxychloroquine-related eye toxicity, making specialist ophthalmic assessment paramount.

Major finding: Only four patients (3%) showed changes in all four tests suggestive of HCQ retinopathy.

Data source: An observational study of 135 consecutively treated patients with SLE screened for hydroxychloroquine-related retinopathy at an academic practice.

Disclosures: The study was supported by noncommercial grants. Dr. Shah had no conflicts of interest to disclose.

mtDNA level predicts IVF embryo viability

Article Type
Changed
Tue, 12/04/2018 - 15:52
Display Headline
mtDNA level predicts IVF embryo viability

HELSINKI – Mitochondrial DNA level appears to be a useful biomarker for in vitro fertilization embryo viability, according to findings from a blinded prospective non-selection study.

An analysis of 280 chromosomally normal blastocysts showed that 15 (5.4%) contained unusually high levels of mitochondrial DNA (mtDNA); the remaining blastocysts had normal or low mtDNA levels. Outcome data were available for 111 blastocyst transfers. Of these, 78 (70%) led to ongoing pregnancies, all involving blastocysts with normal or low mtDNA levels, while 8 (24%) of the 33 blastocysts that failed to implant had unusually high mtDNA levels, Elpida Fragouli, PhD, reported at the annual meeting of the European Society of Human Reproduction and Embryology.


Thus, the ongoing pregnancy rate for morphologically good, euploid blastocysts was 76% for those with normal/low mtDNA levels, compared with 0% for those with elevated mtDNA levels – a highly statistically significant difference. The overall pregnancy rate was 70%, said Dr. Fragouli of Reprogenetics UK and the University of Oxford.
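Those rates follow directly from the transfer counts reported above; a short sketch makes the arithmetic explicit (the count of 103 normal/low-mtDNA transfers is derived from the stated figures, not given in the article):

```python
# Reconstruct the reported pregnancy rates from the transfer counts.
transfers_with_outcome = 111
ongoing = 78               # all ongoing pregnancies had normal/low mtDNA
failed = transfers_with_outcome - ongoing    # 33 transfers failed to implant
high_mtdna_failed = 8      # failed transfers with unusually high mtDNA

overall_rate = round(100 * ongoing / transfers_with_outcome)       # 70%
# Every elevated-mtDNA transfer failed, so normal/low transfers = 111 - 8.
normal_low_transfers = transfers_with_outcome - high_mtdna_failed  # 103
normal_low_rate = round(100 * ongoing / normal_low_transfers)      # 76%
high_mtdna_rate = 0        # 0 of the 8 elevated-mtDNA transfers implanted

print(overall_rate, normal_low_rate, high_mtdna_rate)  # 70 76 0
```

The same counts also give the 24% figure: 8 of the 33 failed implantations involved elevated-mtDNA blastocysts.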

The blastocysts in the study were generated by 143 couples who underwent IVF in a single clinic. All blastocysts were biopsied and shown to be chromosomally normal using preimplantation genetic screening.

“The study demonstrates that mitochondrial DNA levels are highly predictive of an embryo’s implantation potential,” Dr. Fragouli said, noting that the “very robust” findings could potentially enhance embryo selection and improve IVF outcomes.

The methodology used in the study has been extensively validated, she said. However, a randomized clinical trial will be necessary to determine the true extent of any clinical benefit, she added, noting that research is also needed to improve understanding of the biology of mtDNA expansion.

The findings are of particular interest, because while it is well known that chromosomal abnormality in embryos is common and increases with age, and is the main cause of implantation failure, it has been less clear why about a third of euploid embryos fail to produce a pregnancy.

“The combination of chromosome analysis and mitochondrial assessment may now represent the most accurate and predictive measure of embryo viability with great potential for improving IVF outcome,” according to an ESHRE press release on the findings.

Levels of mtDNA can be measured quickly using polymerase chain reaction; next-generation sequencing can also be used, Dr. Fragouli noted. However, since aneuploidy remains the most common cause of embryo implantation failure, both mtDNA and chromosome testing would be necessary.

“Mitochondrial analysis does not replace [aneuploidy screening]. It is the combination of the two methods ... that is so powerful,” she said, noting that efforts are underway to develop an approach to assessing chromosome content and mtDNA simultaneously to reduce the extra cost.

The group has started offering mtDNA quantification clinically in the United States and has applied to the Human Fertilisation and Embryology Authority for a license to use the testing in the United Kingdom.

Reprogenetics provided funding for this study.

[email protected]

References

Meeting/Event
Author and Disclosure Information

Publications
Topics
Legacy Keywords
IVF, embryo viability, mitrochondrial DNA
Sections
Author and Disclosure Information

Author and Disclosure Information

Meeting/Event
Meeting/Event

HELSINKI – Mitochondrial DNA level appears to be a useful biomarker for in vitro fertilization embryo viability, according to findings from a blinded prospective non-selection study.

HELSINKI – Mitochondrial DNA level appears to be a useful biomarker for in vitro fertilization embryo viability, according to findings from a blinded prospective non-selection study.

An analysis of 280 chromosomally normal blastocysts showed that 15 (5.4%) contained unusually high levels of mitochondrial DNA (mtDNA), while the remaining blastocysts had normal or low mtDNA levels. Of the 111 blastocyst transfers for which outcome data were available, 78 (70%) led to ongoing pregnancies, and all of those involved blastocysts with normal or low mtDNA levels. In contrast, 8 of the 33 blastocysts that failed to implant (24%) had unusually high mtDNA levels, Elpida Fragouli, PhD, reported at the annual meeting of the European Society of Human Reproduction and Embryology.

Thus, the ongoing pregnancy rate for morphologically good, euploid blastocysts was 76% for those with normal/low mtDNA levels, compared with 0% for those with elevated mtDNA levels – a highly statistically significant difference. The overall pregnancy rate was 70%, said Dr. Fragouli of Reprogenetics UK and the University of Oxford.
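
The headline rates are consistent with the counts reported above; a quick arithmetic check, using only the figures quoted in this article (not the study's raw data):

```python
# Counts quoted in the article.
ongoing = 78     # ongoing pregnancies, all with normal/low mtDNA
failed = 33      # transfers that failed to implant
failed_high = 8  # failures with unusually elevated mtDNA
transfers = ongoing + failed  # 111 transfers with outcome data

# Transfers of normal/low-mtDNA blastocysts: every success plus the
# failures that did not have elevated mtDNA.
normal_low = ongoing + (failed - failed_high)  # 103

print(round(100 * ongoing / transfers))   # overall rate: 70
print(round(100 * ongoing / normal_low))  # normal/low mtDNA: 76
print(round(100 * 0 / failed_high))       # elevated mtDNA: 0
```

All three values match the percentages reported by Dr. Fragouli.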

The blastocysts in the study were generated by 143 couples who underwent IVF in a single clinic. All blastocysts were biopsied and shown to be chromosomally normal using preimplantation genetic screening.

“The study demonstrates that mitochondrial DNA levels are highly predictive of an embryo’s implantation potential,” Dr. Fragouli said, noting that the “very robust” findings could potentially enhance embryo selection and improve IVF outcomes.

The methodology used in the study has been extensively validated, she said. However, a randomized clinical trial will be necessary to determine the true extent of any clinical benefit, she added, noting that research is also needed to improve understanding of the biology of mtDNA expansion.

The findings are of particular interest because, while it is well known that chromosomal abnormality in embryos is common, increases with age, and is the main cause of implantation failure, it has been less clear why about a third of euploid embryos fail to produce a pregnancy.

“The combination of chromosome analysis and mitochondrial assessment may now represent the most accurate and predictive measure of embryo viability with great potential for improving IVF outcome,” according to an ESHRE press release on the findings.

Levels of mtDNA can be quickly measured using the polymerase chain reaction; next-generation sequencing can also be used, Dr. Fragouli noted. However, because aneuploidy remains the most common cause of embryo implantation failure, both mtDNA and chromosome testing would be necessary.
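
As an illustration of the kind of PCR-based quantification described (a generic 2^-ΔCt sketch, not the study's actual assay; the cycle-threshold values below are made up), relative mtDNA level is commonly estimated by comparing a mitochondrial amplicon's threshold cycle with that of a single-copy nuclear reference:

```python
def relative_mtdna(ct_mito: float, ct_nuclear: float) -> float:
    """Relative mtDNA quantity via the 2**(-dCt) method.

    Each PCR cycle roughly doubles the product, so crossing the
    fluorescence threshold one cycle earlier implies about twice
    the starting template.
    """
    delta_ct = ct_mito - ct_nuclear
    return 2.0 ** -delta_ct

# Hypothetical values: the mitochondrial target crosses threshold
# 3 cycles before the nuclear reference -> ~8-fold relative level.
print(relative_mtdna(ct_mito=17.0, ct_nuclear=20.0))  # 8.0
```

In practice such values would be normalized against a reference set of embryos before calling a level "elevated."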

“Mitochondrial analysis does not replace [aneuploidy screening]. It is the combination of the two methods ... that is so powerful,” she said, noting that efforts are underway to develop an approach to assessing chromosome content and mtDNA simultaneously to reduce the extra cost.

The group has started offering mtDNA quantification clinically in the United States and has applied to the Human Fertilisation and Embryology Authority for a license to use the testing in the United Kingdom.

Reprogenetics provided funding for this study.

[email protected]

Display Headline
mtDNA level predicts IVF embryo viability
Legacy Keywords
IVF, embryo viability, mitochondrial DNA
Article Source

AT ESHRE 2016

Vitals

Key clinical point: Mitochondrial DNA level may offer a way to assess embryo viability when doing in vitro fertilization.

Major finding: The ongoing pregnancy rate for euploid blastocysts was 76% for those with normal/low mtDNA levels, compared with 0% for those with elevated mtDNA levels.

Data source: A blinded prospective non-selection study of 280 blastocysts.

Disclosures: Reprogenetics provided funding for this study.

Pediatric Cancer Survivors at Increased Risk for Endocrine Abnormalities

Article Type
Changed
Thu, 06/15/2017 - 12:08
Display Headline
Pediatric Cancer Survivors at Increased Risk for Endocrine Abnormalities

Patients who survived pediatric-onset cancer are at increased risk for developing or experiencing endocrine abnormalities.

Risk was significantly higher in survivors who underwent high-risk therapeutic exposures than in survivors not so exposed. Moreover, the incidence and prevalence of endocrine abnormalities increased across the lifespan of survivors, reported Sogol Mostoufi-Moab, MD, of the University of Pennsylvania, Philadelphia, and her associates (J Clin Oncol. 2016 Jul. doi: 10.1200/JCO.2016.66.6545).

A total of 14,290 patients met the study’s eligibility requirements, which included a diagnosis of cancer before age 21 years and 5-year survival following diagnosis. Cancer diagnoses included leukemia, Hodgkin and non-Hodgkin lymphoma, Wilms tumor, neuroblastoma, sarcoma, bone malignancy, and central nervous system malignancy. Baseline and follow-up questionnaires collected endocrine-related outcomes of interest, demographic information, and medical histories for both cancer survivors and their siblings (n = 4,031). For survivors, median age at diagnosis was 6 years and median age at last follow-up was 32 years. For siblings, median age at last follow-up was 34 years.

Overall, 44% of cancer survivors had at least one endocrinopathy, 16.7% had at least two, and 6.6% had three or more. Survivors of Hodgkin lymphoma had the highest frequency of endocrine abnormality (60.1%), followed by survivors of CNS malignancy (54%), leukemia (45.6%), sarcoma (41.3%), non-Hodgkin lymphoma (39.7%), and neuroblastoma (31.9%).

Specifically, thyroid disorders were more frequent among cancer survivors than among their siblings: underactive thyroid (hazard ratio, 2.2; 95% confidence interval, 1.8-2.7), overactive thyroid (HR, 2.4; 95% CI, 1.7-3.3), thyroid nodules (HR, 3.9; 95% CI, 2.9-5.4), and thyroid cancer (HR, 2.5; 95% CI, 1.2-5.3).
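
As a rough plausibility check of figures like these (not part of the study itself), a hazard ratio's point estimate should sit close to the geometric mean of its 95% CI bounds, because the interval is symmetric on the log scale; a minimal sketch:

```python
import math

def log_midpoint(lo: float, hi: float) -> float:
    """Geometric mean of the CI bounds, i.e. where the point estimate
    should fall when the interval is symmetric on the log scale."""
    return math.sqrt(lo * hi)

# (outcome, reported HR, CI lower, CI upper) for the thyroid outcomes.
thyroid = [
    ("underactive thyroid", 2.2, 1.8, 2.7),
    ("overactive thyroid", 2.4, 1.7, 3.3),
    ("thyroid nodules", 3.9, 2.9, 5.4),
    ("thyroid cancer", 2.5, 1.2, 5.3),
]
for name, hr, lo, hi in thyroid:
    # Bounds are rounded to one decimal, so allow a small tolerance.
    print(name, abs(log_midpoint(lo, hi) - hr) < 0.15)  # all True
```

All four reported intervals pass, which is what one would expect from correctly transcribed Cox-model estimates.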

Compared with their siblings, cancer survivors showed an increased risk of developing diabetes (relative risk, 1.8; 95% CI, 1.4-2.3).

Among survivors, those exposed to high-risk therapies (defined by the Children’s Oncology Group’s Long-Term Follow-Up Guidelines for Survivors of Childhood, Adolescent, and Young Adult Cancers) were at greater risk of developing primary hypothyroidism (HR, 6.6; 95% CI, 5.6-7.8), central hypothyroidism (HR, 3.9; 95% CI, 2.9-5.2), an overactive thyroid (HR, 1.8; 95% CI, 1.2-2.8), thyroid nodules (HR, 6.3; 95% CI, 5.2-7.5), and thyroid cancer (HR, 9.2; 95% CI, 6.2-13.7) than survivors not so exposed.

The National Cancer Institute, the Cancer Center Support Grant, and the American Lebanese Syrian Associated Charities of St. Jude Children’s Research Hospital funded the study. Dr. Mostoufi-Moab and nine other investigators had no disclosures to report. Two investigators reported receiving financial compensation or honoraria from Merck or Sandoz.

Author and Disclosure Information

Jessica Craig, Family Practice News Digital Network

Article Source

FROM THE JOURNAL OF CLINICAL ONCOLOGY

Vitals

Key clinical point: Survivors of pediatric-onset cancer are at increased risk for developing endocrine abnormalities.

Major finding: Overall, 44% of childhood cancer survivors had at least one endocrinopathy. Survivors of Hodgkin lymphoma had the highest frequency of endocrine abnormality (60.1%) followed by survivors of CNS malignancy (54%), leukemia (45.6%), sarcoma (41.3%), non-Hodgkin lymphoma (39.7%), and neuroblastoma (31.9%).

Data source: A multi-institutional retrospective study of 14,290 men and women who survived pediatric cancer.

Disclosures: The National Cancer Institute, the Cancer Center Support Grant, and the American Lebanese Syrian Associated Charities of St. Jude Children’s Research Hospital funded the study. Dr. Mostoufi-Moab and nine other investigators had no disclosures to report. Two investigators reported receiving financial compensation or honoraria from Merck or Sandoz.