Pregnancy delays onset of multiple sclerosis symptoms
Key clinical point: Pregnancy and childbirth seem to delay the onset of multiple sclerosis by more than 3 years. However, having more pregnancies is not associated with later onset.
Major finding: Onset of clinically isolated syndrome (CIS) was later in women with previous pregnancies and childbirths vs. those without (hazard ratio, 0.68; P < .001), with a median delay of 3.3 years. No association was seen between a higher number of pregnancies and childbirths and delay in CIS onset.
Study details: The findings are based on a multicenter cohort study of 2,557 women with CIS (mean age at CIS onset, 31.5 years) from the MSBase Registry.
Disclosures: The study was supported by a postgraduate scholarship and Ian Ballard Travel Award from MS Research Australia, an Australian Government Research Training Program Scholarship, and a grant from the National Health and Medical Research Council. The lead author reported receiving grants from MS Research Australia during the conduct of the study; grants, personal fees, and nonfinancial support from Biogen; grants and personal fees from Merck Serono; personal fees from Teva and Novartis; and nonfinancial support from Roche and Genzyme-Sanofi outside the submitted work.
Citation: Nguyen AL et al. JAMA Neurol. 2020 Sep 14. doi: 10.1001/jamaneurol.2020.3324.
Osteoporosis underdiagnosed in older men with fracture
Osteoporosis is frequently underdiagnosed and undertreated in men before and even after they have experienced a fracture, according to research presented at the virtual annual meeting of the American College of Rheumatology.
“This is an important public health concern,” as fractures contribute significantly to morbidity and mortality, said Jeffrey Curtis, MD, MS, MPH, professor of medicine in the division of clinical immunology and rheumatology at the University of Alabama at Birmingham.
Men are often overlooked, he said, “because it’s misconstrued as a disease that mainly, if not only, affects Caucasian women,” despite the fact that 20%-25% of fractures occur in men.
Emerging evidence suggests that men who have bone fractures have worse outcomes than women, Dr. Curtis said.
Guidelines lacking
Consistent guidelines for osteoporosis screening among men are also lacking, leading to ambiguity and increased disease burden.
Researchers studied records for a 5% random sample of male Medicare fee-for-service beneficiaries (n = 9,876) aged at least 65 years with a closed fragility fracture between January 2010 and September 2014. Average age for the men with fractures was 77.9 years, and the most common sites of the fracture were the spine, hip, and ankle.
They looked back to see whether these men had been effectively screened and treated.
Very few had.
“We found that 92.8% of them did not have any diagnosis or treatment of osteoporosis at baseline,” Dr. Curtis said. On top of that, less than 6% of the men had undergone any dual-energy x-ray absorptiometry (DEXA) or bone mineral testing in the 2 years prior to their fracture.
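For a sense of scale, a rough back-of-envelope calculation (illustrative only; the presentation reported percentages rather than counts, and this sketch simply applies those percentages to the 9,876-man sample) translates the figures into approximate headcounts:

```python
# Illustrative arithmetic only: exact counts were not reported in the study.
sample_size = 9_876

no_dx_or_tx = round(0.928 * sample_size)  # 92.8% had no osteoporosis diagnosis or treatment at baseline
max_dexa = round(0.06 * sample_size)      # "less than 6%" had DEXA or bone mineral testing before fracture

print(f"No baseline diagnosis or treatment: ~{no_dx_or_tx:,} of {sample_size:,} men")
print(f"Screened in the 2 years before fracture: fewer than ~{max_dexa:,} men")
```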
Even men who had high-risk factors for falls, such as those using beta-blockers, mobility impairment, or a history of opioid use, were unlikely to be screened, he said.
Dr. Curtis’s data show there was actually a decline in DEXA scans from 2012 to 2014, and that the decline was particularly steep in men aged 75 years and older, who are more likely to be at risk for fracture.
In addition to underscreening and undertreating before the fracture, Dr. Curtis said, “The treatment patterns after the fracture were not much better.” In the year after the fracture, “only about 10% of these men had BMD [bone mineral density] testing. Only 9% were treated with an osteoporosis medication.”
“Importantly, about 7% of the men in this large cohort went on to have one or more fractures in the next year,” he added.
Reasons for undertreatment
Reasons for the poor rates of diagnosis and treatment may begin with patients not having symptoms. Therefore, they aren’t coming into doctors’ offices asking to be screened. “Even if they break bones, they may not know enough to ask how to prevent the next fracture,” Dr. Curtis said.
There’s a financial obstacle as well, Dr. Curtis explained. “U.S. legislation that provides population screening for Medicare patients really, for men, is quite dissimilar to the near-universal coverage for women. So many clinicians worry they won’t get reimbursed if they order DEXA in men for screening.”
Additionally, postfracture quality-of-care guidelines that are reimbursed as part of the Medicare Access and CHIP Reauthorization Act of 2015 and the Merit-based Incentive Payment System program specifically exclude men, he noted.
Better management of male osteoporosis, including early identification of at-risk individuals, is clearly warranted, he said, so they can be screened and put on effective therapy.
Sonali Khandelwal, MD, a rheumatologist with Rush University Medical Center, Chicago, who was not part of the research, agreed.
She said in an interview that part of the problem is that diagnosis and treatment could come from a variety of specialists – endocrinologists, rheumatologists, orthopedists, and primary care physicians – and each may think it falls in another’s realm.
At Rush and some other sites nationally, she said, an alert is registered in electronic medical records flagging any patient who may need bone density screening based on age, medications, or history.
Rush University also has a fracture liaison service under which everyone hospitalized there with a history of fracture, or admitted with a fracture, gets followed up with screening and treatment, “to capture those patients who may not have come through the system otherwise.”
She said guidelines have called for DEXA screening for men at age 70, but clinical screening should start younger – as young as 50 – for patients with conditions such as lupus, rheumatoid arthritis, or hypogonadism, or for those on chronic steroids.
Dr. Khandelwal said that, even when an insurance company doesn’t typically cover bone density screening for men, physicians can often make a case for reimbursement if the patient has a history of falls or fractures.
“In the long run, preventing a fracture is saving so much more money than when you get a fracture and end up in a hospital and have to go to a nursing home,” she said.
Dr. Curtis reported relationships with AbbVie, Amgen, Bristol-Myers Squibb, Corrona, Janssen, Lilly, Myriad, Pfizer, Regeneron, Roche, UCB, Gilead Sciences, and Sanofi. Dr. Khandelwal reported no relevant financial relationships.
A version of this article originally appeared on Medscape.com.
Treatment sequence with romosozumab influences osteoporosis outcomes
Timing is everything when it comes to the use of the anabolic agent romosozumab (Evenity) for the treatment of advanced osteoporosis, a review of clinical trials suggests.
In four studies with treatment sequences in which romosozumab was administered either before or following the use of an antiresorptive agent, initial treatment with 1 year of romosozumab produced substantial bone mineral density (BMD) gains in the total hip and lumbar spine.
Transition from romosozumab to a potent antiresorptive agent, either alendronate or denosumab (Prolia), augmented the initial gains, reported Felicia Cosman, MD, professor of clinical medicine at Columbia University, New York.
Romosozumab was the third anabolic agent to be approved, following teriparatide in 2002 and abaloparatide (Tymlos) in 2017, both of which have been shown to produce rapid reductions in fracture risk and large improvements in BMD when they were administered up front, followed by an antiresorptive agent.
“But since romosozumab has a very different mechanism of action compared to both teriparatide and abaloparatide, we didn’t know if treatment sequence would be as important for this agent as it was for teriparatide,” she said during a press briefing prior to her presentation of the data in an oral abstract session at the virtual annual meeting of the American College of Rheumatology.
Two-for-one
Romosozumab is unique in that it both increases bone formation and decreases bone resorption, and has been shown in treatment-naive postmenopausal women with osteoporosis to significantly improve BMD and reduce fracture risk, compared with either placebo or alendronate. Romosozumab has also been studied as sequential therapy in patients treated initially with either alendronate or denosumab.
To see whether treatment sequence could have differential effects on clinical outcomes for patients with osteoporosis, Dr. Cosman and colleagues looked at results from four clinical trials, using levels of bone turnover markers (procollagen type I N-terminal propeptide [PINP] and beta-isomer of the C-terminal telopeptide of type I collagen [beta-CTX]) and BMD gains in the total hip and spine as outcomes.
The two trials of romosozumab in treatment-naive women were the ARCH trial, which compared romosozumab with alendronate in a 1-year double-blind phase followed by 1 year of open-label alendronate, and the FRAME trial, in which romosozumab was compared with placebo in a 1-year double-blind phase followed by 1 year of open-label denosumab.
The two trials of romosozumab in women treated initially with antiresorptive agents were the STRUCTURE trial, in which patients on oral bisphosphonates for at least 3 years or alendronate 70 mg weekly for 1 year were randomized to receive either romosozumab or teriparatide, and a phase 2 trial (NCT00896532) that included a 24-month romosozumab or placebo treatment phase, followed by rerandomization to a 12-month extension phase with denosumab or placebo, a 12-month retreatment phase with romosozumab, and a 24-month follow-on phase with zoledronic acid or no intervention.
Total hip BMD gains
In the ARCH trial, total hip BMD increased 6.2% with 1 year of romosozumab, and a cumulative total of 7.1% with the 2-year romosozumab/alendronate sequence. In the FRAME trial, patients gained 6.8% in total hip BMD after 1 year of romosozumab and a total of 8.8% after 2 years of romosozumab followed by denosumab.
In contrast, in the STRUCTURE trial, patients treated for 1 year or longer with alendronate and then with 1 year of romosozumab had a 2.9% BMD gain in the total hip. In the phase 2 trial, 1 year of romosozumab following 1 year of denosumab yielded a 0.9% BMD gain, for a total gain of 3.8% with the denosumab sequence.
Lumbar spine BMD gains
In ARCH, lumbar spine BMD increased 13.7% with 1 year of romosozumab and a total of 15.2% with the 2-year sequence of romosozumab followed by alendronate. Similarly, in FRAME, patients gained 13.3% in BMD after a year of romosozumab and a total of 17.6% by the end of the 2-year romosozumab/denosumab sequence.
In contrast, in STRUCTURE, patients who had previously been on alendronate for at least 1 year had a gain of 9.8% after 1 year of romosozumab, and in the phase 2 study, patients who had been on denosumab for 1 year had an increase in lumbar spine BMD of 5.3% after 1 year on romosozumab, and a total gain of 11.5% at the end of the 2-year sequence.
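To make the contrast between the two treatment orders easier to scan, here is a minimal sketch that tabulates the BMD gains quoted above by sequence; the grouping and variable names are illustrative, and the numbers are simply those reported for the four trials:

```python
# BMD gains (%) reported in the four trials, grouped by treatment sequence.
# Illustrative summary only; values are those quoted in the text above.
bmd_gains = {
    "romosozumab first": {
        "ARCH (romo -> alendronate)": {"hip_yr1": 6.2, "hip_total": 7.1, "spine_yr1": 13.7, "spine_total": 15.2},
        "FRAME (romo -> denosumab)":  {"hip_yr1": 6.8, "hip_total": 8.8, "spine_yr1": 13.3, "spine_total": 17.6},
    },
    "antiresorptive first": {
        "STRUCTURE (alendronate -> romo)": {"hip_romo_yr": 2.9, "spine_romo_yr": 9.8},
        "Phase 2 (denosumab -> romo)":     {"hip_romo_yr": 0.9, "hip_total": 3.8,
                                            "spine_romo_yr": 5.3, "spine_total": 11.5},
    },
}

for sequence, trials in bmd_gains.items():
    print(f"\n{sequence}")
    for trial, gains in trials.items():
        summary = ", ".join(f"{site}: {value}%" for site, value in gains.items())
        print(f"  {trial}: {summary}")
```

Printed side by side, the gains mirror the article’s point that BMD increases were larger when romosozumab was given before, rather than after, an antiresorptive agent.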
Serum PINP and beta-CTX
Looking at the markers of bone turnover, the investigators saw that, in both ARCH and FRAME, PINP peaked at more than 80% above baseline at 1 month and then declined steadily beyond 1 year. The beta-CTX nadir was 40%-50% below baseline at 1 year.
At the end of year 2, the PINP nadir was –67% with follow-on alendronate and –69% with follow-on denosumab, and the beta-CTX nadirs were –72% and –92%, respectively.
In the two trials where romosozumab was the follow-on therapy, however, the trends were distinctly different. In STRUCTURE, for example, PINP peaked at 141% of baseline at 1 month, and then returned toward baseline, whereas beta-CTX remained largely unchanged.
In the phase 2 trial, PINP peaked at 28% above baseline at 9 months and then declined only slightly, and beta-CTX peaked at 211% at the end of 1 year of romosozumab.
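Because the marker results above are quoted in two conventions, percent of baseline and percent change from baseline, a small conversion helper may keep them straight; this is an illustrative sketch, and assigning each quoted figure to a convention is an interpretation rather than something stated explicitly:

```python
# Convert between "percent of baseline" and "percent change from baseline".
# E.g., 141% of baseline = +41% change; a +28% change = 128% of baseline.
def pct_of_baseline_to_change(pct_of_baseline: float) -> float:
    return pct_of_baseline - 100.0

def change_to_pct_of_baseline(pct_change: float) -> float:
    return pct_change + 100.0

print(pct_of_baseline_to_change(141))  # STRUCTURE PINP peak: +41% vs. baseline
print(change_to_pct_of_baseline(28))   # phase 2 PINP peak: 128% of baseline
print(change_to_pct_of_baseline(-67))  # year-2 PINP nadir with alendronate: 33% of baseline
```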
Best used up front
“This study is important, because it suggests that for the three bone-building drugs that the best effects will really be attained on bone strength if the agents are used as initial therapy in very-high-risk patients. Those are people who have sustained fractures within the preceding 2 years, who had multiple fractures at any point in their adulthood, and who present with very low BMD, particularly if they have any associated clinical risk factors such as family history or other underlying diseases or medications that have detrimental effects on bone,” Dr. Cosman said at the briefing.
Marcy Bolster, MD, of the division of rheumatology, allergy, and immunology at Massachusetts General Hospital and associate professor of medicine at Harvard Medical School, both in Boston, who was not involved in the research, commented that the study provides important information for clinicians who treat patients with osteoporosis.
“We have an increasing number of medications available for use in the treatment of patients with osteoporosis, and as we consider the importance of reducing fracture risk, the duration of therapy, the timing of a bisphosphonate holiday, it is essential that we consider any advantages to the order or sequence of our medications,” she said when asked for comment.
“This study provides evidence supporting the concept of the ‘anabolic window’ in which there is a demonstrated advantage in treating patients with an anabolic agent prior to treatment with an antiresorptive agent, and while gains in bone mineral density were achieved with either order of medication use, the gains were more dramatic with treatment with romosozumab as the first agent,” she added.
Dr. Bolster also noted it will be important to demonstrate reduction in fracture risk as well as gain in BMD.
The study was sponsored by Amgen, Astellas, and UCB. Dr. Cosman disclosed grants/research support from Amgen, and consulting fees and speaker activities for Amgen and Radius Health. Dr. Bolster disclosed relationships with AbbVie, Corbus, Cumberland, Gilead, Johnson & Johnson, and Pfizer.
SOURCE: Cosman F et al. Arthritis Rheumatol. 2020;72(suppl 10), Abstract 1973.
FROM ACR 2020
The case for a new skin typing system
Even though the popular Fitzpatrick classification system added skin types V and VI in 1988 to the first iteration established in 1975, it was never intended for categorizing skin color, according to Karen C. Kagha, MD.
“This topic is going to become more relevant in our clinical practices, especially when you look at the current population trends in the U.S.,” Dr. Kagha, a dermatologist and cosmetic and laser fellow in the department of dermatology at Massachusetts General Hospital and a research fellow at the Wellman Center for Photomedicine, both in Boston, said during a virtual course on laser and aesthetic skin therapy. “Minority groups continue to increase. According to the U.S. Census Bureau, by the year 2050, we can expect that the majority of the population will be of non-European descent.”
The original intent of the Fitzpatrick Skin Typing (FST) system was to establish a minimal erythema dose, or likelihood to burn, for patients receiving phototherapy, she continued. However, a recently published survey of 141 board-certified dermatologists and trainees found that 31% of respondents said they used the Fitzpatrick Skin Typing System to describe the patient’s race or ethnicity, 47% used it to describe the patient’s constitutive skin color, and 22% used it in both scenarios.
“There also have been inconsistencies reported with the Fitzpatrick Skin Typing System,” Dr. Kagha said during the meeting, which was sponsored by Harvard Medical School, Massachusetts General Hospital, and the Wellman Center for Photomedicine. “Some studies show that there are inconsistent correlations between self-reported Fitzpatrick Skin Type and burn risk, and between self-reported FST and physician-reported FST. This means that some patients self-identify differently from how their physicians identify them.” There have also been some inconsistent correlations between race and objective measures of pigmentation, and also between race and self-reported FST.
Several classification systems have surfaced to try to bridge some of these gaps, including the Fanous classification, the Roberts Skin Type Classification System, and the Lancer Ethnicity Scale. Some of these have focused more on expanding the racial and ethnic categories to help predict response to procedures, she said, while others have focused more on hyperpigmentation, photoaging, and risk of scarring. “Others have suggested having different color matching systems to expand on a number of color-matched hues with regard to hyperpigmentation or race,” Dr. Kagha added. “In spite of all these efforts, it seems that the FST system remains the most widely used classification system in dermatology. I think that’s likely because we haven’t established a new consensus on a different system to use.”
She went on to postulate that there is likely “an infinite number of skin colors that are also impacted by geographic and cultural factors. Perhaps we should restructure how we think about skin typing. We need to establish a new consensus on skin typing, one that respects the variability in skin color but also one that’s clear, concise, objective, practical, and can be universally accepted.”
Dr. Kagha concluded her remarks by encouraging dermatologists to become more comfortable with treating all skin types. “This is going to be in line with the current population trends in the U.S., and also in line with the patients that we serve. Finally, I think we as physicians are in a unique position. Our patients’ frustrations and unsolved mysteries can drive our passion and our patient-centered innovation. For me, a common theme and a common source of frustration that I’ve seen in patients with increased melanin in their skin is figuring out how to effectively remove or prevent unwanted marks or unwanted pigment without disturbing the baseline pigment.”
Dr. Kagha reported having no financial disclosures.
FROM A LASER & AESTHETIC SKIN THERAPY COURSE
Genomic analysis reveals insights into pathogenesis of neuroblastoma
Insights into the genetic drivers of neuroblastoma have been identified based on data from whole-genome, whole-exome, and/or transcriptome sequencing of tumor samples.
“The comprehensive genome-wide analysis performed here allowed us to discover age-associated alterations in MYCN, TERT, PTPRD, and Ras pathway alterations, which, together with ATRX, represent the majority of common driver gene alterations in neuroblastoma,” wrote study author Samuel W. Brady, PhD, of St. Jude Children’s Research Hospital in Memphis, Tenn., and colleagues.
The group’s findings were published in Nature Communications.
The researchers integrated and analyzed data from 702 neuroblastomas encompassing all age and risk categories, with the goal of identifying rare driver events and age-related molecular aberrations. Among the samples, 23 were from patients who had relapsed.
The researchers found that 40% of samples had somatic alterations in known driver genes, with the most common alterations being MYCN (19%; primarily amplification), TERT (17%; structural variations [SVs]), SHANK2 (13%; SVs), PTPRD (11%; SVs and focal deletions), ALK (10%; single nucleotide variants [SNVs] and SVs), and ATRX (8%; multiple mutation types).
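As a rough guide to absolute numbers, the reported frequencies can be applied to the 702-tumor cohort; the counts below are approximations derived from the percentages quoted above, not figures taken from the paper:

```python
# Approximate counts implied by the reported alteration frequencies in 702 neuroblastomas.
# Illustrative only: the study reports percentages, and the rounding here is ours.
n_tumors = 702
frequencies = {
    "any known driver gene": 0.40,
    "MYCN": 0.19,
    "TERT": 0.17,
    "SHANK2": 0.13,
    "PTPRD": 0.11,
    "ALK": 0.10,
    "ATRX": 0.08,
}

for gene, freq in frequencies.items():
    print(f"{gene}: ~{round(freq * n_tumors)} of {n_tumors} tumors ({freq:.0%})")
```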
MYCN and TERT alterations were more common in younger children (median age of 2.3 years and 3.8 years, respectively), while ATRX alterations were more frequently seen in older patients (median age of 5.6 years).
“These findings suggest that the sympathetic nervous system, the tissue from which neuroblastoma arises, is susceptible to different oncogenic insults at different times during development, which could be explored in future investigations using animal models,” the researchers wrote.
Furthermore, they found evidence to suggest the COSMIC mutational signature 18 is the most common cause of driver SNVs in neuroblastoma, including most Ras-activating and ALK variants.
Signature 18 was enriched in neuroblastomas with increased expression of mitochondrial ribosome and electron transport–associated genes, 17q gain, and MYCN amplification.
“[T]his mutagenic process, which is caused by ROS [reactive oxygen species] in other settings (though not proven in neuroblastoma), may promote evolution and heterogeneity, as many driver SNVs, such as ALK mutations, are later events in neuroblastoma,” the researchers explained.
Based on these findings, the authors concluded that neuroblastomas with 17q gain may be amenable to precision medicines, possibly through targeting altered mitochondrial function.
“[Our] findings will identify patients who might be eligible for targeted therapy and those that may be at higher risk based on a combination of genetic alterations detected by these genome-wide sequencing methods,” commented study author Jinghui Zhang, PhD, of St. Jude Children’s Research Hospital.
The study was supported by grants from the National Cancer Institute and by the American Lebanese Syrian Associated Charities of St. Jude Children’s Research Hospital. One author disclosed financial affiliations with Y-mabs Therapeutics, Abpro-Labs, Eureka Therapeutics, and Biotec Pharmacon.
SOURCE: Brady SW et al. Nat Commun. 2020 Oct 14. doi: 10.1038/s41467-020-18987-4.
FROM NATURE COMMUNICATIONS
‘Disordered eating’ drops after teens undergo bariatric surgery
Kristina M. Decker, PhD, a postdoctoral fellow at Cincinnati Children’s Hospital Medical Center, presented these findings during the virtual ObesityWeek 2020.
Dr. Decker and associates examined rates of disordered eating in more than 200 adolescents (aged 13-18 years) who were severely obese, of whom 141 underwent bariatric surgery and the remainder did not.
At baseline (presurgery), the teens in both groups had rates of disordered eating ranging from 11% to 50%, with higher rates in those who went on to have bariatric surgery.
Six years later, rates of disordered eating were much lower in those who had bariatric surgery.
The data nevertheless “underscore that young adults with persistent severe obesity are at high risk for poor health and well-being,” Dr. Decker said in an interview.
“This means disordered eating behaviors should be closely monitored” in all such patients, both those who undergo surgery and those who don’t, she stressed.
Robust findings because of long follow-up and controls
The findings are not unexpected, based on the adult bariatric literature, but are “novel because of the age of the patients,” senior author Margaret H. Zeller, PhD, of Cincinnati Children’s Hospital Medical Center and professor at the University of Cincinnati, added.
In a comment, psychologist Kajsa Järvholm, PhD, of the Childhood Obesity Unit at Skåne University Hospital, Malmö, Sweden, who has published related work, said that this is “a needed study.”
Notably, it had “long-term follow-up and a control group,” and it “confirms that adolescents are in better control of their eating after surgery.”
However, an important additional takeaway for clinicians is that “disordered eating is associated with other mental health problems and self-worth. Clinicians treating obesity must address problems related to eating disorders to improve outcomes and well-being,” she stressed.
How does bariatric surgery impact overeating and binge eating in teens?
“For teens with severe obesity, metabolic and bariatric surgery is the most effective treatment for improved cardiometabolic functioning, weight loss, and improved quality of life,” Dr. Decker stressed.
However, pre- and postsurgical disordered eating behaviors have been associated with a lower percentage change in body mass index (BMI), although this has not been well studied.
To investigate how disordered eating is affected by bariatric surgery in adolescents with severe obesity, researchers used data from Teen-LABS, which enrolled 242 participants aged 19 years and under who mainly underwent Roux-en-Y gastric bypass (67%) or sleeve gastrectomy (28%) from 2007 to 2012 at five adolescent bariatric surgery centers.
The current analysis examined data from 141 participants in Teen-LABS who underwent bariatric surgery at a mean age of 16.8 years. Mean BMI was 51.5, most were girls (80%), and they had diverse race/ethnicity (66% were White).
Researchers also identified a control group of 83 adolescents of a similar age and gender who had diverse race/ethnicity (54% White) and a mean BMI of 46.9.
At year 6, data were available for 123 young adults in the surgery group (who by then had a mean BMI of 39.7) and 63 young adults in the nonsurgery group (who had a mean BMI of 52.6).
At baseline and year 6, participants replied to questionnaires that identified three disordered eating behaviors: continuous eating (eating in an unplanned and repetitious way between meals and snacks), objective overeating (eating a “large” amount of food without loss of control), and objective binge eating (eating a “large” amount of food with loss of control).
At baseline, rates of continuous eating, overeating, and binge eating were higher in the surgical group (50%, 40%, and 30%, respectively) than the nonsurgical group (40%, 22%, and 11%, respectively).
Six years later, when participants were aged 19-24 years, rates of continuous eating, overeating, and binge eating had declined in the surgical group (to 17%, 5%, and 1%, respectively). In the nonsurgical group, only continuous eating and overeating declined (to 24% and 7%, respectively), and binge eating increased slightly (to 13%).
Disordered eating associated with low self-worth, anxiety, and depression
In young adulthood in both groups, disordered eating was associated with lower self-worth. In the surgical group, it was also associated with lower weight-related quality of life, and in the nonsurgical group, it was also associated with anxiety and/or depression.
“The current findings cannot tell us whether disordered eating is a direct result or caused by anxiety, depression, low self-worth, or poor quality of life,” Dr. Decker said.
“These findings do give us insight about what other areas of clinical concern might present together [in] young adults (e.g., disordered eating, low self-esteem).”
Bariatric surgery affects the amount of food people can eat at one time, she noted in reply to a question from the audience. If people eat too much at a time they can experience vomiting, dumping syndrome (where certain food is “dumped” into the small intestine without being digested, causing nausea and vomiting), and plugging (a sense of food becoming stuck).
The home environment and transition to adulthood might impact disordered eating in young adults, she said in reply to another question, but these issues were not examined in this study.
A version of this article originally appeared on Medscape.com.
Higher cardiovascular risks in Kawasaki disease persist 10-plus years
Risks are highest in first year.
Survivors of Kawasaki disease remain at a higher long-term risk for cardiovascular events into young adulthood, including myocardial infarction, compared to people without the disease, new evidence reveals. The elevated risks emerged in survivors both with and without cardiovascular involvement at the time of initial diagnosis.
Overall risk of cardiovascular events was highest in the first year following Kawasaki disease diagnosis, and about 10 times greater than in healthy children, Cal Robinson, MD, said during a press conference at the virtual annual meeting of the American College of Rheumatology.
“The risk gradually decreased over time. However, even 10 years after diagnosis of their illness, they still had a 39% higher risk,” said study author Dr. Robinson, a PGY4 pediatric nephrology fellow at The Hospital for Sick Children in Toronto.
Dr. Robinson also put the numbers in perspective. “We fully acknowledged these are very rare events in children, especially healthy children, which is why we needed such a large cohort to study this. Interpret the numbers cautiously.”
In terms of patient and family counseling, “I would say children with Kawasaki disease have a higher risk of myocardial infarction, but the absolute risk is still low,” he added. For example, 16 Kawasaki disease survivors experienced a heart attack during follow-up, or 0.4% of the affected study population, compared to a rate of 0.1% among matched controls.
“These families are often very frightened after the initial Kawasaki disease diagnosis,” Dr. Robinson said. “We have to balance some discussion with what we know about Kawasaki disease without overly scaring or terrifying these families, who are already anxious.”
To quantify the incidence and timing of cardiovascular events and cardiac disease following diagnosis, Dr. Robinson and colleagues assessed large databases representing approximately 3 million children. They focused on children hospitalized with a Kawasaki disease diagnosis between 1995 and 2018. These children had a median length of stay of 3 days, and 2.5% were admitted to critical care. The investigators matched this population 1:100 to unaffected children in Ontario.
Follow-up was up to 24 years (median, 11 years) in this retrospective, population-based cohort study.
Risks raised over a decade and beyond
Compared to matched controls, Kawasaki disease survivors had a higher risk for a cardiac event in the first year following diagnosis (adjusted hazard ratio, 11.65; 95% confidence interval, 10.34-13.13). The 1- to 5-year risk was lower (aHR, 3.35), a trend that continued between 5 and 10 years (aHR, 1.87) as well as after more than 10 years (aHR, 1.39).
The risk of major adverse cardiac events (MACE, a composite of myocardial infarction, stroke, or cardiovascular death) was likewise highest in the first year after diagnosis (aHR, 3.27), followed by a 51% greater risk at 1-5 years, a 113% increased risk at 5-10 years, and a 17% elevated risk after 10 years.
The investigators compared the 144 Kawasaki disease survivors who experienced a coronary artery aneurysm (CAA) within 90 days of hospital admission to the 4,453 others who did not have a CAA. The risk for a composite cardiovascular event was elevated at each time point among those with a history of CAA, especially in the first year. The adjusted HR was 33.12 in the CAA group versus 10.44 in the non-CAA group.
“The most interesting finding of this study was that children with Kawasaki syndrome are at higher risk for composite cardiovascular events and major adverse cardiac events even if they were not diagnosed with coronary artery aneurysm,” session comoderator Shervin Assassi, MD, professor of medicine and director of the division of rheumatology at the University of Texas Health Science Center at Houston, said when asked to comment.
Dr. Robinson and colleagues also looked at outcomes based on presence or absence of coronary involvement at the time of Kawasaki disease diagnosis. For example, among those with initial coronary involvement, 15% later experienced a cardiovascular event and 10% experienced a major cardiovascular event.
“However, we were specifically interested in looking at children without initial coronary involvement. In this group, we also found these children were at increased risk for cardiovascular events compared to children without Kawasaki disease,” Dr. Robinson said. He said the distinction is important because approximately 95% of children diagnosed with Kawasaki disease do not feature initial coronary involvement.
In terms of clinical care, “our data provides an early signal that Kawasaki disease survivors – including those without initial coronary involvement – may be at higher risk of cardiovascular events into early adulthood.”
A call for closer monitoring
“Based on our results, we find that Kawasaki disease survivors may benefit from additional follow-up and surveillance for cardiovascular disease risk factors, such as obesity, high blood pressure, and high cholesterol,” Dr. Robinson said. Early identification of heightened risk could allow physicians to more closely monitor this subgroup and emphasize potentially beneficial lifestyle modifications, including increasing physical activity, implementing a heart healthy diet, and avoiding smoking.
Mortality was not significantly different between groups. “Despite the risk of cardiac events we found, death was uncommon,” Dr. Robinson said. Among children with Kawasaki disease, 1 in 500 died during follow-up, so “the risk of death was actually lower than for children without Kawasaki disease.”
Similar findings of lower mortality have been reported in research out of Japan, he added during a plenary presentation at ACR 2020. Future research is warranted to evaluate this finding further, Dr. Robinson said.
Future plans
Going forward, the investigators plan to evaluate noncardiovascular outcomes in this patient population. They would also like to examine health care utilization following a diagnosis of Kawasaki disease “to better understand what kind of follow-up is happening now in Ontario,” Dr. Robinson said.
Another unanswered question is whether the cardiovascular events observed in the study stem from atherosclerotic disease or a different mechanism among survivors of Kawasaki disease.
The research was supported by a McMaster University Resident Research Grant, a Hamilton Health Sciences New Investigator Award, and Ontario’s Institute for Clinical Evaluative Sciences. Dr. Robinson had no relevant financial disclosures.
SOURCE: Robinson C et al. Arthritis Rheumatol. 2020;72(suppl 10): Abstract 0937.
FROM ACR 2020
Key clinical point: Kawasaki disease survivors remain at elevated long-term risk for cardiovascular events.
Major finding: Overall cardiovascular event risk was 39% higher, even after 10 years.
Study details: A retrospective, population-based cohort study of 4,597 Kawasaki disease survivors and 459,700 matched children without Kawasaki disease.
Disclosures: The research was supported by a McMaster University Resident Research Grant, a Hamilton Health Sciences New Investigator Award, and Ontario’s Institute for Clinical Evaluative Sciences. Dr. Robinson had no relevant financial disclosures.
Source: Robinson C et al. Arthritis Rheumatol. 2020;72(suppl 10): Abstract 0937.
Methotrexate users need tuberculosis tests in high-TB areas
People taking even low-dose methotrexate need tuberculosis screening and ongoing clinical care if they live in areas where TB is common, results of a study presented at the virtual annual meeting of the American College of Rheumatology suggest.
Coauthor Carol Hitchon, MD, MSc, a rheumatologist with the University of Manitoba in Winnipeg, who presented the findings, warned that methotrexate (MTX) users who also take corticosteroids or other immunosuppressants are at particular risk and need TB screening.
Current management guidelines for rheumatic disease address TB in relation to biologics, but not in relation to methotrexate, Dr. Hitchon said.
“We know that methotrexate is the foundational DMARD [disease-modifying antirheumatic drug] for many rheumatic diseases, especially rheumatoid arthritis,” Dr. Hitchon noted at a press conference. “It’s safe and effective when dosed properly. However, methotrexate does have the potential for significant liver toxicity as well as infection, particularly for infectious organisms that are targeted by cell-mediated immunity, and TB is one of those agents.”
Using multiple databases, researchers conducted a systematic review of the literature published from 1990 to 2018 on TB rates among people who take less than 30 mg of methotrexate a week. Of the 4,700 studies they examined, 31 fit the criteria for this analysis.
They collected data on TB incidence (new TB diagnoses vs. reactivation of latent TB infection) as well as TB outcomes, such as pulmonary symptoms, dissemination, and mortality.
They found a modest increase in the risk of TB infections in the setting of low-dose methotrexate. In addition, rates of TB in people with rheumatic disease who are treated with either methotrexate or biologics are generally higher than in the general population.
They also found that methotrexate users had higher rates of the type of TB that spreads beyond a patient’s lungs, compared with the general population.
Safety of INH with methotrexate
Researchers also looked at the safety of isoniazid (INH), the antibiotic used to treat TB, and found that isoniazid-related liver toxicity and neutropenia were more common when people took the antibiotic along with methotrexate, but those effects were usually reversible.
TB is endemic in various regions around the world. Historically, there hasn’t been much rheumatology capacity in many of these areas, but as that capacity increases, more people who are at high risk for developing or reactivating TB will be receiving methotrexate for rheumatic diseases, Dr. Hitchon said.
“It’s prudent for people managing patients who may be at higher risk for TB either from where they live or from where they travel that we should have a high suspicion for TB and consider screening as part of our workup in the course of initiating treatment like methotrexate,” she said.
Narender Annapureddy, MD, a rheumatologist at Vanderbilt University, Nashville, Tenn., who was not involved in the research, pointed out that a limitation of the work is that only 27% of the studies are from developing countries, which are more likely to have endemic TB, and those studies had very few cases.
“This finding needs to be studied in larger populations in TB-endemic areas and in high-risk populations,” he said in an interview.
As for practice implications in the United States, Dr. Annapureddy noted that TB is rare in the United States and most of the cases occur in people born in other countries.
“This population may be at risk for TB and should probably be screened for TB before initiating methotrexate,” he said. “Since biologics are usually the next step, especially in RA after patients fail methotrexate, having information on TB status may also help guide management options after MTX failure.
“Since high-dose steroids are another important risk factor for TB activation,” Dr. Annapureddy continued, “rheumatologists should likely consider screening patients who are going to be on moderate to high doses of steroids with MTX.”
A version of this article originally appeared on Medscape.com.
Pregnancy can be safe with interstitial lung disease
Pregnant women with interstitial lung disease (ILD) related to autoimmune disease may not need to terminate their pregnancies if they have close monitoring before, during, and after pregnancy with a multidisciplinary team of physicians, new research suggests.
Senior author Megan Clowse, MD, MPH, associate professor of medicine in the division of rheumatology at Duke University, Durham, N.C., explained during a press conference at the virtual annual meeting of the American College of Rheumatology that women with ILD are often advised by obstetricians or rheumatologists to avoid conception or terminate their pregnancies, though evidence for that has been based on small studies of 9-15 patients that have had mixed results.
“Many of these pregnancies were delivered 20-30 years ago, definitely with different rheumatic and obstetric care than we can provide now,” she said. “It’s really time to rethink our approach to interstitial lung disease and pregnancy.”
This study showed that while adverse pregnancy outcomes are common in these women, overall maternal morbidity and mortality are low.
ILD can occur as a secondary disease in people who have conditions such as scleroderma, lupus, or sarcoidosis.
Largest study to date
This Pfizer-sponsored retrospective study of 67 pregnant women is the largest to date, and it analyzed 94 pregnancies (including five sets of twins).
Sarah Rae Easter, MD, a maternal-fetal medicine specialist in the department of obstetrics and gynecology at Brigham and Women’s Hospital, Boston, called the work “exciting” because the researchers were able to look back at a large number of cases of a rare condition spanning more than 20 years.
“Their data provides much-needed evidence to provide some reassurance for women affected by this type of pulmonary disease regarding the relative safety of pregnancy,” she said in an interview.
Study spanned 23 years
The researchers reviewed pregnancy records in patients diagnosed with ILD secondary to autoimmune disease at Duke University Health System from January 1996 to July 2019.
They classified the severity of ILD based on two standard breathing tests – forced vital capacity and diffusion capacity for carbon monoxide.
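For readers who want a concrete sense of how such a grading can work, here is a minimal, hypothetical sketch in Python. The percent-predicted cutoffs and the rule of grading on the worse of the two tests are illustrative assumptions only; the study does not report the exact criteria used at Duke.

```python
# Hypothetical sketch of ILD severity grading from pulmonary function tests.
# The thresholds below are illustrative assumptions, not the study's criteria.

def classify_ild_severity(fvc_pct_predicted: float, dlco_pct_predicted: float) -> str:
    """Return an illustrative severity grade from FVC and DLCO (% predicted).

    The worse (lower) of the two values drives the grade, which is one common
    convention; the study's actual rule may differ.
    """
    worst = min(fvc_pct_predicted, dlco_pct_predicted)
    if worst >= 80:
        return "normal"
    if worst >= 60:
        return "mild"
    if worst >= 40:
        return "moderate"
    return "severe"

# Example: FVC 72% predicted and DLCO 55% predicted would grade as "moderate".
print(classify_ild_severity(72, 55))
```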
Overall, 69% of the women were diagnosed with sarcoidosis and the remaining 31% had a connective tissue disease associated with ILD (CTD-ILD). Of those measured for ILD severity, 11% were severe, 25% were moderate, 50% were mild, and 14% were normal. The average maternal age was 32.1 years, and 83% of the women were Black.
While 70% of the pregnancies resulted in live births, 9% were terminated. The remainder resulted in miscarriage or stillbirth.
Researchers reported a 15% rate of preeclampsia, a 34% rate of the composite measure PROMISSE-Adverse Pregnancy Outcome (APO), and a 15% rate of PROMISSE-APO SEVERE. Patients with severe disease had the highest rates of PROMISSE-APO (P = .03 across groups).
(PROMISSE stands for the Predictors of Pregnancy Outcome: Biomarkers in Antiphospholipid Antibody Syndrome and Systemic Lupus Erythematosus study.)
None of the women died
Dr. Clowse said it was a pleasant surprise to find that none of the women died, though patients with severe ILD had more adverse outcomes. Only 2.1% were treated in an intensive care unit during or soon after delivery, and 4.2% had significant shortness of breath due to fluid volume overload around the time of delivery.
For the women who had normal-to-moderate lung disease, Dr. Clowse said, “they really had remarkably good outcomes, really pretty comparable to the general population. About 15% delivered preterm and about 20% suffered a pregnancy loss.”
Dr. Easter, who was not involved with the study, noted the large number of Black women in the cohort.
“Focusing in on improving outcomes for Black and Brown women related to pregnancy in our country is a much-needed undertaking,” Dr. Easter said.
Being able to quote percentages from this research, based on a good-sized study “at least gives people a benchmark about what kind of risk they are willing to assume for themselves,” she said.
For providers, being able to place this rare disease within the spectrum of other diseases where there is more data is also very helpful, she said.
Dr. Clowse said in an interview that the preponderance of Black women in the study was a surprise but may be explained by two factors: sarcoidosis is seen more frequently in Black women, and the study area in North Carolina has a large population of Black women.
“Also, our patients with more severe lupus, the ones who are more likely to have interstitial lung disease, are often Black and that’s likely contributing as well,” she said.
Multidisciplinary teams advised
Dr. Clowse emphasized that women with ILD need multidisciplinary teams in pregnancy and should be managed at tertiary care centers where there is a full complement of obstetric and internal medicine experts.
“We do recommend evaluating the severity of their lungs and their heart disease around the time of pregnancy and during pregnancy if they have shortness of breath,” she said.
“We currently recommend that these patients with moderate or severe disease stay in the hospital for up to a week, just for monitoring,” she said.
Dr. Easter said having that kind of access to a large academic healthcare center should be an important part of the decision-making.
Patients need to think about whether they would have access to care similar to what the researchers are describing when they are making the decision to pursue or continue pregnancy, she said.
The study was sponsored by Pfizer Inc. Dr. Clowse reported relationships with UCB, GlaxoSmithKline, AstraZeneca, and Pfizer. Dr. Easter has disclosed no relevant financial relationships.
A version of this article originally appeared on Medscape.com.
Warfarin use linked to knee and hip replacement in osteoarthritis patients
Patients who take the vitamin K antagonist warfarin to prevent thromboembolic events are significantly more likely to require knee or hip replacement surgery – a surrogate endpoint for end-stage osteoarthritis – than are patients who take direct oral anticoagulants (DOACs), results of a U.K.-based study showed.
In a nested case-control study, warfarin use was associated with an approximately 1.5-fold higher risk of knee and hip replacement, compared with use of DOACs.
The findings provide additional evidence for the role of vitamin K and vitamin K–dependent proteins for limiting osteoarthritis progression, said lead author Priyanka Ballal, MD, a rheumatology fellow at Boston University.
“Given the prevalence and impact of osteoarthritis, our data, along with the existing literature, support the need for a well-powered, randomized, controlled trial for evaluating vitamin K supplementation in osteoarthritis. Our study also raises the consideration of using DOACs over warfarin when indicated in people with or at risk of osteoarthritis,” she said in a plenary session at the virtual annual meeting of the American College of Rheumatology.
Warfarin targets vitamin K for its role in coagulation, but vitamin K is also an essential cofactor for vitamin K–dependent proteins in bone and cartilage, Dr. Ballal said.
Inadequate vitamin K levels are associated with abnormal joint tissue mineralization, and with increased incidence and prevalence of osteoarthritis. In a randomized, controlled trial, vitamin K supplementation was associated with trends toward less osteoarthritis progression among patients with vitamin K deficiency, she said.
To see whether warfarin therapy has biologic effects similar to that seen in patients with vitamin K deficiency, Dr. Ballal and colleagues conducted a nested, case-control study using data from The Health Improvement Network (THIN), an electronic medical record database of patients enrolled with general practitioners in the United Kingdom.
The sample included adults aged 40-80 years with atrial fibrillation who had received one or more prescriptions for warfarin or a DOAC beginning in 2009, a year after DOACs were first marketed in the United Kingdom, and within 1 year of the index date (date of joint replacement surgery). The researchers excluded patients who had knee or hip replacements before 2014, had severe comorbidities that would limit joint replacement, or had used either warfarin or a DOAC prior to study entry. Each case was matched by age, gender, and index date with up to four control patients (those who did not have surgery).
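As a rough illustration of that matching step, the following Python sketch pairs each case with up to four controls of the same sex and similar age who were still under observation at the case’s index date. The column names, age tolerance, and sampling approach are assumptions for illustration; the study does not describe its matching algorithm in this detail.

```python
# Illustrative sketch of case-control matching; column names and the
# age tolerance are hypothetical, not taken from the study.
import pandas as pd

def match_controls(cases: pd.DataFrame, pool: pd.DataFrame,
                   n_controls: int = 4, age_tol: int = 2) -> pd.DataFrame:
    """For each case, sample up to n_controls patients of the same sex,
    similar age, and still under observation at the case's index date."""
    matched = []
    used = set()
    for _, case in cases.iterrows():
        eligible = pool[
            (pool["sex"] == case["sex"])
            & (pool["age"].sub(case["age"]).abs() <= age_tol)
            & (pool["last_followup"] >= case["index_date"])
            & (~pool["patient_id"].isin(used))   # each control used once
        ]
        picks = eligible.sample(n=min(n_controls, len(eligible)), random_state=0)
        used.update(picks["patient_id"])
        for _, ctrl in picks.iterrows():
            matched.append({"case_id": case["patient_id"],
                            "control_id": ctrl["patient_id"],
                            "index_date": case["index_date"]})
    return pd.DataFrame(matched)
```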
A total of 913 cases and 3,652 controls were included. The groups had similar characteristics (sex, age, cancer, renal disease, chronic lung disease, hypertension, and incidence of venous thromboembolism [VTE]), except for somewhat higher rates of diabetes and heart failure among controls, and a higher rate of obesity among cases.
The investigators first examined warfarin use among all knee and/or hip replacement cases and controls. After adjustment for body mass index, factors influencing the choice of anticoagulant, comorbidities, other medications, general practitioner visits, and hospitalizations, warfarin use was associated with an odds ratio of 1.57 (95% confidence interval [CI], 1.30-1.89) for knee and hip replacement.
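To make the headline estimate easier to interpret, the sketch below shows how an odds ratio and a Wald 95% confidence interval are computed from a simple 2×2 exposure table. The exposure counts are invented (only the case and control totals match the study), and the published figure came from a fully adjusted model rather than a raw table like this.

```python
# Minimal sketch of an odds ratio with a Wald 95% CI from a 2x2 table.
# Counts are hypothetical; only the totals (913 cases, 3,652 controls)
# mirror the study.
import math

# exposed = warfarin users, unexposed = DOAC users
cases_exposed, cases_unexposed = 400, 513          # joint replacement cases
controls_exposed, controls_unexposed = 1300, 2352  # matched controls

odds_ratio = (cases_exposed * controls_unexposed) / (cases_unexposed * controls_exposed)

# Wald confidence interval on the log odds ratio scale
se_log_or = math.sqrt(1 / cases_exposed + 1 / cases_unexposed
                      + 1 / controls_exposed + 1 / controls_unexposed)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f} (95% CI, {ci_low:.2f}-{ci_high:.2f})")
```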
The association between warfarin and joint replacement held up in an analysis restricted to knee replacement only, with an adjusted OR of 1.48 (95% CI, 1.16-1.89).
There was also a clear association between duration of warfarin use and risk of knee and hip replacement.
“This abstract suggests the role of adequate vitamin K may be important in decreasing progression of osteoarthritis, which would then favor patients with OA who are on warfarin to consider changing to a DOAC; however, further studies are needed to confirm this finding and consider its impact on VTE and wound healing postop,” said Minna Kohler, MD, director of the rheumatology musculoskeletal ultrasound program at Massachusetts General Hospital in Boston. Dr. Kohler, who was not involved in the study, replied to an email request for comment.
The study was supported by grants from the National Institutes of Health. Dr. Ballal and Dr. Kohler reported having no conflicts of interest to disclose.
SOURCE: Ballal P et al. Arthritis Rheumatol. 2020;72(suppl 10): Abstract 0934.
FROM ACR 2020