Real-world study shows subcutaneous vedolizumab effective for maintenance in IBD
Switching from intravenous to subcutaneous vedolizumab for maintenance treatment of inflammatory bowel diseases appears to be effective, according to a study providing real-world data.
Subcutaneous treatment could reduce direct health care costs because no infusion equipment is necessary, as well as societal costs because patients don’t need to take time off work or travel to infusion locations, wrote the researchers, led by Adriaan Volkers, MD, a doctoral candidate in gastroenterology and hepatology at the Amsterdam Gastroenterology Endocrinology Metabolism Research Institute at the University of Amsterdam in The Netherlands.
“The option of a SC formulation of VDZ [vedolizumab] offers patients a choice regarding the route of administration,” they wrote. The study was published in Alimentary Pharmacology and Therapeutics.
Dr. Volkers and colleagues assessed the effectiveness, safety, drug discontinuation, and pharmacokinetics of a switch from intravenous to subcutaneous maintenance vedolizumab in a prospective real-world cohort of patients from two separate studies in The Netherlands between July 2020 and November 2021.
The cohort comprised 135 adults who had received more than 4 months of IV vedolizumab: 82 patients with Crohn’s disease and 53 with ulcerative colitis. Prospective follow-up took place during scheduled outpatient clinic visits at weeks 12 and 24 after switching administration. Patients received 108 mg of subcutaneous vedolizumab once every 2 weeks.
Overall, 16 patients (11.9%) discontinued subcutaneous administration, including 11 patients (13.4%) with Crohn’s disease who stopped after a median of 18 weeks, as well as 5 patients (9.4%) with ulcerative colitis who stopped after a median of 6 weeks. Four patients, who all had Crohn’s disease, discontinued vedolizumab and switched to a different treatment because of loss of response. Nine patients switched back to IV administration because of adverse events, and three switched back because of fear of needles.
In total, there were 59 adverse events and 13 infections that were possibly or probably related to subcutaneous injection among 42 patients. The most common adverse events that were probably related were injection site reactions such as pain or swelling, reported among 15 patients, and headaches, reported among 6 patients.
At the initiation of therapy, 57 of 81 Crohn’s disease patients (70.4%) were in corticosteroid-free clinical remission and 53 of 80 (66.3%) were in biochemical remission, which was defined as C-reactive protein levels of 5 mg/L or less and fecal calprotectin levels of 250 mcg/g or less. For ulcerative colitis patients, 35 of 49 (71.4%) were in corticosteroid-free clinical remission and 41 of 51 (80.4%) were in biochemical remission. Median clinical and biochemical disease levels remained stable after the switch to subcutaneous treatment and weren’t significantly different, compared with baseline measurements.
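The study’s composite biochemical cutoff can be expressed as a simple predicate (an illustrative sketch; the function name is mine, but the thresholds are those reported above):

```python
def in_biochemical_remission(crp_mg_per_l: float, calprotectin_mcg_per_g: float) -> bool:
    """Biochemical remission as defined in the study:
    C-reactive protein <= 5 mg/L AND fecal calprotectin <= 250 mcg/g."""
    return crp_mg_per_l <= 5.0 and calprotectin_mcg_per_g <= 250.0
```

Note that both markers must be at or below their cutoffs; an elevated value for either one rules out biochemical remission.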
Median vedolizumab serum concentrations increased from 19 mcg/mL at the time of the switch to 31 mcg/mL at 12 weeks after the switch and 37 mcg/mL at 24 weeks. Serum concentrations of less than 25 mcg/mL were associated with lower rates of corticosteroid-free clinical remission, and serum concentrations of greater than 40 mcg/mL were associated with higher biochemical remission rates.
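Those concentration findings amount to a threshold map (an illustrative sketch; the band labels and function name are mine, the cutoffs are those reported):

```python
def serum_band(vdz_mcg_per_ml: float) -> str:
    """Map a vedolizumab serum concentration to the bands reported in the study:
    <25 mcg/mL was associated with lower corticosteroid-free clinical remission rates;
    >40 mcg/mL was associated with higher biochemical remission rates."""
    if vdz_mcg_per_ml < 25.0:
        return "low (<25 mcg/mL)"
    if vdz_mcg_per_ml > 40.0:
        return "high (>40 mcg/mL)"
    return "intermediate (25-40 mcg/mL)"
```

By this mapping, the reported medians move from the low band at the time of the switch (19 mcg/mL) into the intermediate band at weeks 12 and 24.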
Importantly, there was no association between vedolizumab serum concentrations and the risk of adverse events that were deemed probably related to subcutaneous injection or infections.
“The most important point to understand here is that SC VDZ can be used to maintain clinical remission after IV VDZ induction in a real-world setting,” said Brian DeBosch, MD, PhD, associate professor of cell biology and physiology at Washington University, St. Louis.
Dr. DeBosch, who wasn’t involved with this study, noted that previous data have indicated that switching from intravenous to subcutaneous treatment after a 6-week induction is superior to placebo in maintaining clinical and biochemical remission. However, studies haven’t quantified the optimal timing and therapeutic efficacy of switching.
“This is critical to quantify because SC VDZ has slower and lower peak bioavailability when compared with IV administration,” he said. “These data indicate that IV induction overcomes the known pharmacokinetic limitations of SC VDZ during the induction phase.”
However, there are still some limitations and areas for future research around switching administration, Dr. DeBosch noted.
“A key comparison lacking in the study is the mean and trough serum VDZ, and proportion of patients with relapsing disease in patients on continued IV VDZ,” he said. “Yet, these data nevertheless indicate that tandem IV-SC drug administration can maximize the induction and maintenance of remission in IBD, while also mitigating some of the barriers associated with long-term, continued IV VDZ administration.”
The study authors reported advisory fees and speaker fees from several pharmaceutical companies, and some authors have received funding or served on advisory boards for Takeda Pharmaceuticals, which manufactures vedolizumab. Dr. DeBosch reported no relevant disclosures.
FROM ALIMENTARY PHARMACOLOGY & THERAPEUTICS
‘Stop pretending’ there’s a magic formula to weight loss
Is there a diet or weight-loss program out there that doesn’t work for those who stick with it during its first 12 weeks?
Truly, the world’s most backwards, upside-down, anti-science, nonsensical diets work over the short haul, fueled by the fact that short-term suffering for weight loss is a skill set that humanity has assiduously cultivated for at least the past 100 years. We’re really good at it!
It’s the keeping the weight off, though, that’s the hitch. Which leads me to the question, why are medical journals, even preeminent nonpredatory ones, publishing 12-week weight-loss program studies as if they have value? And does anyone truly imagine that after over 100 years of trying, there’ll be a short-term diet or program that’ll have the durable, reproducible results that no other short-term diet or program ever has?
Take this study published by Obesity: “Pragmatic implementation of a fully automated online obesity treatment in primary care.” It details a 12-week online, automated, weight-loss program that led completers to lose the roughly 5% of weight that many diets and programs see lost over their first 12 weeks. By its description, aside from its automated provision, the program sounds like pretty much the same boilerplate weight management advice and recommendations that haven’t been shown to lead large numbers of people to sustain long-term weight loss.
Participants were provided with weekly lessons which no doubt in some manner told them that high-calorie foods had high numbers of calories and should be minimized, along with other weight-loss secrets. Users were to upload weekly self-monitored weight, energy intake, and exercise minutes and were told to use a food diary. The goal was to lose 10% of body weight by consuming 1,200-1,500 calories per day for those weighing less than 250 pounds (113 kg) and 1,500-1,800 calories for those weighing more than 250 pounds, while also aiming for 200 minutes per week of moderate- to vigorous-intensity physical activity.
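The program’s prescription, as described, reduces to a weight-based lookup (a sketch; the names are mine, and since the article doesn’t say how exactly 250 pounds is handled, it is assumed here to fall in the higher range):

```python
WEEKLY_ACTIVITY_GOAL_MIN = 200  # moderate- to vigorous-intensity minutes per week


def daily_calorie_target(weight_lb: float) -> tuple[int, int]:
    """Daily calorie range prescribed by the program as described:
    under 250 lb -> 1,200-1,500 kcal/day; otherwise 1,500-1,800 kcal/day."""
    if weight_lb < 250:
        return (1200, 1500)
    return (1500, 1800)
```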
What was found was wholly unsurprising. Perhaps speaking to the tremendous and wide-ranging degrees of privilege that are required to prioritize intentional behavior change in the name of health, 79% of those who were given a prescription for the program either didn’t start it or stopped it before the end of the first week.
Of those who actually started the program and completed more than 1 week, despite having been selected as appropriate and interested participants by their physicians, only 20% watched all of the automated program’s video lessons, while only 32% actually bothered to submit all 12 weeks of weight data. Of course, the authors found that those who watched the greatest number of videos and submitted the most self-reported weights lost more weight and ascribed that loss to the program. What the authors did not entertain was the possibility that those who weren’t losing weight, or who were gaining, might simply have been less inclined to continue with a program that wasn’t leading them to their desired outcomes, or to submit weights documenting their lack of loss or their gains.
Short-term weight-loss studies help no one and when, as in this case, the outcomes aren’t even mediocre, and the completion and engagement rates are terrible, the study is still presented as significant and important. This bolsters the harmful stereotype that weight management is achievable by way of simple messages and generic goals. It suggests that it’s individuals who fail programs by not trying hard enough and that those who do, or who want it the most, will succeed. It may also lead patients and clinicians to second-guess the use of antiobesity medications, the current generation of which lead to far greater weight loss and reproducibility than any behavioral program or diet ever has.
The good news here, at least, is that the small percentage of participants who made it through this program’s 12 weeks are being randomly assigned to differing 9-month maintenance programs, which will then yield a 1-year analysis of the completers.
Why this study was published now, rather than pushed until the 1-year data were available, speaks to the pervasiveness of the toxic weight-biased notion that simple education will overcome the physiology forged over millions of years of extreme dietary insecurity.
Our food environment is a veritable floodplain of hyperpalatable foods, and social determinants of health make intentional behavior change in the name of health an unattainable luxury for a huge swath of the population.
Dr. Freedhoff is an associate professor of family medicine at the University of Ottawa and medical director of the Bariatric Medical Institute. He reported serving as a director, officer, partner, employee, adviser, consultant, or trustee for Bariatric Medical Institute and Constant Health and receiving research grants from Novo Nordisk. A version of this article first appeared on Medscape.com.
Lung adverse effects in patients taking trastuzumab deruxtecan
Trastuzumab deruxtecan (T-DXd) is associated with lung adverse effects, although the benefit-to-risk relationship with use of the drug is still positive, say researchers who report a review of early clinical trials with the drug.
T-DXd is an antibody-drug conjugate that targets HER2. It is approved for use in HER2-positive breast, gastric, and lung cancers.
In the new study, investigators analyzed data from early clinical trials that involved patients with advanced cancers who had been heavily pretreated. They found an incidence of just over 15% for interstitial lung disease (ILD)/pneumonitis associated with the drug. Most patients (77.4%) had grade 1 or 2 ILD, but 2.2% of patients had grade 5 ILD.
“Interstitial lung disease is a known risk factor in patients treated with antibody conjugates for cancer,” commented lead author Charles Powell, MD, Icahn School of Medicine at Mount Sinai, New York. This adverse effect can lead to lung fibrosis and can become severe, life threatening, and even fatal, the authors warned.
The authors also discussed management of the event, which involves corticosteroids, and recommended that any patient who develops ILD of grade 3 or higher be hospitalized.
Close monitoring and proactive management may reduce the risk of ILD, they suggested.
Indeed, the incidence of this adverse effect was lower in a later phase 3 trial of the drug (10.5% in the DESTINY-Breast03 trial), and the adverse events were less severe in this patient population (none were of grade 4 or 5).
“Increased knowledge ... and implementation of ILD/pneumonitis monitoring, diagnosis, and management guidelines” may have resulted in this adverse effect being identified early and treated before it progressed, they commented.
ILD is highlighted in a boxed warning on the product label.
The study was published online in ESMO Open.
In their review, the investigators evaluated nine early-stage monotherapy clinical trials (phases 1 and 2) involving a total of 1,150 patients (breast cancer, 44.3%; gastric cancer, 25.6%; lung cancer, 17.7%; colorectal cancer, 9.3%; other cancers, 3.0%).
These patients had advanced cancer and had been heavily pretreated with a median of four prior lines of therapy. They received one or more doses of at least 5.4 mg/kg of T-DXd.
Nearly half of the cohort were treated for more than 6 months. A total of 276 potential ILD/pneumonitis events were sent for adjudication; of those, 85% were adjudicated as ILD/pneumonitis.
The overall incidence of adjudicated ILD/pneumonitis events was 15.4%; most were low-grade events. Some 87% of patients experienced their first ILD event within 12 months of treatment. The median time to experiencing an ILD/pneumonitis event was 5.4 months.
Some of the patients who developed grade 1 ILD/pneumonitis were treated and the adverse event resolved. These patients were then rechallenged with the drug. Only 3 of the 47 rechallenged patients experienced recurrence of ILD/pneumonitis, the authors noted.
“Rechallenge with T-DXd after complete resolution of grade 1 events is possible and warrants further investigation,” they commented. They cautioned, however, that rechallenge is not recommended for all patients, at least not for those with grade 2 or higher ILD/pneumonitis.
Overall, the authors concluded that the “benefit-risk of T-DXd treatment is positive,” but they warned that some patients may be at increased risk of developing ILD/pneumonitis.
Baseline factors that increase the risk of developing an ILD/pneumonitis event include the following: being younger than 65 years, receiving a T-DXd dose of more than 6.4 mg/kg, having a baseline oxygen saturation level of less than 95%, having moderate to severe renal impairment, and having lung comorbidities. In addition, patients who had initially been diagnosed with cancer more than 4 years before receiving the drug were at higher risk of developing ILD/pneumonitis.
“Using learnings from the early clinical trials experience, physician education and patient management protocols were revised and disseminated by the study sponsors [and] more recent trial data in earlier lines of therapy has demonstrated lower rates of ILD events, suggesting close monitoring and proactive management of ILD/pneumonitis is warranted for all patients,” Dr. Powell said in a statement.
The T-DXd clinical trials were sponsored by AstraZeneca and Daiichi Sankyo. Dr. Powell has received fees from Daiichi Sankyo, AstraZeneca, and Voluntis.
A version of this article first appeared on Medscape.com.
FROM ESMO OPEN
The ‘great dynamism’ of radiation oncology
The field of radiation oncology has rapidly evolved in recent years, thanks in large part to findings from randomized clinical trials (RCTs) that have helped shift therapeutic standards, a review of the literature shows.
Highlights from this research reveal how high-tech radiotherapy, such as hypofractionation and stereotactic body radiotherapy, has improved care for many patients, how personalized radiotherapy using image-based guidance has helped tailor treatments, and how endpoints that focus on quality of life and patient satisfaction are emerging.
For instance, Charles B. Simone II, MD, FACRO, who was not involved in the current work, pointed to “a proliferation of trials assessing hypofractionation in the curative setting and stereotactic body radiation therapy in the curative and poly- and oligometastatic settings that have allowed for increased patient convenience and dose intensification, respectively.”
Dr. Simone, chief medical officer, New York Proton Center, Memorial Sloan Kettering Cancer Center, also noted that the first personalized radiotherapy trials using imaging and biological markers have “the profound potential to individualize treatment and improve patient outcomes.”
The review was published in the European Journal of Cancer.
An evolving field
Given the fast-changing landscape for cancer therapeutics and a deluge of research studies, the authors wanted to understand the most notable advances established in recent trials as well as caveats to some approaches and emerging areas to watch.
In the review, Sophie Espenel, MD, from the department of radiation oncology, Gustave Roussy Cancer Campus, Villejuif, France, and colleagues identified 1,347 radiotherapy RCTs that were conducted from January 2018 to December 2021. Of these, the authors selected 110 large phase 2 or 3 RCTs that contained data showing practice-changing or emerging concepts.
Overall, the studies showed “great dynamism” in radiation oncology research and covered a wide range of radiotherapy practices, according to Dr. Espenel and coauthors.
A central area of research has focused on radioimmunotherapy, an approach that aims to enhance the antitumor immune response. One RCT in the preoperative setting showed, for instance, that concurrent stereotactic body radiotherapy delivered at 24 Gy over eight fractions, along with the anti–PD-L1 agent durvalumab, increased major pathologic complete response rates almost eightfold in comparison with durvalumab alone for patients with early-stage lung cancer (53.3% vs. 6.7%).
Although promising, not all trials that evaluated a concurrent chemoradiotherapy-immunotherapy strategy showed positive results. One RCT of locally advanced head and neck squamous cell carcinoma, for instance, found that median progression-free survival was not reached when adding the anti–PD-L1 avelumab to chemoradiotherapy. In addition, trials in the metastatic setting have shown conflicting results, the authors note.
Another topic of interest is that of newer radiosensitizers. A trial that evaluated high-risk locoregionally advanced head and neck squamous cell carcinoma highlighted the efficacy of xevinapant, a pro-apoptotic agent that inhibits apoptosis proteins. Xevinapant was used for the first time in conjunction with a standard high-dose cisplatin chemoradiotherapy. In this study, locoregional control at 18 months was achieved for 54% of patients who received xevinapant vs. 33% of those who received standard care. The toxicity profiles were similar.
The use of high-tech radiotherapy is gaining ground. It allows patients to receive more targeted treatments at lower doses and in shorter time frames. One trial found, for instance, that a more hypofractionated adjuvant whole breast approach, using 26 Gy in five fractions over a week, is as effective and safe as 40 Gy in 15 fractions over 3 weeks. The researchers found that there was no difference in the incidence of locoregional relapses, disease-free survival, and overall survival between the regimens.
Dr. Simone also noted that advanced treatment modalities, such as intensity-modulated radiotherapy, stereotactic radiosurgery, and proton therapy, have the potential to improve patient-reported adverse events and clinical outcomes. “I have seen this both in my clinical practice and in several recent publications,” he says.
Personalization of radiotherapy is also an emerging area that may allow for more tailored treatments with improved outcomes. The authors highlighted a study that found that PSMA PET-CT was better than conventional CT for accurately staging prostate cancer. This approach was also less expensive and led to less radiation exposure.
On the basis of this research, “PSMA PET-CT has since become the [standard of care] for prostate cancer staging,” the authors explain.
Dr. Espenel and colleagues note that as patients survive longer, quality of life and patient satisfaction are increasingly becoming endpoints in RCTs. Experts are focusing more attention on sequelae of treatments and advances in technology that can spare critical organs from radiation and reduce overall treatment time.
Shared decision-making is becoming increasingly possible in many cases as well. For example, with some clinical trials that involved different treatment modalities, outcomes were equivalent, but toxicity profiles differed, allowing patients to choose therapeutic options tailored to their preferences.
Overall, these data demonstrate “a great dynamism of radiation oncology research in most primary tumor types,” the researchers write.
The study received no outside financial support. The authors have disclosed no relevant financial relationships. Dr. Simone is chair of the American Society for Radiation Oncology Lung Resource Panel and the American Society for Radiation Oncology Veteran Affairs Radiation Oncology Quality Surveillance Blue Ribbon Lung Panel and has received honorarium from Varian Medical Systems.
A version of this article first appeared on Medscape.com.
FROM THE EUROPEAN JOURNAL OF CANCER
At 100, Guinness’s oldest practicing doctor shows no signs of slowing down
In the same year that Howard Tucker, MD, began practicing neurology, the average loaf of bread cost 13 cents, the microwave oven became commercially available, and Jackie Robinson took the field for the Brooklyn Dodgers as the first Black person to play Major League Baseball.
Since 1947, Dr. Tucker has witnessed major changes in health care, from President Harry S. Truman proposing a national health care plan to Congress to the current day, when patients carry their digital records around with them.
Dr. Tucker has been a resident of Cleveland Heights, Ohio, since 1922, the year he was born.
After graduating high school in 1940, Dr. Tucker attended Ohio State University, Columbus, where he received his undergraduate and medical degrees. During the Korean War, he served as chief neurologist for the Atlantic fleet at a U.S. Naval Hospital in Philadelphia. Following the war, he completed his residency at the Cleveland Clinic and trained at the Neurological Institute of New York.
Dr. Tucker chose to return to Cleveland, where he practiced at the University Hospitals Cleveland Medical Center and Hillcrest Hospital for several decades.
Not content with just a medical degree, at the age of 67, Dr. Tucker attended Cleveland State University Cleveland Marshall College of Law. In 1989, he received his Juris Doctor degree and passed the Ohio bar examination.
And as if that weren’t enough career accomplishments, Guinness World Records dubbed him the world’s oldest practicing doctor at 98 years and 231 days. Dr. Tucker continues to practice into his 100th year. He celebrated his birthday in July.
Owing to his compelling and inspiring life story, Dr. Tucker has become the subject of a feature documentary film entitled “What’s Next?” The film, currently in production, is being produced by his grandson, Austin Tucker, and directed by Taylor Taglianetti.
This news organization recently spoke with Dr. Tucker about his life’s work in medicine.
Question: Why did you choose neurology?
Dr. Tucker: Well, I think I was just fascinated with medicine from about the seventh or eighth grade. I chose my specialty because it was a very cerebral one in those days. It was an intellectual pursuit. It was before the CAT scan, and you had to work hard to make a diagnosis. You even had to look at the spinal fluid. You had to look at EEGs, and it was a very detailed history taking.
Question: How has neurology changed since you started practicing?
Dr. Tucker: The MRI came in, so we don’t have to use spinal taps anymore. Lumbar puncture fluid and EEG aren’t needed as often either. Now we use EEG for convulsive disorders, but rarely when we suspect tumors like we used to. Also, when I was in med school, they said to use Dilaudid; don’t use morphine. And now, you can’t even find Dilaudid in emergency rooms anymore.
Question: How has medicine overall changed since you started practicing?
Dr. Tucker: Computers have made everything a different specialty.
In the old days, we would see a patient, call the referring doctor, and discuss [the case] with them in a very pleasant way. Now, when you call a doctor, he’ll say to you, “Let me read your note,” and that’s the end of it. He doesn’t want to talk to you. Medicine has changed dramatically.
It used to be a very warm relationship between you and your patients. You looked at your patient, you studied their expressions, and now you look at the screen and very rarely look at the patient.
Question: Why do you still enjoy practicing medicine?
Dr. Tucker: The challenge, the excitement of patients, and now I’m doing a lot of teaching, and I do love that part, too.
I teach neurology to residents and medical students that rotate through. When I retired from the Cleveland Clinic, 2 months of retirement was too much for me, so I went back to St. Vincent. It’s a smaller hospital but still has good residents and good teaching.
Question: What lessons do you teach to your residents?
Dr. Tucker: I ask my residents and physicians to think through a problem before they look at the CAT scan and imaging studies. Think through it, then you’ll know what questions you want to ask specifically before you even examine the patient, know exactly what you are going to find.
The complete neurological examination, aside from taking the history and checking mental status, is 5 minutes. You have them walk, check for excessive finger tapping, have them touch their nose, check their reflexes, check their strength – it’s over. That doesn’t take much time if you know what you’re looking for.
Residents say to me all the time, “55-year-old man, CAT scan shows ...” I have to say to them: “Slow down. Let’s talk about this first.”
Question: What advice do you have for physicians and medical students?
Dr. Tucker: Take a very careful history. Know the course of the illness. Make sure you have a diagnosis in your head and, specifically for medical residents, ask questions. You have to be smarter than the patients are, you have to know what to ask.
If someone hits their head on their steering wheel, they don’t know that they’ve lost their sense of smell. You have to ask that specifically, hence, why you have to be smarter than they are. Take a careful history before you do imaging studies.
A version of this article first appeared on Medscape.com.
Where women’s voices still get heard less
“Our study provides the first analysis of gender and early-career faculty disparities in speakers at hematology and medical oncology board review meetings,” the authors reported in research published in Blood Advances.
“We covered six major board reviews over the last 5 years that are either conducted yearly or every other year, [and] the general trend across all meetings showed skewness toward men speakers,” the authors reported.
Recent data from 2021 suggest a closing of the gender gap in oncology, with women making up 44.6% of oncologists in training. However, women still represented only 35.2% of practicing oncologists and remain underrepresented in leadership positions in academic oncology, the authors reported.
With speaking roles at academic meetings potentially marking a key step in career advancement and improved opportunities, the authors sought to investigate the balance of gender, as well as early-career faculty among speakers at prominent hematology and/or oncology board review lecture series taking place in the United States between 2017 and 2021.
The five institutions and one society presenting the board review lecture series included Baylor College of Medicine/MD Anderson Cancer Center, both in Houston; Dana-Farber Brigham Cancer Center, Boston; George Washington University, Washington; Memorial Sloan Kettering Cancer Center, New York; Seattle Cancer Care Alliance; and the hematology board review series from the American Society of Hematology.
During the period in question, women constituted only 37.7% of the speakers across the 1,224 board review lectures presented. Among lectures given by American Board of Internal Medicine–certified speakers (n = 1,016; 83%), women made up fewer than 50% of speakers in five of the six courses.
Men were also more likely to be recurrent speakers; across all courses, 13 men but only 2 women conducted 10 or more lectures. And while 35 men gave six or more lectures across all courses, only 12 women did so.
The lecture topics with the lowest rates of women presenters included malignant hematology (24.8%), solid tumors (38.9%), and benign hematology lectures (44.1%).
“We suspected [the imbalance in malignant hematology] since multiple recurrent roles were concentrated in the malignant hematology,” senior author Samer Al Hadidi, MD, of the Myeloma Center, Winthrop P. Rockefeller Cancer Institute, University of Arkansas for Medical Sciences, Little Rock, Ark., said in an interview.
He noted that “there are no regulations that such courses need to follow to ensure certain proportions of women and junior faculty are involved.”
Early-career faculty
In terms of early-career representation, more than 50% of lectures were given by faculty who had received their initial certifications more than 15 years earlier. The median time from initial certification was 12.5 years for hematology and 14 years for medical oncology.
The findings that more than half of the board review lectures were presented by faculty with more than 15 years’ experience since initial certification “reflects a lack of appropriate involvement of early-career faculty, who arguably may have more recent experience with board certification,” the authors wrote.
While being underrepresented in such roles is detrimental, there are no regulations that such courses follow to ensure certain proportions of women and junior faculty are involved, Dr. Al Hadidi noted.
Equal representation remains elusive
The study does suggest some notable gains. In a previous study of 181 academic conferences in the United States and Canada between 2007 and 2017, the rate of women speakers was only 15%, compared with 37.7% in the new study.
An overall trend analysis in the study also shows an approximately 10% increase in the representation of women across all of the board reviews. However, only the ASH hematology board review achieved more than 50% women speakers in its two courses.
“Overall, the proportion of women speakers is improving over the years, though it remains suboptimal,” Dr. Al Hadidi said.
The authors noted that oncology is clearly not the only specialty with gender disparities, citing documented shortages of women speakers at otolaryngology–head and neck, radiation oncology, emergency medicine, and research conferences.
They pointed to the work of ASH’s Women in Hematology Working Group as an important example of the needed effort to improve the balance of women hematologists.
Ariela Marshall, MD, director of women’s thrombosis and hemostasis at Penn Medicine in Philadelphia and a leader of ASH’s Women in Hematology Working Group, agreed that more efforts are needed to address both gender disparities and those affecting early-career speakers, asserting that the two appear to be connected.
“If you broke down gender representation over time and the faculty/time since initial certification, the findings may mirror the percent of women in hematology-oncology at that given point in time,” Dr. Marshall said in an interview.
“If an institution is truly committed to taking action on gender equity, it needs to look at gender and experience equity of speakers,” she said. “Perhaps it’s the time to say ‘Dr. X has been doing this review course for 15 years. Let’s give someone else a chance.’
“This is not even just from a gender equity perspective but from a career development perspective overall,” she added. “Junior faculty need these speaking engagements a lot more than senior faculty.”
Meanwhile, the higher number of female trainees is a trend that ideally will be sustained as those trainees move into positions of leadership, Dr. Marshall noted.
“We do see that over time, we have achieved gender equity in the percent of women matriculating to medical school. And my hope is that, 20 years down the line, we will see the effects of this reflected in increased equity in leadership positions such as division/department chair, dean, and hospital CEO,” she said. “However, we have a lot of work to do because there are still huge inequities in the culture of medicine (institutional and more broadly), including gender-based discrimination, maternal discrimination, and high attrition rates for women physicians, compared to male physicians.
“It’s not enough to simply say, ‘well, we have fixed the problem because our incoming medical student classes are now equitable in gender distribution,’ ” she said.
The authors and Dr. Marshall had no disclosures to report.
FROM BLOOD ADVANCES
Understanding the relationship between life satisfaction and cognitive decline
Every day, we depend on our working memory, spatial cognition, and processing speed to optimize productivity, interpersonal interactions, and psychological wellbeing. These cognitive functioning indices relate closely to academic and work performance, emotion management, physical fitness, and a sense of fulfillment in personal and work relationships, and they are linked intimately to complex cognitive skills (van Dijk et al., 2020). It is thus imperative to identify modifiable predictors of cognitive functioning to protect against aging-related cognitive decline and maximize quality of life.
Similarly, it is plausible that a reduction in cognitive functioning may lead to a long-term decrease in life satisfaction. Working memory, processing speed, spatial cognition, and related capacities are essential to meaningful activities and feelings of gratification in personal and professional relationships and other spheres of health throughout life (Baumeister et al., 2007). These cognitive functioning markers safeguard against reduced life satisfaction by facilitating effective problem-solving and decision-making (Swanson and Fung, 2016). For example, stronger working memory, processing speed, and related domains have coincided with better tolerance for stress and with trading off immediate rewards for long-term values and life goals (Hofmann et al., 2012). Therefore, reductions in cognitive functioning could precede a future decline in life satisfaction.
Nonetheless, the literature on this topic has several limitations. Most of the studies have been cross-sectional (i.e., conducted at a single time point) and thus do not permit causal inferences (e.g., Toh et al., 2020). Also, most studies used statistical methods that did not separate between-person (trait-like individual differences) from within-person (state-like) relations. Distinguishing within- and between-person relations is necessary because they may vary in magnitude and direction, and the preceding theories emphasize change-to-future-change relations within persons rather than between persons (Wright and Woods, 2020).
Clinical implications
Our recent work (Zainal and Newman, 2022b) added to the literature by using latent change score modeling, an advanced statistical method, to determine the relations between change in life satisfaction and future change in cognitive functioning domains within persons. This technique minimizes biases due to the passage of time and assessment unreliability, and it adjusts for between-person effects (Klopack and Wickrama, 2020). Improving understanding of the within-person factors leading to the deterioration of cognitive functioning and life satisfaction is crucial given the rising rates of psychiatric and neurocognitive illnesses (Cui et al., 2020). Identifying these changeable risk factors can optimize prevention, early detection, and treatment approaches.
Specifically, we analyzed the publicly available Swedish Adoption/Twin Study of Aging (SATSA) dataset (Petkus et al., 2017). The dataset comprised 520 middle-aged to older twin adults without dementia. Participants provided data across 23 years at five time points, with lags between time points ranging from 3 to 11 years. The analyses demonstrated that greater decreases in life satisfaction predicted larger future declines in processing speed, verbal working memory, and spatial cognition. Moreover, declines in verbal working memory and processing speed predicted a reduction in life satisfaction. However, change in spatial cognition did not predict change in life satisfaction.
Our findings align with multiple theoretical perspectives. Scar theories propose that decreased life satisfaction and related mental health problems can compromise working memory, processing speed, and spatial cognition in the long term. This scarring process occurs through the buildup of allostatic load, such as increased biomarkers of chronic stress (e.g., cortisol) and inflammation (e.g., interleukin-6, C-reactive protein) (Fancourt and Steptoe, 2020; Zainal and Newman, 2021a). The findings also suggest the importance of executive functioning domains for attaining desired milestones and aspirations that enhance a sense of fulfillment (Baddeley, 2013; Toh and Yang, 2020). Reductions in these cognitive functioning capacities could, over time, adversely affect the ability to engage in daily living activities and manage negative moods.
Limitations of our study include the lack of a multiple-assessment approach to measuring diverse cognitive functioning domains. Also, the absence of cognitive self-reports is a shortcoming since perceived cognitive difficulties might not align with performance on cognitive tests. Relatedly, future studies should administer cognitive tests that parallel and transfer to everyday tasks. However, our study’s strengths include the robust findings across different intervals between study waves, advanced statistics, and the large sample size.
If future studies replicate a similar pattern of results, the clinical applications of this study merit attention. Mindfulness-based interventions can promote working memory, sustained awareness, and spatial cognition or protect against cognitive decline (Jha et al., 2019; Zainal and Newman, 2021b). Further, clinical science can profit from exploring cognitive-behavioral therapies to improve adults’ cognitive function or life satisfaction (Sok et al., 2021).
Dr. Zainal recently accepted a 2-year postdoctoral research associate position at Harvard Medical School, Boston, starting in summer 2022. She received her Ph.D. from Pennsylvania State University, University Park, and completed a predoctoral clinical fellowship at the HMS-affiliated Massachusetts General Hospital – Cognitive Behavioral Scientist Track. Her research interests focus on how executive functioning, social cognition, and cognitive-behavioral strategies link to the etiology, maintenance, and treatment of anxiety and depressive disorders. Dr. Newman is a professor of psychology and psychiatry, and the director of the Center for the Treatment of Anxiety and Depression, at Pennsylvania State University. She has conducted basic and applied research on anxiety disorders and depression and has published over 200 papers on these topics.
Sources
Baddeley A. Working memory and emotion: Ruminations on a theory of depression. Rev Gen Psychol. 2013;17(1):20-7. doi: 10.1037/a0030029.
Baumeister RF et al. “Self-regulation and the executive function: The self as controlling agent,” in Social Psychology: Handbook of Basic Principles, 2nd ed. (pp. 516-39). The Guilford Press: New York, 2007.
Cui L et al. Prevalence of Alzheimer’s disease and Parkinson’s disease in China: An updated systematical analysis. Front Aging Neurosci. 2020 Dec 21;12:603854. doi: 10.3389/fnagi.2020.603854.
Fancourt D and Steptoe A. The longitudinal relationship between changes in wellbeing and inflammatory markers: Are associations independent of depression? Brain Behav Immun. 2020 Jan;83:146-52. doi: 10.1016/j.bbi.2019.10.004.
Grant N et al. The relationship between life satisfaction and health behavior: A cross-cultural analysis of young adults. Int J Behav Med. 2009;16(3):259-68. doi: 10.1007/s12529-009-9032-x.
Hofmann W et al. Executive functions and self-regulation. Trends Cogn Sci. 2012 Mar;16(3):174-80. doi: 10.1016/j.tics.2012.01.006.
Jha AP et al. Bolstering cognitive resilience via train-the-trainer delivery of mindfulness training in applied high-demand settings. Mindfulness. 2019;11(3):683-97. doi: 10.1007/s12671-019-01284-7.
Klopack ET and Wickrama K. Modeling latent change score analysis and extensions in Mplus: A practical guide for researchers. Struct Equ Modeling. 2020;27(1):97-110. doi: 10.1080/10705511.2018.1562929.
Petkus AJ et al. Temporal dynamics of cognitive performance and anxiety across older adulthood. Psychol Aging. 2017 May;32(3):278-92. doi: 10.1037/pag0000164.
Ratigan A et al. Sex differences in the association of physical function and cognitive function with life satisfaction in older age: The Rancho Bernardo Study. Maturitas. 2016 Jul;89:29-35. doi: 10.1016/j.maturitas.2016.04.007.
Sok S et al. Effects of cognitive/exercise dual-task program on the cognitive function, health status, depression, and life satisfaction of the elderly living in the community. Int J Environ Res Public Health. 2021 Jul 24;18(15):7848. doi: 10.3390/ijerph18157848.
Swanson HL and Fung W. Working memory components and problem-solving accuracy: Are there multiple pathways? J Educ Psychol. 2016;108(8):1153-77. doi: 10.1037/edu0000116.
Toh WX and Yang H. Executive function moderates the effect of reappraisal on life satisfaction: A latent variable analysis. Emotion. 2020;22(3):554-71. doi: 10.1037/emo0000907.
Toh WX et al. Executive function and subjective wellbeing in middle and late adulthood. J Gerontol B Psychol Sci Soc Sci. 2020 Jun 2;75(6):e69-e77. doi: 10.1093/geronb/gbz006.
van Dijk DM et al. Cognitive functioning, sleep quality, and work performance in non-clinical burnout: The role of working memory. PLoS One. 2020 Apr 23;15(4):e0231906. doi: 10.1371/journal.pone.0231906.
Wright AGC and Woods WC. Personalized models of psychopathology. Annu Rev Clin Psychol. 2020 May 7;16:49-74. doi: 10.1146/annurev-clinpsy-102419-125032.
Zainal NH and Newman MG. (2021a). Depression and worry symptoms predict future executive functioning impairment via inflammation. Psychol Med. 2021 Mar 3;1-11. doi: 10.1017/S0033291721000398.
Zainal NH and Newman MG. (2021b). Mindfulness enhances cognitive functioning: A meta-analysis of 111 randomized controlled trials. PsyArXiv Preprints. 2021 May 11. doi: 10.31234/osf.io/vzxw7.
Zainal NH and Newman MG. (2022a). Inflammation mediates depression and generalized anxiety symptoms predicting executive function impairment after 18 years. J Affect Disord. 2022 Jan 1;296:465-75. doi: 10.1016/j.jad.2021.08.077.
Zainal NH and Newman MG. (2022b). Life satisfaction prevents decline in working memory, spatial cognition, and processing speed: Latent change score analyses across 23 years. Eur Psychiatry. 2022 Apr 19;65(1):1-55. doi: 10.1192/j.eurpsy.2022.19.
Ultrasound helps predict gout flares over the next year
Adding ultrasound (US) to the clinical exam helps predict the likelihood of future gout flares, results of a prospective, observational study conducted in Italy suggest.
“Baseline US findings indicative of MSU [monosodium urate] burden and US-detected inflammation are independent predictors of gout flares over 12 months,” lead author Edoardo Cipolletta, MD, of the rheumatology unit, department of clinical and molecular sciences at Marche Polytechnic University in Ancona, Italy, and colleagues wrote in Rheumatology.
“We demonstrated that US findings provided an additional value over clinical data in estimating the risk of flares. Moreover, we reported an association between US findings at a joint and the occurrence of gout flares at the same joint,” they added.
Predicting risk of flares and reducing their occurrence are two main challenges in managing gout, the authors wrote. US can be used to scan multiple joints and is widely used in Europe as a low-cost, radiation-free imaging tool that’s easily integrated into clinical practice.
To investigate whether US can predict gout flares, the researchers enrolled 81 consecutive adult patients with gout in the study between April 2019 and March 2021 at one academic rheumatology treatment site in Italy and followed them for 12 months. The authors compared cases (who developed at least one flare within 12 months of the baseline visit) with controls (who self-reported no gout flares over that period).
Patients diagnosed with other inflammatory arthritis and those with coexisting calcium pyrophosphate deposition disease were excluded from the study.
The 71 participants who completed the study were, on average, in their early 60s, and in both groups, all but one were male. At the baseline visit, all had been on stable urate-lowering therapy for at least 6 months and had not had any gout flares in the preceding 4 weeks. The mean gout duration was 7 years in the case group and 8 years in controls.
At baseline, all participants underwent physical examination and US of elbows, wrists, second metacarpophalangeal joints, knees, ankles, and first metatarsophalangeal joints by a member of the research team who was blinded to the clinical and laboratory data.
Clinical assessments were scheduled at baseline and at 6-month intervals, and all participants were evaluated by a second researcher who was blinded to US findings.
During follow-up visits, participants were asked to report any gout flare, defined as meeting at least three of four criteria: a patient-defined flare, a pain-at-rest score higher than 3 on a 0-10 scale, at least one swollen joint, and at least one warm joint. Patients not reaching their target serum urate goal received an escalated urate-lowering therapy dosage and anti-inflammatory prophylaxis.
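The "at least three of four criteria" flare definition is a simple threshold rule. A minimal sketch of that rule, with an illustrative encoding (the function name and argument names are hypothetical, not from the study):

```python
def meets_flare_definition(patient_defined, pain_at_rest, swollen_joints, warm_joints):
    """Return True when at least 3 of the study's 4 flare criteria are met.

    Illustrative encoding (not the study's code):
    - patient_defined: bool, the patient considers this a flare
    - pain_at_rest: 0-10 numeric rating; criterion met when > 3
    - swollen_joints: count of swollen joints; criterion met when >= 1
    - warm_joints: count of warm joints; criterion met when >= 1
    """
    criteria = [
        bool(patient_defined),
        pain_at_rest > 3,
        swollen_joints >= 1,
        warm_joints >= 1,
    ]
    return sum(criteria) >= 3
```

For example, a patient-reported flare with pain 5/10 and one swollen but no warm joint satisfies three criteria and counts as a flare; a non-patient-reported episode with pain 2/10, one swollen and one warm joint satisfies only two and does not.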
The US indicators of MSU deposits – aggregates, double contour sign, and tophi – were recorded as present or absent. The power Doppler signal was scored from 0 through 4, and summated scores for each US finding were calculated.
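The summated scores described above can be sketched as a per-patient sum over scanned joints of the binary MSU indicators and the 0-4 power Doppler grades. The record layout and field names below are assumptions for illustration, not the study's data format:

```python
def summate_scores(joint_findings):
    """Sum US findings across scanned joints for one patient.

    joint_findings: list of dicts, one per joint, with binary MSU indicators
    (aggregates, double_contour, tophi: 0 = absent, 1 = present) and a
    power_doppler grade scored 0 through 4. Field names are hypothetical.
    Returns (total MSU score, total power Doppler score).
    """
    total_msu = sum(
        f["aggregates"] + f["double_contour"] + f["tophi"] for f in joint_findings
    )
    total_doppler = sum(f["power_doppler"] for f in joint_findings)
    return total_msu, total_doppler
```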
Over 12 months, the researchers found:
- Thirty (42.3%) patients had at least one flare, with a median of 2.0 flares. Patients with flares had a higher median US total MSU score (5.0 vs. 2.0; P = .01) and power Doppler signal (3.0 vs. 0; P < .01) than controls.
- In multivariate analysis, baseline US scores indicating MSU deposits and US-detected inflammation were significantly linked with the occurrence of flares. The adjusted odds ratio for total MSU score was 1.75 (95% confidence interval, 1.26-2.43) and for power Doppler score was 1.63 (95% CI, 1.12-2.40).
- Also in multivariate analysis, baseline US scores indicating MSU deposits and US-detected inflammation were significantly linked with the number of flares. The adjusted incidence risk ratio for total MSU score was 1.17 (95% CI, 1.08-1.26) and for power Doppler score was 1.29 (95% CI, 1.19-1.40).
Four rheumatologists welcome findings
Gout remains the most common cause of inflammatory arthritis and a significant reason for hospital visits, noted Narender Annapureddy, MD, associate professor of medicine at Vanderbilt University Medical Center in Nashville, Tenn.
“The study adds to the growing utility of musculoskeletal ultrasound in rheumatology practices to treat various diseases,” he said. “Data that could provide risk prediction for gout flares would be associated with significant benefits in terms of reducing ED visits, hospital admission, and lost work productivity.”
One study limitation, Dr. Annapureddy mentioned, was the single experienced US reader, “which may limit generalizability of results at this time, at least in the United States.”
Yeohan Song, MD, an instructor at Ohio State University Wexner Medical Center, Columbus, integrates US into his practice.
“In gout management, musculoskeletal ultrasound is a useful adjunct to the clinical exam and laboratory markers, particularly [in patients] with recurrent flares despite guideline-directed target serum urate levels,” he said.
Sara K. Tedeschi, MD, MPH, assistant professor of medicine at Harvard Medical School, Boston, pointed out that the US protocol in the study involved imaging knees, ankles, first metatarsophalangeal joints, elbows, wrists, and second metacarpophalangeal joints, and took around 30 minutes to complete.
“That would not be practical in the United States due to time constraints in most rheumatology clinics,” she said.
“The authors report that a ‘reduced scanning protocol’ of the bilateral knees, ankles, and first metatarsophalangeal joints demonstrated similar predictive ability as the full protocol,” she added, “although scanning six joints still might not be feasible during a typical return patient clinic visit in the United States.”
Philip Chu, MD, clinical associate at Duke University, Durham, N.C., uses diagnostic US to help differentiate borderline gout cases from other arthropathies.
“A baseline scan, a follow-up scan before deciding to stop prophylaxis, or a follow-up scan in the setting of recurrent gout flares despite reaching goal serum uric acid, may be cost-effective time points to perform diagnostic US,” he advised.
“Unfortunately,” he added, “reimbursement for diagnostic US has been decreasing over the years, which makes it challenging to increase diagnostic US to the [frequency of its use] in Europe.”
Asked how the fact that most gout care in the United States is provided by primary care doctors affects management, Dr. Chu said: “Depending on which guidelines one follows for treating gout – from the American College of Rheumatology or the American College of Physicians – one may be more or less likely to start urate-lowering therapy after the first gout flare.”
“Understanding MSU burden in each patient, or even seeing active inflammation at these sites by increased Doppler signal, may change the threshold for physicians to initiate therapy,” he added.
The study received no funding. Three study authors reported financial involvements with pharmaceutical companies. Dr. Cipolletta, Dr. Annapureddy, Dr. Song, Dr. Tedeschi, and Dr. Chu reported no conflicts of interest.
A version of this article first appeared on Medscape.com.
FROM RHEUMATOLOGY
Black Americans’ high gout rate stems from social causes
Gout is more common in Black Americans than in White Americans, and the disparity in prevalence is attributable to social determinants of health, according to a recently published article in JAMA Network Open.
“There has been evidence from recent cohort studies in the U.S. that was suggesting that the prevalence and incidence [of gout] was growing among non-White populations,” said Natalie McCormick, PhD, the study’s lead author and postdoctoral research fellow in medicine in the division of rheumatology, allergy, and immunology at Massachusetts General Hospital and Harvard Medical School, both in Boston. “We wanted to do this at the general population level to see how generalizable [that evidence] is.”
Alvin Wells, MD, PhD, director of the department of rheumatology at Advocate Aurora Medical Group, Franklin, Wisc., noted the findings highlight inequities in care for patients with gout that could be improved with greater emphasis on educating patients about their condition.
“I think that what this shows is that in the U.S. ... there still are some disparities in treating gout,” said Dr. Wells, who was not involved with the study. “And that we have ways to mitigate that, with not only aggressive therapy, but also with other tools like counseling patients. At the end of the day, people all want to be educated about the disease.”
Greater prevalence disappears with adjustment for socioclinical factors
The cross-sectional analysis involved data from U.S. adult participants in the National Health and Nutrition Examination Survey (NHANES) from 2007 to 2016 who self-reported Black or White race.
Investigators considered factors such as excess body mass index (BMI), chronic kidney disease (defined as estimated glomerular filtration rate less than 60 mL/min per 1.73 m²), poverty, poor-quality diet, lower educational level, alcohol consumption, and diuretic use in their analysis.
Dr. McCormick and coinvestigators included a total of 18,693 participants, consisting of 3,304 Black women, 6,195 White women, 3,085 Black men, and 6,109 White men.
They determined that the age-standardized prevalence of gout was 3.5% (95% confidence interval, 2.7%-4.3%) in Black women and 2.0% (95% CI, 1.5% - 2.5%) in White women (age-adjusted odds ratio, 1.81; 95% CI, 1.29-2.53). They calculated that the prevalence was 7.0% (95% CI, 6.2%-7.9%) in Black men and 5.4% (95% CI, 4.7%-6.2%) in White men (age-adjusted OR, 1.26; 95% CI, 1.02-1.55). They found similar differences in the prevalence of hyperuricemia between Black and White Americans.
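As a rough sanity check on those figures, a crude (unadjusted) odds ratio can be computed directly from the two prevalences; it comes out close to, but not identical with, the study's age-adjusted values, since age standardization shifts the estimate. A minimal sketch:

```python
def odds_ratio(p1, p2):
    """Crude odds ratio comparing two prevalences (proportions in 0-1)."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Women: 3.5% (Black) vs. 2.0% (White)
print(round(odds_ratio(0.035, 0.020), 2))  # crude OR ~1.78; study's age-adjusted OR was 1.81

# Men: 7.0% (Black) vs. 5.4% (White)
print(round(odds_ratio(0.070, 0.054), 2))  # crude OR ~1.32; study's age-adjusted OR was 1.26
```

The small gaps between the crude and published values reflect the age adjustment, which cannot be reproduced from the summary percentages alone.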
The increased prevalence of gout in Black Americans, compared with White Americans, does not arise from genetics, according to McCormick. “Our conclusion was that it was due to social determinants of health,” she said. “When we adjusted for all socioclinical risk factors, the racial differences in gout and hyperuricemia prevalence disappeared. Importantly, stepwise regression analysis showed the two biggest drivers of the racial difference in gout prevalence among women were poverty itself, and excess BMI, which can be influenced by poverty.”
Dr. McCormick pointed out that in contrast to the current data, there was no racial difference in the prevalence of gout approximately 2 decades earlier, looking at data from the 1988-1994 NHANES III.
Given the findings, which included the fact that significantly more Black women and men were currently taking diuretics, compared with their White counterparts, Dr. McCormick pointed out clinicians should give more thought to medical therapies prescribed for conditions like high blood pressure to patients with gout or at risk for gout.
“One thing we found was that diuretic use was a driver” of gout, Dr. McCormick said. A prescriber “may want to consider different therapies that present a lower risk of gout if someone has hypertension. There could be greater consideration for prescribing alternatives to diuretics.”
More patient education and rheumatology referrals needed
An impediment to providing that education to patients with gout is unconscious bias on the part of the primary care provider, Dr. Wells said.
“It is about what your perspectives are and what you bring to the table,” he explained. “If you saw [a patient] who looked like someone in your family, that person will be treated differently [than someone who does not look like a family member]. That is where the whole concept [of unconscious bias] comes in.”
Primary care providers need to adopt a holistic approach to gout management that involves counseling about good nutrition, smoking cessation, regular exercise, and limiting alcohol consumption, in addition to medication adherence. Primary care providers may have a bias in treating their Black patients, failing to devote sufficient time and attention to assist them in getting their disease under control, he said.
“Gout should be just like any other chronic disease,” Dr. Wells said. “You need to have a target in mind, and you and your patient need to work together to get to that target. When [patients] end up in rheumatology offices, it is almost too late. I think the take-home message here is that in 2022 ... for any patient who has gout, that patient probably needs to be seen by a rheumatologist because, indeed, with aggressive therapy, preventive therapy, [and] education, and if they are on the right medications, they won’t end up with these crippling joints that we see all the time.”
Dr. McCormick and Dr. Wells disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Gout prevalence is more common in Black Americans than White Americans, and the disparity in prevalence is attributable to social determinants of health, according to a recently published article in JAMA Network Open.
“There has been evidence from recent cohort studies in the U.S. that was suggesting that the prevalence and incidence [of gout] was growing among non-White populations,” said Natalie McCormick, PhD, the study’s lead author and postdoctoral research fellow in medicine in the division of rheumatology, allergy, and immunology at Massachusetts General Hospital and Harvard Medical School, both in Boston. “We wanted to do this at the general population level to see how generalizable [that evidence] is.”
Alvin Wells, MD, PhD, director of the department of rheumatology at Advocate Aurora Medical Group, Franklin, Wisc., noted the findings highlight inequities in care for patients with gout that could be improved with greater emphasis on educating patients about their condition.
“I think that what this shows is that in the U.S. ... there still are some disparities in treating gout,” said Dr. Wells, who was not involved with the study. “And that we have ways to mitigate that, with not only aggressive therapy, but also with other tools like counseling patients. At the end of the day, people all want to be educated about the disease.”
Greater prevalence disappears with adjustment for socioclinical factors
The cross-sectional analysis involved data from U.S. adult participants in the National Health and Nutrition Examination Survey (NHANES) from 2007 to 2016 who self-reported Black or White race.
Investigators considered factors such as excess body mass index (BMI), chronic kidney disease (defined as estimated glomerular filtration rate less than 60 mL/min per 1.73 m2), poverty, poor-quality diet, lower educational level, alcohol consumption, and diuretic use in their analysis.
Dr. McCormick and coinvestigators included a total of 18,693 participants, consisting of 3,304 Black women, 6,195 White women, 3,085 Black men, and 6,109 White men.
They determined that the age-standardized prevalence of gout was 3.5% (95% confidence interval, 2.7%-4.3%) in Black women and 2.0% (95% CI, 1.5% - 2.5%) in White women (age-adjusted odds ratio, 1.81; 95% CI, 1.29-2.53). They calculated that the prevalence was 7.0% (95% CI, 6.2%-7.9%) in Black men and 5.4% (95% CI, 4.7%-6.2%) in White men (age-adjusted OR, 1.26; 95% CI, 1.02-1.55). They found similar differences in the prevalence of hyperuricemia between Black and White Americans.
The increased prevalence of gout in Black Americans, compared with White Americans, does not arise from genetics, according to McCormick. “Our conclusion was that it was due to social determinants of health,” she said. “When we adjusted for all socioclinical risk factors, the racial differences in gout and hyperuricemia prevalence disappeared. Importantly, stepwise regression analysis showed the two biggest drivers of the racial difference in gout prevalence among women were poverty itself, and excess BMI, which can be influenced by poverty.”
Dr. McCormick pointed out that in contrast to the current data, there was no racial difference in the prevalence of gout approximately 2 decades earlier, looking at data from the 1988-1994 NHANES III.
Gout is more common in Black Americans than in White Americans, and the disparity in prevalence is attributable to social determinants of health, according to a recently published article in JAMA Network Open.
“There has been evidence from recent cohort studies in the U.S. that was suggesting that the prevalence and incidence [of gout] was growing among non-White populations,” said Natalie McCormick, PhD, the study’s lead author and postdoctoral research fellow in medicine in the division of rheumatology, allergy, and immunology at Massachusetts General Hospital and Harvard Medical School, both in Boston. “We wanted to do this at the general population level to see how generalizable [that evidence] is.”
Alvin Wells, MD, PhD, director of the department of rheumatology at Advocate Aurora Medical Group, Franklin, Wisc., noted the findings highlight inequities in care for patients with gout that could be improved with greater emphasis on educating patients about their condition.
“I think that what this shows is that in the U.S. ... there still are some disparities in treating gout,” said Dr. Wells, who was not involved with the study. “And that we have ways to mitigate that, with not only aggressive therapy, but also with other tools like counseling patients. At the end of the day, people all want to be educated about the disease.”
Greater prevalence disappears with adjustment for socioclinical factors
The cross-sectional analysis involved data from U.S. adult participants in the National Health and Nutrition Examination Survey (NHANES) from 2007 to 2016 who self-reported Black or White race.
Investigators considered factors such as excess body mass index (BMI), chronic kidney disease (defined as estimated glomerular filtration rate less than 60 mL/min per 1.73 m2), poverty, poor-quality diet, lower educational level, alcohol consumption, and diuretic use in their analysis.
Dr. McCormick and coinvestigators included a total of 18,693 participants, consisting of 3,304 Black women, 6,195 White women, 3,085 Black men, and 6,109 White men.
They determined that the age-standardized prevalence of gout was 3.5% (95% confidence interval, 2.7%-4.3%) in Black women and 2.0% (95% CI, 1.5%-2.5%) in White women (age-adjusted odds ratio, 1.81; 95% CI, 1.29-2.53). They calculated that the prevalence was 7.0% (95% CI, 6.2%-7.9%) in Black men and 5.4% (95% CI, 4.7%-6.2%) in White men (age-adjusted OR, 1.26; 95% CI, 1.02-1.55). They found similar differences in the prevalence of hyperuricemia between Black and White Americans.
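For readers unfamiliar with the statistics, an odds ratio of this kind can be computed from a simple 2x2 table of cases and non-cases. The sketch below shows the standard unadjusted calculation with a Wald confidence interval; the case counts are hypothetical illustrations, not the NHANES data, and the study's reported figures are age-adjusted, so they will not match exactly.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and 95% Wald confidence interval from a
    2x2 table: a/b = cases/non-cases in group 1, c/d in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only: suppose 116 of 3,304
# Black women and 124 of 6,195 White women had gout.
or_, lo, hi = odds_ratio_ci(116, 3304 - 116, 124, 6195 - 124)
print(f"OR = {or_:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```

Age adjustment, as used in the study, would additionally require stratifying or modeling by age group rather than pooling all participants into one table.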
The increased prevalence of gout in Black Americans, compared with White Americans, does not arise from genetics, according to McCormick. “Our conclusion was that it was due to social determinants of health,” she said. “When we adjusted for all socioclinical risk factors, the racial differences in gout and hyperuricemia prevalence disappeared. Importantly, stepwise regression analysis showed the two biggest drivers of the racial difference in gout prevalence among women were poverty itself, and excess BMI, which can be influenced by poverty.”
Dr. McCormick pointed out that in contrast to the current data, there was no racial difference in the prevalence of gout approximately 2 decades earlier, looking at data from the 1988-1994 NHANES III.
Given the findings, which included the fact that significantly more Black women and men were currently taking diuretics, compared with their White counterparts, Dr. McCormick pointed out that clinicians should give more thought to the medical therapies they prescribe for conditions such as high blood pressure in patients with gout or at risk for gout.
“One thing we found was that diuretic use was a driver” of gout, Dr. McCormick said. A prescriber “may want to consider different therapies that present a lower risk of gout if someone has hypertension. There could be greater consideration for prescribing alternatives to diuretics.”
More patient education and rheumatology referrals needed
An impediment to providing that education to patients with gout is unconscious bias on the part of the primary care provider, Dr. Wells said.
“It is about what your perspectives are and what you bring to the table,” he explained. “If you saw [a patient] who looked like someone in your family, that person will be treated differently [than someone who does not look like a family member]. That is where the whole concept [of unconscious bias] comes in.”
Primary care providers need to adopt a holistic approach to gout management that involves counseling about good nutrition, smoking cessation, regular exercise, and limiting alcohol consumption, in addition to medication adherence, Dr. Wells said. Some providers may have a bias in treating their Black patients, failing to devote sufficient time and attention to helping them get their disease under control.
“Gout should be just like any other chronic disease,” Dr. Wells said. “You need to have a target in mind, and you and your patient need to work together to get to that target. When [patients] end up in rheumatology offices, it is almost too late. I think the take-home message here is that in 2022 ... for any patient who has gout, that patient probably needs to be seen by a rheumatologist because, indeed, with aggressive therapy, preventive therapy, [and] education, and if they are on the right medications, they won’t end up with these crippling joints that we see all the time.”
Dr. McCormick and Dr. Wells disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA NETWORK OPEN
Toenail trauma
The patient’s initial injury was probably a subungual hematoma, which can take 12 to 18 months to resolve (the time it takes for a new toenail to grow). However, the precipitating trauma likely created an opportunity for fungal elements to invade the nail plate, resulting in the current complaint of superficial onychomycosis.
Onychomycosis is a frequently seen condition with increasing prevalence in older patients. It has several clinical presentations: Superficial onychomycosis manifests with chalky white changes on the surface of the nail. Distal subungual onychomycosis develops at the distal aspect of the nail with thickening and subungual debris. Proximal subungual onychomycosis occurs in the proximal aspect of the nail.
Although often asymptomatic, onychomycosis can cause thickening of the nails and development of subsequent deformity or pincer nails (which painfully “pinch” the underlying skin). It is especially concerning in patients with diabetes or peripheral neuropathy, in whom the abnormal thickness and shape of the nails can lead to microtrauma at the proximal and lateral attachments of the nail. These patients have an increased risk of secondary infection, possible complications, and even, for some, amputation.
If the patient is asymptomatic and does not have diabetes, neuropathy, or other risk factors, treatment is not required. For those who would benefit from treatment, the current generation of oral antifungal medications makes it usually safe and inexpensive.
Some recommend confirmatory testing before treatment initiation,1 but the low adverse effect profile of terbinafine and its current cost below $10/month2 make empiric treatment safe and cost-effective in most cases.3 If needed, and with access to microscopy, a potassium hydroxide (KOH) prep can be performed on scrapings from the affected portions of the nail. If that is not available, scrapings or clippings can be sent to the lab for KOH and periodic acid-Schiff staining.
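The cost-effectiveness argument can be made concrete with a back-of-the-envelope comparison of the two strategies. All inputs below are illustrative assumptions (the test charge, pretest probability, and sensitivity are made up for the sketch), not figures from the cited study; only the roughly $10/month terbinafine price comes from the text above.

```python
# Expected cost per patient of empiric treatment vs. a test-first
# strategy for suspected onychomycosis. Illustrative inputs only.

def expected_cost(p_disease, drug_cost, test_cost=0.0, test_sensitivity=1.0):
    """With a test, everyone pays the test charge and only test-positive
    patients are treated; without a test, everyone is treated."""
    treated_fraction = p_disease * test_sensitivity if test_cost else 1.0
    return test_cost + treated_fraction * drug_cost

DRUG = 30.0   # ~ $10/month x 3-month toenail course (cited above)
KOH = 150.0   # hypothetical charge for confirmatory testing

empiric = expected_cost(p_disease=0.75, drug_cost=DRUG)
test_first = expected_cost(p_disease=0.75, drug_cost=DRUG,
                           test_cost=KOH, test_sensitivity=0.80)
print(f"empiric: ${empiric:.2f}, test-first: ${test_first:.2f}")
```

Under these assumptions the test-first strategy costs more per patient, which mirrors the cited study's conclusion; a full analysis would also weigh the harms of treating patients who do not actually have a fungal infection.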
The US Food and Drug Administration previously recommended follow-up liver enzyme tests if terbinafine is used for more than 6 weeks. (Fingernails require only 6 weeks of treatment, but toenails grow more slowly and require 12 weeks of treatment.) However, research has demonstrated that hepatotoxicity risk is extremely low and transaminase elevations are rare.4 In the rare cases in which liver dysfunction has occurred, patients developed symptoms of jaundice, malaise, dark urine, or pruritus.4
This patient was counseled regarding the fungal nature of onychomycosis and the general safety of a 90-day course of oral terbinafine 250 mg/d—provided he did not have underlying liver or kidney disease or leukopenia. He reported that he had not had any blood work performed in the past year but was due for his annual wellness evaluation, at which he would discuss his overall health with his primary care provider, obtain baseline blood testing, and determine whether to proceed with treatment. He was advised that if, after starting treatment, he developed any symptoms of jaundice, dark urine, or other difficulties, he should report them to his care team.
Photo courtesy of Daniel Stulberg, MD. Text courtesy of Daniel Stulberg, MD, FAAFP, Department of Family and Community Medicine, University of New Mexico School of Medicine, Albuquerque.
- Frazier WT, Santiago-Delgado ZM, Stupka KC 2nd. Onychomycosis: rapid evidence review. Am Fam Physician. 2021;104:359-367.
- Terbinafine. GoodRx. Accessed August 9, 2022. https://www.goodrx.com/terbinafine
- Mikailov A, Cohen J, Joyce C, et al. Cost-effectiveness of confirmatory testing before treatment of onychomycosis. JAMA Dermatol. 2016;152:276-281. doi: 10.1001/jamadermatol.2015.4190
- Sun CW, Hsu S. Terbinafine: safety profile and monitoring in treatment of dermatophyte infections. Dermatol Ther. 2019;32:e13111. doi: 10.1111/dth.13111