One hour of walking per week may boost longevity for octogenarians

Changed: Fri, 08/26/2022 - 16:04

Adults aged 85 years and older who logged an hour or more of walking each week had a 40% reduced risk of all-cause mortality compared with less active peers, according to data from more than 7,000 individuals.

“Aging is accompanied by reduced physical activity and increased sedentary behavior, and reduced physical activity is associated with decreased life expectancy,” Moo-Nyun Jin, MD, of Inje University Sanggye Paik Hospital, Seoul, South Korea, said in an interview.

Reduced physical activity was especially likely in the elderly during the COVID-19 pandemic, he added.

“Promoting walking may be a simple way to help older adults avoid inactivity and encourage an active lifestyle for all-cause and cardiovascular mortality risk reduction,” Dr. Jin said.

Although walking is generally an easy form of exercise for the older adult population, the specific benefit of walking on reducing mortality has not been well studied, according to Dr. Jin and colleagues.

For adults of any age, current guidelines recommend at least 150 minutes per week of moderate activity or 75 minutes per week of vigorous activity, but the amount of physical activity tends to decline with age, and activity recommendations are more difficult to meet, the authors wrote in a press release accompanying their study.

In the study, to be presented at the European Society of Cardiology Congress on Aug. 28 (Abstract 85643), the researchers reviewed data from 7,047 adults aged 85 years and older who participated in the Korean National Health Screening Program. The average age of the study population was 87 years, and 68% were women. Participants completed questionnaires about the amount of time spent in leisure time activities each week, including walking at a slow pace, moderate activity (such as cycling or brisk walking), and vigorous activity (such as running).

Those who walked at a slow pace for at least 1 hour per week had a 40% reduced risk of all-cause mortality and a 39% reduced risk of cardiovascular mortality, compared with inactive participants.

The proportions of participants who reported walking, moderate-intensity activity, and vigorous-intensity physical activity were 42.5%, 14.7%, and 11.0%, respectively. Roughly one-third (33%) of those who reported slow walking each week also reported moderate or vigorous physical activity.



However, walking for at least 1 hour per week significantly reduced the risk for all-cause mortality and cardiovascular mortality among individuals who reported walking only, without any other moderate or vigorous physical activity (hazard ratios, 0.50 and 0.46, respectively).
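For readers unfamiliar with hazard ratios, the reported values can be read as approximate percent reductions in risk relative to the inactive group. A minimal sketch of that conversion, using the hazard ratios reported in the study (the function name is illustrative, not from the study):

```python
def percent_risk_reduction(hazard_ratio: float) -> float:
    """Convert a hazard ratio below 1.0 into an approximate percent risk reduction."""
    return (1.0 - hazard_ratio) * 100.0

# Hazard ratios reported for walking-only participants:
all_cause_hr = 0.50
cardiovascular_hr = 0.46

print(f"All-cause mortality: {percent_risk_reduction(all_cause_hr):.0f}% reduction")        # prints 50% reduction
print(f"Cardiovascular mortality: {percent_risk_reduction(cardiovascular_hr):.0f}% reduction")  # prints 54% reduction
```

This is only a rough reading aid; a hazard ratio compares instantaneous event rates over follow-up, not cumulative risk.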

“Walking was linked with a lower likelihood of dying in older adults, regardless of whether or not they did any moderate to vigorous intensity physical activity,” Dr. Jin told this news organization. “Our study indicates that walking even just 1 hour every week is advantageous to those aged 85 years and older compared to being inactive.”

The hour of walking need not be done in long bouts; about 10 minutes each day will do, Dr. Jin added.

The participants were divided into five groups based on reported amount of weekly walking. More than half (57.5%) reported no slow walking, 8.5% walked less than 1 hour per week, 12.0% walked 1-2 hours, 8.7% walked 2-3 hours, and 13.3% walked more than 3 hours.

Although the study was limited by its reliance on self-reported activity, the results were strengthened by the large sample size and support the value of easy walking, compared with inactivity, for adults aged 85 years and older.

“Walking may present an opportunity for promoting physical activity among the elderly population, offering a simple way to avoid inactivity and increase physical activity,” said Dr. Jin. However, more research is needed to evaluate the association between mortality and walking by objective measurement of walking levels, using a device such as a smart watch, he noted.

Results are preliminary

“This is an observational study, not an experiment, so it means causality cannot be presumed,” said Maria Fiatarone Singh, MD, a geriatrician with a focus on exercise physiology at the University of Sydney, in an interview. “In other words, it is possible that diseases resulting in mortality prevented people from walking rather than the other way around,” she noted. The only published experimental study on exercise and mortality in older adults was conducted by Dr. Fiatarone Singh and colleagues in Norway. In that study, published in the British Medical Journal in 2020, high-intensity training programs were associated with reduced all-cause mortality compared with inactive controls and individuals who engaged in moderate-intensity exercise.

The current study “would have needed to control for many factors related to mortality, such as cardiovascular disease, hypertension, diabetes, malnutrition, and dementia to see what residual benefit might be related to walking,” Dr. Fiatarone Singh said.

“Although walking seems easy and safe, in fact people who are frail, sarcopenic, osteoporotic, or have fallen are recommended to do resistance and balance training rather than walking, and add walking later when they are able to do it safely,” she emphasized.

The study received no outside funding. The researchers had no financial conflicts to disclose. Dr. Fiatarone Singh had no financial conflicts to disclose.

FROM ESC CONGRESS 2022


Cholesterol levels lowering in U.S., but disparities emerge

Changed: Mon, 08/29/2022 - 08:55

Cholesterol levels in American adults improved over the previous decade, but a large cross-sectional analysis of more than 30,000 U.S. adults found notable disparities in cholesterol control: no significant improvement among Asian adults, lower lipid control rates among Black and other Hispanic adults compared with White adults, and no appreciable improvement among people taking statins.

“We found that total cholesterol improved significantly among U.S. adults from 2008 to 2018,” senior study author Rishi Wadhera, MD, of Beth Israel Deaconess Medical Center in Boston, said in an interview. “When we looked at rates of lipid control among adults treated with statins, we found no significant improvements from 2008 through 2018.”

He noted the patterns for lipid control were consistent for women and men, adding, “In contrast to all other racial and ethnic groups, Mexican American and Black adults did experience significant improvements in cholesterol control. Despite this progress, rates of cholesterol control still remained significantly lower in Black adults compared to White adults.”

The study analyzed lipid concentrations from 33,040 adults ages 20 and older from the National Health and Nutrition Examination Surveys (NHANES), using 2007-2008 as the baseline and 2017-2018 as the endpoint. With lipid control defined as total cholesterol of 200 mg/dL or less, the analysis showed that total cholesterol improved in the overall population from 197 to 189 mg/dL in that time (95% confidence interval, –12.2 to –4.9 mg/dL; P < .001).
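The control definition above can be made concrete with a tiny sketch that applies the study's threshold (the function name is illustrative; the 200 mg/dL cutoff is from the article):

```python
CONTROL_THRESHOLD_MG_DL = 200.0  # lipid control: total cholesterol of 200 mg/dL or less

def is_lipid_controlled(total_cholesterol_mg_dl: float) -> bool:
    """Return True when a total cholesterol reading meets the study's control definition."""
    return total_cholesterol_mg_dl <= CONTROL_THRESHOLD_MG_DL

# Population averages reported in the analysis:
print(is_lipid_controlled(197.0))  # 2007-2008 average -> True
print(is_lipid_controlled(189.0))  # 2017-2018 average -> True
```

Note that both population averages already sit under the threshold; the disparities discussed below concern individual-level control rates, not these means.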

The study analyzed lipid trends in several demographic categories. Age-adjusted total cholesterol for women improved significantly, from 199 to 192 mg/dL (95% confidence interval [CI], –11.6 to –3.6 mg/dL; P < .001), but improved slightly more for men, from 195 to 185 mg/dL (95% CI, –14 to –5.1 mg/dL; P < .001).

Overall, age-adjusted total cholesterol improved significantly for Blacks (–7.8 mg/dL), Mexican Americans (–11.3 mg/dL), other Hispanic adults (–8 mg/dL), and Whites (–8.8 mg/dL; P < .001 for all), but not for Asian adults, measured from 2011-2012 to 2017-2018: –0.2 mg/dL (95% CI, –6.5 to 6.2 mg/dL; P = .9).

The study found that LDL cholesterol, on an age-adjusted basis, improved significantly overall, from 116 mg/dL in 2007-2008 to 111 mg/dL in 2017-2018 (95% CI, –8.3 to –1.4 mg/dL; P = .001). However, unlike total cholesterol, this improvement didn’t carry over to most ethnic groups. Mexican American adults (–8 mg/dL; P = .01) and Whites (–5.9 mg/dL; P = .001) showed significant improvements, but Asian, Black or other Hispanic adults didn’t.

The study also evaluated lipid control in people taking statins and found that, overall, it didn’t change significantly: from 78.5% in 2007-2008 to 79.5% in 2017-2018 (P = .27). Mexican American adults were the only ethnic group that showed significant improvement in lipid control, going from 73% in 2007-2008 to 86.5% in 2017-2018 (P = .008).

Disparities in lipid control

Women had notably lower lipid control rates than men, with an odds ratio of 0.52 in 2007-2010 (P < .001), with similar patterns found in 2011-2014 (OR, 0.48) and 2015-2018 (OR, 0.54; P < .001 for both).

Lipid control worsened over time for Black and other Hispanic adults compared to Whites. In 2007-2010, lipid control rates among the studied ethnic groups were similar, a trend that carried over to the 2011-2014 study interval and included Asian adults. However, in 2015-2018, Blacks had lower rates of lipid control compared to Whites (OR, 0.66; 95% CI, 0.47-0.94; P = .03), as did other Hispanic adults (OR, 0.59; 95% CI, 0.37-0.95; P = .04).
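A quick way to see why the 2015-2018 odds ratios are flagged as significant: a 95% confidence interval for an odds ratio that excludes 1.0 indicates a statistically significant difference between groups. A minimal sketch, using the intervals reported above (the helper name is illustrative):

```python
def ci_excludes_one(lower: float, upper: float) -> bool:
    """Return True when a confidence interval excludes the null odds ratio of 1.0."""
    return not (lower <= 1.0 <= upper)

# 95% CIs reported for 2015-2018 lipid control, each group vs. White adults:
print(ci_excludes_one(0.47, 0.94))  # Black adults -> True (significant)
print(ci_excludes_one(0.37, 0.95))  # other Hispanic adults -> True (significant)
```

An interval such as 0.80-1.20, by contrast, would straddle 1.0 and indicate no significant difference.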

These disparities between sexes and ethnic groups warrant further investigation, Dr. Wadhera said. “We were surprised that women had significantly lower rates of cholesterol control than men,” he said. “We need to better understand whether gaps in care, such as barriers to access, less frequent lab monitoring of cholesterol, or less intensive prescribing of important treatments, contribute to these differences.”

He called the lower lipid control rates in Black and Hispanic adults “concerning, especially because rates of heart attacks and strokes remain high in these groups. ... Efforts to identify gaps in care and increase and intensify medical therapy are needed, as treatment rates in these populations are low.”

While the study collected data before the COVID-19 pandemic, Dr. Wadhera acknowledged that the management of cardiovascular risk factors may have worsened because of it. “Monitoring cholesterol levels and control rates in the U.S. population as we emerge from the pandemic will be critically important,” he said.

In an accompanying editorial, Hermes Florez, MD, PhD, of the Medical University of South Carolina in Charleston, and colleagues called for adequately powered studies to further investigate the disparities in the Asian and Hispanic populations. “Worse rates of cholesterol control observed in women and in minority populations deserve special attention,” they wrote.

They noted that future studies should consider the impact of guidelines and recommendations that emerged since the study started, namely from the American College of Cardiology/American Heart Association 2013 guidelines, Healthy People 2030, and the U.S. Preventive Services Task Force (JAMA. 2022 Aug 23. doi: 10.1001/jama.2022.13044).

“More important, future work must focus on how to effectively eliminate those disparities and better control modifiable risk factors to enhance outcomes for all individuals regardless of race and ethnicity,” Dr. Florez and colleagues wrote.

The study received funding from the National Heart, Lung, and Blood Institute. Dr. Wadhera disclosed relationships with CVS Health and Abbott. Dr. Florez and colleagues have no disclosures.

Publications
Topics
Sections

Cholesterol levels in American adults have improved over the previous decade, but a large cross-sectional analysis of more than 30,000 U.S. adults has found notable disparities in cholesterol control, particularly among Asian adults, lower lipid control rates among Black and other Hispanic adults compared to Whites, and no appreciable improvements for people taking statins.

“We found that total cholesterol improved significantly among U.S. adults from 2008 to 2018,” senior study author Rishi Wadhera, MD, of Beth Israel Deaconess Medical Center in Boston, said in an interview. “When we looked at rates of lipid control among adults treated with statins, we found no significant improvements from 2008 through 2018.”

Dr. Rishi Wadhera

He noted the patterns for lipid control were consistent for women and men, adding, “In contrast to all other racial and ethnic groups, Mexican American and Black adults did experience significant improvements in cholesterol control. Despite this progress, rates of cholesterol control still remained significantly lower in Black adults compared to White adults.”

The study analyzed lipid concentrations from 33,040 adults ages 20 and older from the National Health and Nutrition Examination Surveys (NHANES), using 2007-2008 as the baseline and 2017-2018 as the endpoint. With lipid control defined as total cholesterol of 200 mg/dL or less, the analysis showed that total cholesterol improved in the overall population from 197 to 189 mg/dL in that time (95% confidence interval, –12.2 to –4.9 mg/dL; P < .001).

The study analyzed lipid trends in several demographic categories. Age-adjusted total cholesterol for women improved significantly, from 199 to 192 mg/dL (95% confidence interval [CI], –11.6 to –3.6 mg/dL; P < .001), but improved slightly more for men, from 195 to 185 mg/dL (95% CI, –14 to –5.1 mg/dL; P < .001).

Overall, age-adjusted total cholesterol improved significantly for Blacks (–7.8 mg/dL), Mexican Americans (–11.3 mg/dL), other Hispanic adults (–8 mg/dL) and Whites (–8.8 mg/dL; P < .001 for all), but not for Asian adults, measured from 2011-2012 to 2017-2018: –.2 mg/dL (95% CI, –6.5 to 6.2 mg/dL; P = .9).

The study found that LDL cholesterol, on an age-adjusted basis, improved significantly overall, from 116 mg/dL in 2007-2008 to 111 mg/dL in 2017-2018 (95% CI, –8.3 to –1.4 mg/dL; P = .001). However, unlike total cholesterol, this improvement didn’t carry over to most ethnic groups. Mexican American adults (–8 mg/dL; P = .01) and Whites (–5.9 mg/dL; P = .001) showed significant improvements, but Asian, Black or other Hispanic adults didn’t.

The study also evaluated lipid control in people taking statins and found that, overall, it didn’t change significantly: from 78.5% in 2007-2008 to 79.5% in 2017-2018 (P = .27). Mexican American adults were the only ethnic group that showed significant improvement in lipid control, going from 73% in 2007-2008 to 86.5% in 2017-2018 (P = .008).

  

Disparities in lipid control

Women had notably lower lipid control rates than men, with an odds ratio of .52 in 2007-2010 (P < .001), with similar patterns found in 2011-2014 (OR, 0.48) and 2015-2018 (OR, 0.54, P < .001 for both).

Lipid control worsened over time for Black and other Hispanic adults compared to Whites. In 2007-2010, lipid control rates among the studied ethnic groups were similar, a trend that carried over to the 2011-2014 study interval and included Asian adults. However, in 2015-2018, Blacks had lower rates of lipid control compared to Whites (OR, 0.66; 95% CI, .47-.94; P = .03), as did other Hispanic adults (OR, 0.59; 95% CI, .37-.95; P = .04).

These disparities between sexes and ethnic groups warrant further investigation, Dr. Wadhera said. “We were surprised that women had significantly lower rates of cholesterol control than men,” he said. “We need to better understand whether gaps in care, such barriers in access, less frequent lab monitoring of cholesterol, or less intensive prescribing of important treatments, contribute to these differences.”

He called the lower lipid control rates in Black and Hispanic adults “concerning, especially because rates of heart attacks and strokes remain high in these groups. ... Efforts to identify gaps in care and increase and intensify medical therapy are needed, as treatment rates in these populations are low.”

While the study collected data before the COVID-19 pandemic, Dr. Wadhera acknowledged that the management of cardiovascular risk factors may have worsened because of it. “Monitoring cholesterol levels and control rates in the U.S. population as we emerge from the pandemic will be critically important,” he said.

In an accompanying editorial, Hermes Florez, MD, PhD, of the Medical University of South Carolina in Charleston, and colleagues called for adequately powered studies to further investigate the disparities in the Asian and Hispanic populations. “Worse rates of cholesterol control observed in women and in minority populations deserve special attention,” they wrote.

They noted that future studies should consider the impact of guidelines and recommendations that emerged since the study started, namely from the American College of Cardiology/American Heart Association 2013 guidelines, Healthy People 2030, and the U.S. Preventive Services Task Force (JAMA. 2022 Aug 23. doi: 10.1001/jama.2022.13044).

“More important, future work must focus on how to effectively eliminate those disparities and better control modifiable risk factors to enhance outcomes for all individuals regardless of race and ethnicity,” Dr. Florez and colleagues wrote.

The study received funding from the National Heart, Lung, and Blood Institute. Dr. Wadhera disclosed relationships with CVS Health and Abbott. Dr. Florez and colleagues have no disclosures.

Cholesterol levels in American adults have improved over the previous decade, but a large cross-sectional analysis of more than 30,000 U.S. adults has found notable disparities in cholesterol control, particularly among Asian adults, lower lipid control rates among Black and other Hispanic adults compared to Whites, and no appreciable improvements for people taking statins.

“We found that total cholesterol improved significantly among U.S. adults from 2008 to 2018,” senior study author Rishi Wadhera, MD, of Beth Israel Deaconess Medical Center in Boston, said in an interview. “When we looked at rates of lipid control among adults treated with statins, we found no significant improvements from 2008 through 2018.”

Dr. Rishi Wadhera

He noted the patterns for lipid control were consistent for women and men, adding, “In contrast to all other racial and ethnic groups, Mexican American and Black adults did experience significant improvements in cholesterol control. Despite this progress, rates of cholesterol control still remained significantly lower in Black adults compared to White adults.”

The study analyzed lipid concentrations from 33,040 adults ages 20 and older from the National Health and Nutrition Examination Surveys (NHANES), using 2007-2008 as the baseline and 2017-2018 as the endpoint. With lipid control defined as total cholesterol of 200 mg/dL or less, the analysis showed that total cholesterol improved in the overall population from 197 to 189 mg/dL in that time (95% confidence interval, –12.2 to –4.9 mg/dL; P < .001).

The study analyzed lipid trends in several demographic categories. Age-adjusted total cholesterol for women improved significantly, from 199 to 192 mg/dL (95% confidence interval [CI], –11.6 to –3.6 mg/dL; P < .001), but improved slightly more for men, from 195 to 185 mg/dL (95% CI, –14 to –5.1 mg/dL; P < .001).

Overall, age-adjusted total cholesterol improved significantly for Blacks (–7.8 mg/dL), Mexican Americans (–11.3 mg/dL), other Hispanic adults (–8 mg/dL) and Whites (–8.8 mg/dL; P < .001 for all), but not for Asian adults, measured from 2011-2012 to 2017-2018: –.2 mg/dL (95% CI, –6.5 to 6.2 mg/dL; P = .9).

The study found that LDL cholesterol, on an age-adjusted basis, improved significantly overall, from 116 mg/dL in 2007-2008 to 111 mg/dL in 2017-2018 (95% CI, –8.3 to –1.4 mg/dL; P = .001). However, unlike total cholesterol, this improvement didn’t carry over to most ethnic groups. Mexican American adults (–8 mg/dL; P = .01) and Whites (–5.9 mg/dL; P = .001) showed significant improvements, but Asian, Black or other Hispanic adults didn’t.

The study also evaluated lipid control in people taking statins and found that, overall, it didn’t change significantly: from 78.5% in 2007-2008 to 79.5% in 2017-2018 (P = .27). Mexican American adults were the only ethnic group that showed significant improvement in lipid control, going from 73% in 2007-2008 to 86.5% in 2017-2018 (P = .008).

  

Disparities in lipid control

Women had notably lower lipid control rates than men, with an odds ratio of .52 in 2007-2010 (P < .001), with similar patterns found in 2011-2014 (OR, 0.48) and 2015-2018 (OR, 0.54, P < .001 for both).

Lipid control worsened over time for Black and other Hispanic adults compared to Whites. In 2007-2010, lipid control rates among the studied ethnic groups were similar, a trend that carried over to the 2011-2014 study interval and included Asian adults. However, in 2015-2018, Blacks had lower rates of lipid control compared to Whites (OR, 0.66; 95% CI, .47-.94; P = .03), as did other Hispanic adults (OR, 0.59; 95% CI, .37-.95; P = .04).

These disparities between sexes and ethnic groups warrant further investigation, Dr. Wadhera said. “We were surprised that women had significantly lower rates of cholesterol control than men,” he said. “We need to better understand whether gaps in care, such as barriers in access, less frequent lab monitoring of cholesterol, or less intensive prescribing of important treatments, contribute to these differences.”

He called the lower lipid control rates in Black and Hispanic adults “concerning, especially because rates of heart attacks and strokes remain high in these groups. ... Efforts to identify gaps in care and increase and intensify medical therapy are needed, as treatment rates in these populations are low.”

While the study collected data before the COVID-19 pandemic, Dr. Wadhera acknowledged that the management of cardiovascular risk factors may have worsened because of it. “Monitoring cholesterol levels and control rates in the U.S. population as we emerge from the pandemic will be critically important,” he said.

In an accompanying editorial, Hermes Florez, MD, PhD, of the Medical University of South Carolina in Charleston, and colleagues called for adequately powered studies to further investigate the disparities in the Asian and Hispanic populations. “Worse rates of cholesterol control observed in women and in minority populations deserve special attention,” they wrote.

They noted that future studies should consider the impact of guidelines and recommendations that emerged since the study started, namely from the American College of Cardiology/American Heart Association 2013 guidelines, Healthy People 2030, and the U.S. Preventive Services Task Force (JAMA. 2022 Aug 23. doi: 10.1001/jama.2022.13044).

“More important, future work must focus on how to effectively eliminate those disparities and better control modifiable risk factors to enhance outcomes for all individuals regardless of race and ethnicity,” Dr. Florez and colleagues wrote.

The study received funding from the National Heart, Lung, and Blood Institute. Dr. Wadhera disclosed relationships with CVS Health and Abbott. Dr. Florez and colleagues have no disclosures.

Article Source

FROM JAMA


Blood biomarkers predict TBI disability and mortality

Article Type
Changed
Thu, 12/15/2022 - 15:37

Two biomarkers present in blood measured on the day of traumatic brain injury (TBI) can accurately predict a patient’s risk for death or severe disability 6 months later, new research suggests.

In new data from the TRACK-TBI study group, high levels of glial fibrillary acidic protein (GFAP) and ubiquitin carboxy-terminal hydrolase L1 (UCH-L1) proteins found in glial cells and neurons, respectively, correlated with death and severe injury. Investigators note that measuring these biomarkers may give a more accurate assessment of a patient’s prognosis following TBI.

This study is the “first report of the accuracy of a blood test that can be obtained rapidly on the day of injury to predict neurological recovery at 6 months after injury,” lead author Frederick Korley, MD, PhD, associate professor of emergency medicine at the University of Michigan, Ann Arbor, said in a news release.

The findings were published online in the Lancet Neurology.
 

Added value

The researchers measured GFAP and UCH-L1 in blood samples taken from 1,696 patients with TBI on the day of their injury, and they assessed patient recovery 6 months later.

The markers were measured using the i-STAT TBI Plasma test (Abbott Labs). The test was approved in 2021 by the U.S. Food and Drug Administration to determine which patients with mild TBI should undergo computed tomography scans.

About two-thirds of the study population were men, and the average age was 39 years. All patients were evaluated at Level I trauma centers for injuries caused primarily by traffic accidents or falls.

Six months following injury, 7% of the patients had died and 14% had an unfavorable outcome, ranging from vegetative state to severe disability requiring daily support. In addition, 67% had incomplete recovery, ranging from moderate disabilities requiring assistance outside of the home to minor disabling neurological or psychological deficits.

Day-of-injury GFAP and UCH-L1 levels had a high probability of predicting death (87% for GFAP and 89% for UCH-L1) and severe disability (86% for both GFAP and UCH-L1) at 6 months, the investigators reported.

The biomarkers were less accurate in predicting incomplete recovery (62% for GFAP and 61% for UCH-L1).

The researchers also assessed the added value of combining the blood biomarkers to current TBI prognostic models that take into account variables such as age, motor score, pupil reactivity, and CT characteristics.

In patients with a Glasgow Coma Scale (GCS) score of 3-12, adding GFAP and UCH-L1 alone or combined to each of the three International Mission for Prognosis and Analysis of Clinical Trials in TBI (IMPACT) models significantly increased their accuracy for predicting death (range, 90%-94%) and unfavorable outcome (range, 83%-89%).

In patients with milder TBI (GCS score, 13-15), adding GFAP and UCH-L1 to the UPFRONT prognostic model modestly increased accuracy for predicting incomplete recovery (69%).
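The percentages reported for these prognostic models reflect discrimination, that is, how well a model's output separates patients who had the outcome from those who did not; this is commonly summarized as the area under the ROC curve (AUC), though the exact metric is not specified here. A minimal rank-based sketch of that idea, using hypothetical toy values rather than study data:

```python
def auc(scores, labels):
    """Rank-based AUC: the probability that a randomly chosen positive
    case scores higher than a randomly chosen negative case
    (ties count as 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy day-of-injury risk scores (hypothetical, not study data);
# label 1 = unfavorable outcome at 6 months.
scores = [0.2, 0.9, 0.4, 0.8, 0.75, 0.7, 0.6, 0.1]
labels = [0,   1,   0,   1,   0,    1,   0,   0]
print(round(auc(scores, labels), 3))  # 0.933
```

On this scale, 0.5 is chance-level discrimination and 1.0 is perfect, so values in the 90%-94% range for predicting death represent strong separation.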
 

‘Important’ findings

Commenting on the study, Cyrus A. Raji, MD, PhD, assistant professor of radiology and neurology, Washington University, St. Louis, said this “critical” study shows that these biomarkers can “predict key outcomes,” including mortality and severe disability. “Thus, in conjunction with clinical evaluations and related data such as neuroimaging, these tests may warrant translation to broader clinical practice, particularly in acute settings,” said Dr. Raji, who was not involved in the research.

Also weighing in, Heidi Fusco, MD, assistant director of the traumatic brain injury program at NYU Langone Rusk Rehabilitation, said the findings are “important.”

“Prognosis after brain injury often is based on the initial presentation, ongoing clinical exams, and neuroimaging; and the addition of biomarkers would contribute to creating a more objective prognostic model,” Dr. Fusco said.

She noted “it’s unclear” whether clinical hospital laboratories would be able to accommodate this type of laboratory drawing.

“It is imperative that clinicians still use the patient history [and] clinical and radiological exam when making clinical decisions for a patient and not just lab values. It would be best to incorporate the GFAP and UCH-L1 into a preexisting prognostic model,” Dr. Fusco said.

The study was funded by the U.S. National Institutes of Health, the National Institute of Neurologic Disorders and Stroke, the U.S. Department of Defense, One Mind, and U.S. Army Medical Research and Development Command. Dr. Korley reported having previously consulted for Abbott Laboratories and has received research funding from Abbott Laboratories, which makes the assays used in the study. Dr. Raji is a consultant for Brainreader ApS and Neurevolution. Dr. Fusco has reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews - 30(10)

Article Source

FROM THE LANCET NEUROLOGY


Second opinions on melanocytic lesions swayed when first opinion is known

Article Type
Changed
Thu, 09/01/2022 - 12:54

Many dermatopathologists offering a second opinion about melanocytic skin lesions prefer to have access to the first diagnostic report, but a controlled trial demonstrates that such access has a powerful influence on their reading, diminishing the value and accuracy of an independent analysis.

In a novel effort to determine whether previous interpretations sway second opinions, 149 dermatopathologists were asked to read melanocytic skin biopsy specimens without access to the initial pathology report. A year or more later they read them again but now with access to the initial reading.

Dr. Joann G. Elmore

The study showed that participants, independent of variables such as years of experience or the frequency with which they offered second opinions, were more likely to upgrade or downgrade the severity of the specimens in line with the initial report, even if their original reading was correct.

If the goal of a second dermatopathologist's opinion is an independent diagnostic assessment, the message from this study is that reviewers “should be blinded to first opinions,” according to the authors of this study, led by Joann G. Elmore, MD, professor of medicine, University of California, Los Angeles. The study was published online in JAMA Dermatology.
 

Two-phase study has 1-year washout

The study was conducted in two phases. In phase 1, a nationally representative sample of volunteer dermatopathologists performed 878 interpretations. In phase 2, conducted after a washout period of 12 months or more, the dermatopathologists read a random subset of the same cases evaluated in phase 1, but this time with access to the prior pathology reports.

Ultimately, “the dermatologists provided more than 5,000 interpretations of study cases, which was a big contribution of time,” Dr. Elmore said in an interview. Grateful for their critical contribution, she speculated that they were driven by the importance of the question being asked.

When categorized by the Melanocytic Pathology Assessment Tool (MPAT), which rates specimens from benign (class 1) to pT1b invasive melanoma (class 4), the influence of the prior report went in both directions: participants were more likely to upgrade or downgrade in line with the grading in the original dermatopathology report.

As a result, the risk of a less severe interpretation on the second relative to the first reading was 38% greater if the initial dermatopathology report had a lower grade (relative risk, 1.38; 95% confidence interval [CI], 1.19-1.59). The risk of upgrading the second report if the initial pathology report had a higher grade was increased by more than 50% (RR, 1.52; 95% CI, 1.34-1.73).
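Relative risks like these compare the probability of changing a diagnosis under one condition with the probability under another. As an illustration of how an RR and its log-based 95% CI are computed from a 2x2 table, a minimal Python sketch with hypothetical counts (not the study's data):

```python
import math

def relative_risk_ci(a, n1, c, n2, z=1.96):
    """Relative risk and 95% CI (log method).
    a of n1 in the exposed group have the event; c of n2 unexposed do."""
    p1, p2 = a / n1, c / n2
    rr = p1 / p2
    se = math.sqrt((1 - p1) / a + (1 - p2) / c)  # standard error of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts (not the study's data): 60 of 200 second reads
# were downgraded when the first report carried a lower grade, vs.
# 45 of 200 reads without that exposure.
rr, lo, hi = relative_risk_ci(60, 200, 45, 200)
print(f"RR = {rr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")  # RR = 1.33 (95% CI, 0.96-1.86)
```

In the study's actual estimates, the CIs exclude 1, which is what makes the 38% and 52% increases statistically meaningful.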

The greater likelihood of upgrading than downgrading is “understandable,” Dr. Elmore said. “I think this is consistent with the concern about missing something,” she explained.

According to Dr. Elmore, one of the greatest concerns regarding the bias imposed by the original pathology report is that the switch of opinions often went from one that was accurate to one that was inaccurate.

If the phase 1 diagnosis was accurate but upgraded in the phase 2 diagnosis, the risk of inaccuracy was almost doubled (RR, 1.96; 95% CI, 1.31-2.93). If the phase 1 report was inaccurate, the relative risk of changing the phase 2 diagnosis was still high but lower than if it was accurate (RR, 1.46; 95% CI, 1.27-1.68).

“That is, even when the phase 1 diagnoses agreed with the consensus reference diagnosis, they were swayed away from the correct diagnosis in phase 2 [when the initial pathology report characterized the specimen as higher grade],” Dr. Elmore reported.

Conversely, the risk of downgrading was about the same whether the phase 1 evaluation was accurate (RR, 1.37; 95% CI, 1.14-1.64) or inaccurate (RR 1.32; 95% CI, 1.07-1.64).

Downward and upward shifts in severity from an accurate diagnosis are concerning because of the likelihood they will lead to overtreatment or undertreatment. The problem, according to data from this study, is that dermatopathologists offering a second opinion cannot judge their own susceptibility to being swayed by the original report.
 

 

 

Pathologists might be unaware of bias

At baseline, the participants were asked whether they thought they were influenced by the first interpretation when providing a second opinion. Although 69% acknowledged that they might be “somewhat influenced,” 31% maintained that they do not take initial reports into consideration. When the two groups were compared, the risk of downgrading was nearly identical. The risk of upgrading was lower in those claiming to disregard initial reports (RR, 1.29) relative to those who said they were “somewhat influenced” by a previous diagnosis (RR, 1.64), but the difference was not significant.

The actual risk of bias incurred by prior pathology reports might be greater than that captured in this study for several reasons, according to the investigators. They pointed out that all participants were experienced and board-certified and might therefore be expected to be more confident in their interpretations than an unselected group of dermatopathologists. In addition, participants might have been more careful in their interpretations knowing they were participating in a study.

“There are a lot of data to support the value of second opinions [in dermatopathology and other areas], but we need to consider the process of how they are being obtained,” Dr. Elmore said. “There needs to be a greater emphasis on providing an independent analysis.”

More than 60% of the dermatopathologists participating in this study reported that they agreed or strongly agreed with the premise that they prefer to have the original dermatopathology report when they offer a second opinion. Dr. Elmore said that the desire of those offering a second opinion to have as much information in front of them as possible is understandable, but the bias imposed by the original report weakens the value of the second opinion.
 

Blind reading of pathology reports needed

“These data suggest that seeing the original report sways opinions and that includes swaying opinions away from an accurate reading,” Dr. Elmore said. She thinks that for dermatopathologists to render a valuable and independent second opinion, the specimens should be examined “at least initially” without access to the first report.

The results of this study were not surprising to Vishal Anil Patel, MD, director of the Cutaneous Oncology Program, George Washington University Cancer Center, Washington. He made the point that physicians “are human first and foremost and not perfect machines.” As a result, he suggested bias and error are inevitable.

Although strategies to avoid bias are likely to offer some protection against inaccuracy, he said that diagnostic support tools such as artificial intelligence might be the right direction for improving inter- and intra-rater reliability.

Ruifeng Guo, MD, PhD, a consultant in the division of anatomic pathology at the Mayo Clinic, Rochester, Minn., agreed with the basic premise of the study, but he cautioned that restricting access to the initial pathology report might not always be the right approach.

It is true that “dermatopathologists providing a second opinion in diagnosing cutaneous melanoma are mostly unaware of the risk of bias if they read the initial pathology report,” said Dr. Guo, but restricting access comes with risks.

“There are also times critical information may be contained in the initial pathology report that needs to be considered when providing a second opinion consultation,” he noted. Ultimately, the decision to read or not read the initial report should be made “on an individual basis.”

The study was funded by grants from the National Cancer Institute. Dr. Elmore, Dr. Patel, and Dr. Guo reported no relevant financial relationships. 

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

Many dermatopathologists offering a second opinion about melanocytic skin lesions prefer to have access to the first diagnostic report, but a controlled trial demonstrates that this has a powerful influence on perception, diminishing the value and accuracy of an independent analysis.

In a novel effort to determine whether previous interpretations sway second opinions, 149 dermatopathologists were asked to read melanocytic skin biopsy specimens without access to the initial pathology report. A year or more later they read them again but now with access to the initial reading.

Dr. Joann G. Elmore

The study showed that the participants, independent of many variables, such as years of experience or frequency with which they offered second options, were more likely to upgrade or downgrade the severity of the specimens in accordance with the initial report even if their original reading was correct.

If the goal of a second dermatopathologist opinion is to obtain an independent diagnostic opinion, the message from this study is that they “should be blinded to first opinions,” according to the authors of this study, led by Joann G. Elmore, MD, professor of medicine, University of California, Los Angeles. The study was published online in JAMA Dermatology.
 

Two-phase study has 1-year washout

The study was conducted in two phases. In phase 1, a nationally representative sample of volunteer dermatopathologists performed 878 interpretations. In phase 2, conducted after a washout period of 12 months or more, the dermatopathologists read a random subset of the same cases evaluated in phase 1, but this time, unlike the first, they were first exposed to prior pathology reports.

Ultimately, “the dermatologists provided more than 5,000 interpretations of study cases, which was a big contribution of time,” Dr. Elmore said in an interview. Grateful for their critical contribution, she speculated that they were driven by the importance of the question being asked.

When categorized by the Melanocytic Pathology Assessment Tool (MPAT), which rates specimens from benign (class 1) to pT1b invasive melanoma (class 4), the influence of the prior report went in both directions, so that the likelihood of upgrading or downgrading went in accordance with the grading in the original dermatopathology report.

As a result, the risk of a less severe interpretation on the second relative to the first reading was 38% greater if the initial dermatopathology report had a lower grade (relative risk, 1.38; 95% confidence interval [CI], 1.19-1.59). The risk of upgrading the second report if the initial pathology report had a higher grade was increased by more than 50% (RR, 1.52; 95% CI, 1.34-1.73).

The greater likelihood of upgrading than downgrading is “understandable,” Dr. Elmore said. “I think this is consistent with the concern about missing something,” she explained.

According to Dr. Elmore, one of the greatest concerns regarding the bias imposed by the original pathology report is that the switch of opinions often went from one that was accurate to one that was inaccurate.

If the phase 1 diagnosis was accurate but upgraded in the phase 2 diagnosis, the risk of inaccuracy was almost doubled (RR, 1.96; 95% CI, 1.31-2.93). If the phase 1 report was inaccurate, the relative risk of changing the phase 2 diagnosis was still high but lower than if it was accurate (RR, 1.46; 95% CI, 1.27-1.68).

“That is, even when the phase 1 diagnoses agreed with the consensus reference diagnosis, they were swayed away from the correct diagnosis in phase 2 [when the initial pathology report characterized the specimen as higher grade],” Dr. Elmore reported.

Conversely, the risk of downgrading was about the same whether the phase 1 evaluation was accurate (RR, 1.37; 95% CI, 1.14-1.64) or inaccurate (RR 1.32; 95% CI, 1.07-1.64).

Many dermatopathologists offering a second opinion about melanocytic skin lesions prefer to have access to the first diagnostic report, but a controlled trial demonstrates that access to that report has a powerful influence on their interpretation, diminishing the value and accuracy of an independent analysis.

In a novel effort to determine whether previous interpretations sway second opinions, 149 dermatopathologists were asked to read melanocytic skin biopsy specimens without access to the initial pathology report. A year or more later they read them again but now with access to the initial reading.

The study showed that the participants, independent of many variables, such as years of experience or frequency with which they offered second opinions, were more likely to upgrade or downgrade the severity of the specimens in accordance with the initial report, even if their original reading was correct.

If the goal of a second dermatopathologist opinion is to obtain an independent diagnostic opinion, the message from this study is that they “should be blinded to first opinions,” according to the authors of this study, led by Joann G. Elmore, MD, professor of medicine, University of California, Los Angeles. The study was published online in JAMA Dermatology.
 

Two-phase study has 1-year washout

The study was conducted in two phases. In phase 1, a nationally representative sample of volunteer dermatopathologists performed 878 interpretations. In phase 2, conducted after a washout period of 12 months or more, the dermatopathologists read a random subset of the same cases evaluated in phase 1, but this time they were given access to the prior pathology reports.

Ultimately, “the dermatologists provided more than 5,000 interpretations of study cases, which was a big contribution of time,” Dr. Elmore said in an interview. Grateful for their critical contribution, she speculated that they were driven by the importance of the question being asked.

When categorized by the Melanocytic Pathology Assessment Tool (MPAT), which rates specimens from benign (class 1) to pT1b invasive melanoma (class 4), the influence of the prior report went in both directions: participants were more likely to shift their interpretation up or down in accordance with the grading in the original dermatopathology report.

As a result, the risk of a less severe interpretation on the second relative to the first reading was 38% greater if the initial dermatopathology report had a lower grade (relative risk, 1.38; 95% confidence interval [CI], 1.19-1.59). The risk of upgrading the second report if the initial pathology report had a higher grade was increased by more than 50% (RR, 1.52; 95% CI, 1.34-1.73).
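As background for the figures quoted here, a relative risk and its 95% confidence interval are conventionally computed on the log scale from the event counts in each group. A minimal sketch of that standard calculation, using hypothetical counts rather than the study's actual data:

```python
import math

def relative_risk(events_exposed, n_exposed, events_control, n_control):
    """Relative risk with a 95% Wald confidence interval on the log scale.

    Counts are illustrative placeholders; this is the textbook formula,
    not the study authors' analysis code.
    """
    p_exposed = events_exposed / n_exposed
    p_control = events_control / n_control
    rr = p_exposed / p_control
    # Standard error of log(RR) for two independent binomial samples
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_control - 1 / n_control)
    z = 1.96  # approximate 97.5th percentile of the standard normal
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical example: 30/100 events vs. 20/100 events
rr, lower, upper = relative_risk(30, 100, 20, 100)
```

With these made-up counts the function returns an RR of 1.5 with a confidence interval that crosses 1, illustrating why an interval such as 1.34-1.73 (which excludes 1) indicates a statistically significant effect.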

The greater likelihood of upgrading than downgrading is “understandable,” Dr. Elmore said. “I think this is consistent with the concern about missing something,” she explained.

According to Dr. Elmore, one of the greatest concerns regarding the bias imposed by the original pathology report is that the switch of opinions often went from one that was accurate to one that was inaccurate.

If the phase 1 diagnosis was accurate but upgraded in the phase 2 diagnosis, the risk of inaccuracy was almost doubled (RR, 1.96; 95% CI, 1.31-2.93). If the phase 1 report was inaccurate, the relative risk of changing the phase 2 diagnosis was still high but lower than if it was accurate (RR, 1.46; 95% CI, 1.27-1.68).

“That is, even when the phase 1 diagnoses agreed with the consensus reference diagnosis, they were swayed away from the correct diagnosis in phase 2 [when the initial pathology report characterized the specimen as higher grade],” Dr. Elmore reported.

Conversely, the risk of downgrading was about the same whether the phase 1 evaluation was accurate (RR, 1.37; 95% CI, 1.14-1.64) or inaccurate (RR 1.32; 95% CI, 1.07-1.64).

Downward and upward shifts in severity from an accurate diagnosis are concerning because of the likelihood that they will lead to overtreatment or undertreatment. The problem, according to data from this study, is that dermatopathologists offering a second opinion cannot judge their own susceptibility to being swayed by the original report.

Pathologists might be unaware of bias

At baseline, the participants were asked whether they thought they were influenced by the first interpretation when providing a second opinion. Although 69% acknowledged that they might be “somewhat influenced,” 31% maintained that they do not take initial reports into consideration. When the two groups were compared, the risk of downgrading was nearly identical. The risk of upgrading was lower in those claiming to disregard initial reports (RR, 1.29) relative to those who said they were “somewhat influenced” by a previous diagnosis (RR, 1.64), but the difference was not significant.

The actual risk of bias incurred by prior pathology reports might be greater than that captured in this study for several reasons, according to the investigators. They pointed out that all participants were experienced and board-certified and might therefore be expected to be more confident in their interpretations than an unselected group of dermatopathologists. In addition, participants might have been more careful in their interpretations knowing they were participating in a study.

“There are a lot of data to support the value of second opinions [in dermatopathology and other areas], but we need to consider the process of how they are being obtained,” Dr. Elmore said. “There needs to be a greater emphasis on providing an independent analysis.”

More than 60% of the dermatopathologists participating in this study reported that they agreed or strongly agreed with the premise that they prefer to have the original dermatopathology report when they offer a second opinion. Dr. Elmore said that the desire of those offering a second opinion to have as much information in front of them as possible is understandable, but the bias imposed by the original report weakens the value of the second opinion.

Blind reading of pathology reports needed

“These data suggest that seeing the original report sways opinions and that includes swaying opinions away from an accurate reading,” Dr. Elmore said. She thinks that for dermatopathologists to render a valuable and independent second opinion, the specimens should be examined “at least initially” without access to the first report.

The results of this study were not surprising to Vishal Anil Patel, MD, director of the Cutaneous Oncology Program, George Washington University Cancer Center, Washington. He made the point that physicians “are human first and foremost and not perfect machines.” As a result, he suggested bias and error are inevitable.

Although strategies to avoid bias are likely to offer some protection against inaccuracy, he said that diagnostic support tools such as artificial intelligence might be the right direction for improving inter- and intra-rater reliability.

Ruifeng Guo, MD, PhD, a consultant in the division of anatomic pathology at the Mayo Clinic, Rochester, Minn., agreed with the basic premise of the study, but he cautioned that restricting access to the initial pathology report might not always be the right approach.

It is true that “dermatopathologists providing a second opinion in diagnosing cutaneous melanoma are mostly unaware of the risk of bias if they read the initial pathology report,” said Dr. Guo, but restricting access comes with risks.

“There are also times critical information may be contained in the initial pathology report that needs to be considered when providing a second opinion consultation,” he noted. Ultimately, the decision to read or not read the initial report should be decided “on an individual basis.”

The study was funded by grants from the National Cancer Institute. Dr. Elmore, Dr. Patel, and Dr. Guo reported no relevant financial relationships. 

A version of this article first appeared on Medscape.com.


In Memoriam: John Hickner, MD, MSc


We are deeply saddened by the recent death of our friend and colleague, John Hickner. Although we are grieving, we consider ourselves fortunate to have had John in our lives and to be able to share a few of his many accomplishments and attributes. Anyone who knew John knew that he had many gifts. But above all, John was kind, generous, and thoughtful. Val, John’s wife of 48 years, and their family were at the center of John’s world. Everything John did was a reflection of his love for his family.

John was a small-town family physician, and this guided virtually all of his professional endeavors. He was a member of the faculty for the Michigan State University Department of Family Medicine in Escanaba, in Michigan’s Upper Peninsula. While in the Upper Peninsula, he helped establish 2 practice-based research networks: the statewide Michigan Research Network (MiRNet) and the regional Upper Peninsula Research Network (UPRNet). If you ever had the chance to attend the UPRNet meetings, you would have observed the entire practice staff included in planning research activities, sharing, and troubleshooting common practice hiccups. At the end of those meetings, John would conclude by reading a children’s story such as Goodnight Moon or play a song on his guitar and then give a final thoughtful message.

In 1999, John worked with the American Academy of Family Physicians to create the National Research Network, now composed of more than 870 practices and nearly 2400 members. His own interests in respiratory infections, stemming from his experiences with his own children, led to work with the North American Respiratory Infection Study Group and with the Centers for Disease Control and Prevention.

John’s interests in practice-based research paralleled his interests in evidence-based medicine, largely as a way to translate research into daily practice. This focus on evidence guided much of his work as Editor-in-Chief for The Journal of Family Practice, a title he held for a decade. He also worked with state Academies of Family Physicians for more than a decade to create a new conference series centered on short, practical clinical topics and based completely on summaries of recent research. Any listener of the Primary Care Update podcasts could hear his thoughtful questioning of current research and his wise approach to its integration into practice.

John was more than a thoughtful and kind clinician, an outstanding educator, and a gifted researcher; he was a natural leader. John had the capacity to understand the systems in which he worked and was able to skillfully guide teams to improve those systems. He served as the Chair of Family Medicine at the Cleveland Clinic and then at the University of Illinois Chicago (UIC), and mentored many faculty, residents, and students during his time at those institutions.

After retiring from UIC, John and Val moved back to Escanaba. At his retirement dinner, his children (Michael, Laura, Zach, Anna, and Olivia) gifted him a beautiful maple acoustic guitar with which he then serenaded the attendees. John was an avid tennis player and often would tell us he would have to skip meeting us for dinner while away at a conference because he had found a tennis opponent! Most of all, he loved to set out on his 35-foot sailboat on Big Bay de Noc or on Green Bay. We have fond memories of the days spent sailing with John and hope that he has found fair winds and following seas.

Henry C. Barry, MD, MS
Mark Ebell, MD, MS
Kate Rowland, MD, MS, FAAFP

The Journal of Family Practice, 71(7), 278-279.


Pfizer seeks approval for updated COVID booster


Pfizer has sent an application to the Food and Drug Administration for emergency use authorization of its updated COVID-19 booster vaccine for the fall of 2022, the company announced on Aug. 22.

The vaccine, which is adapted for the BA.4 and BA.5 Omicron variants, would be meant for ages 12 and older. If authorized by the FDA, the doses could ship as soon as September.

“Having rapidly scaled up production, we are positioned to immediately begin distribution of the bivalent Omicron BA.4/BA.5 boosters, if authorized, to help protect individuals and families as we prepare for potential fall and winter surges,” Albert Bourla, PhD, Pfizer’s chairman and CEO, said in the statement.

Earlier this year, the FDA ordered vaccine makers such as Pfizer and Moderna to update their shots to target BA.4 and BA.5, which are better at escaping immunity from earlier vaccines and previous infections.

The United States has a contract to buy 105 million of the Pfizer doses and 66 million of the Moderna doses, according to The Associated Press. Moderna is expected to file its FDA application soon as well.

The new shots target both the original spike protein on the coronavirus and the spike mutations carried by BA.4 and BA.5. For now, BA.5 is causing 89% of new infections in the United States, followed by BA.4.6 with 6.3% and BA.4 with 4.3%, according to the latest Centers for Disease Control and Prevention data.

There’s no way to tell if BA.5 will still be the dominant strain this winter or if a new variant will replace it, the AP reported. But public health officials have supported the updated boosters as a way to target the most recent strains and increase immunity again.

On Aug. 15, Great Britain became the first country to authorize another one of Moderna’s updated vaccines, which adds protection against BA.1, or the original Omicron strain that became dominant in the winter of 2021-2022. European regulators are considering this shot, the AP reported, but the United States opted not to use this version since new Omicron variants have become dominant.

To approve the latest Pfizer shot, the FDA will rely on scientific testing of prior updates to the vaccine, rather than the newest boosters, to decide whether to fast-track the updated shots for fall, the AP reported. This approach is similar to the way flu vaccines are updated each year without large studies that take months.

Previously, Pfizer announced results from a study that found the earlier Omicron update significantly boosted antibodies capable of fighting the BA.1 variant and provided some protection against BA.4 and BA.5. The company’s latest FDA application contains that data and animal testing on the newest booster, the AP reported.

Pfizer will start a trial using the BA.4/BA.5 booster in coming weeks to get more data on how well the latest shot works. Moderna has begun a similar study.

The full results from these studies won’t be available before a fall booster campaign, which is why the FDA and public health officials have called for an updated shot to be ready for distribution in September.

“It’s clear that none of these vaccines are going to completely prevent infection,” Rachel Presti, MD, a researcher with the Moderna trial and an infectious diseases specialist at Washington University in St. Louis, told the AP.

But previous studies of variant booster candidates have shown that “you still get a broader immune response giving a variant booster than giving the same booster,” she said.

A version of this article first appeared on WebMD.com.




Young children with leukemia are outliving teens

Article Type
Changed
Fri, 12/16/2022 - 11:25

Two new studies offer insights into leukemia survival rates in the United States. From 2000 to 2014, a drop in mortality among children spurred a rise in 5-year leukemia survival rates among patients aged 0-24. But adolescents and young adults who survive 5 years after diagnosis face an ongoing higher risk of death, recent research revealed, and their long-term survival is lower than that of the general population.

“Outcomes are improving. However, additional efforts, support, and resources are needed to further improve short- and long-term survival for acute leukemia survivors. Targeted efforts focused on populations that face greater disparities in their survival are needed to move the needle faster,” Michael Roth, MD, codirector of the Adolescent and Young Adult Oncology Program at the University of Texas M.D. Anderson Cancer Center, said in an interview.

In one study, released in The Lancet Child & Adolescent Health, an international team of researchers tracked survival outcomes from various types of leukemia in 61 nations. The study focused on the years 2000-2014 and followed patients aged 0-24.

“Age-standardized 5-year net survival in children, adolescents, and young adults for all leukemias combined during 2010-14 varied widely, ranging from 46% in Mexico to more than 85% in Canada, Cyprus, Belgium, Denmark, Finland, and Australia,” the researchers wrote. “Throughout 2000-14, survival from all leukemias combined remained consistently higher for children than adolescents and young adults, and minimal improvement was seen for adolescents and young adults in most countries.”

The U.S. data came from 41 states that cover 86% of the nation’s population, lead author Naomi Ssenyonga, a research fellow at London School of Hygiene & Tropical Medicine, said in an interview.

The 5-year survival rate for acute lymphoid leukemia (ALL) rose from 80% during 2000-2004 to 86% during 2010-2014. Survival in patients with acute myeloid leukemia (AML) remained lower than for other subtypes but also improved, rising from 57% during 2000-2004 to 66% during 2010-2014.

For all leukemias combined, “we noted a steady increase in the U.S. of 6 percentage points in 5-year survival, up from 77% for patients diagnosed during 2000-2004 to 83% for those diagnosed during 2010-2014,” Ms. Ssenyonga said. “The gains were largely driven by the improvements seen among children.”

Why haven’t adolescents and young adults gained as much ground in survival?

“They often have unique clinical needs,” Ms. Ssenyonga said. “Over the past few years, adolescents and young adults with leukemia in some parts of the world, including the U.S., have increasingly been treated under pediatric protocols. This has led to higher survival. However, this approach has not been adopted consistently, and survival for adolescents and young adults with leukemia is still generally lower than survival for children.”

Gwen Nichols, MD, chief medical officer of the Leukemia & Lymphoma Society, agreed that pediatric treatment protocols hold promise as treatments for young adults. However, “because we arbitrarily set an age cutoff for being an adult, many of these patients are treated by an adult [nonpediatric] hematologist/oncologist, and some patients in the 20-39 age group do not receive the more intensive treatment regimens given to children,” she said in an interview.

In another study, published in Cancer Epidemiology, Biomarkers, & Prevention, M.D. Anderson Cancer Center’s Dr. Roth and colleagues tracked 1,938 patients with ALL and 2,350 with AML who were diagnosed at ages 15-39 from 1980 to 2009. All lived at least 5 years after diagnosis. In both groups, about 58% were White, and most of the rest were Hispanic. The median age of diagnosis for ALL was 23 (range: 15-39) and 28 years for AML (range: 15-39).

“For ALL, 10-year survival for those diagnosed in the 1980s, 1990s, and 2000s was 83%, 88%, and 88%, respectively,” the researchers reported. “Ten-year survival for AML was 82%, 90%, and 90% for those diagnosed in the 1980s, 1990s, and 2000s, respectively.”

“Early mortality within 10 years of diagnosis was mostly secondary to leukemia progressing or recurring. We believe that later mortality is secondary to the development of late side effects from their cancer treatment,” Dr. Roth said.

He noted that many adolescents and young adults with ALL or AML receive stem-cell transplants. “This treatment approach is effective. However, it is associated with short- and long-term toxicity that impacts patients’ health for many years after treatment.”

Indeed, up to 80% of acute leukemia survivors have significant health complications after therapy, said the Leukemia & Lymphoma Society’s Dr. Nichols, who wasn’t surprised by the findings. According to the society, “even when treatments are effective, more than 70% of childhood cancer survivors have a chronic health condition and 42% have a severe, disabling or life-threatening condition 30 years after diagnosis.”

“It would be interesting to understand the male predominance better,” she added, noting that the study found that male patients had worse long-term survival than females (survival time ratio: 0.61, 95% confidence interval, 0.45-0.82). “While it is tempting to suggest it is due to difference in cardiac disease, I am not aware of data to support why there is this survival difference.”

What’s next? “In ALL, we now have a number of new modalities to treat high-risk and relapsed disease such as antibodies and CAR-T,” Dr. Nichols said. “We anticipate that 5-year survival can improve utilizing these modalities due to getting more patients into remission, hopefully while reducing chemotherapeutic toxicity.”

Dr. Nichols also highlighted the society’s new genomic-led Pediatric Acute Leukemia (PedAL) Master Clinical Trial, which began enrolling children with acute leukemia in the United States and Canada this year in an effort to transform medicine’s traditional high-intensity chemotherapy approach to their care. The project was launched in collaboration with the National Cancer Institute, Children’s Oncology Group, and the European Pediatric Acute Leukemia Foundation.

As part of the screening process, the biology of each child’s cancer will be identified, and families will be encouraged to enroll them in appropriate targeted therapy trials.

“Until we are able to decrease the toxicity of leukemia regimens, we won’t see a dramatic shift in late effects and thus in morbidity and mortality,” Dr. Nichols said. “The trial is an effort to test newer, less toxic regimens to begin to change that cycle.”

The 5-year survival study was funded by Children with Cancer UK, Institut National du Cancer, La Ligue Contre le Cancer, Centers for Disease Control and Prevention, Swiss Re, Swiss Cancer Research Foundation, Swiss Cancer League, Rossy Family Foundation, National Cancer Institute, and the American Cancer Society. One author reports a grant from Macmillan Cancer Support, consultancy fees from Pfizer, and unsolicited small gifts from Moondance Cancer Initiative for philanthropic work. The other authors report no disclosures.

The long-term survival study was funded by the National Cancer Institute, the Archer Foundation, and LyondellBasell Industries. Dr. Roth reports no disclosures; other authors report various disclosures. Dr. Nichols reports no disclosures.


Are we up the creek without a paddle? What COVID, monkeypox, and nature are trying to tell us

Article Type
Changed
Mon, 08/29/2022 - 08:56

Monkeypox. Polio. COVID-19. A quick glance at the news on any given day seems to indicate that outbreaks, epidemics, and perhaps even pandemics are increasing in frequency.

Granted, these types of events are hardly new; from the plagues of the 6th and 14th centuries to the Spanish flu in the 20th century and SARS-CoV-2 today, they’ve been with us from time immemorial.

What appears to be different, however, is not their frequency, but their intensity, with research reinforcing that we may be facing unique challenges and smaller windows to intervene as we move forward.

Findings from a modeling study, published in 2021 in Proceedings of the National Academy of Sciences, underscore that without effective intervention, the probability of extreme events like COVID-19 will likely increase threefold in the coming decades.

“The fact is, pandemic preparedness is not something that people have valued or thought of as important, or paid much attention to,” Amesh Adalja, MD, senior scholar, Johns Hopkins Center for Health Security, Baltimore, told this news organization.

“It’s all been based on some unusual cluster of cases that were causing severe disease and overwhelming local authorities. So often, like Indiana Jones, somebody got dispatched to deal with an outbreak,” Dr. Adalja said.

In a perfect post-COVID world, government bodies, scientists, clinicians, and others would cross silos to coordinate pandemic prevention, not just preparedness. The public would trust those who carry the title “public health” in their daily responsibilities, and in turn, public health experts would get back to their core responsibility – infectious disease preparedness – the role they were initially assigned following Europe’s Black Death during the 14th century. Instead, the world finds itself at a crossroads, with emerging and reemerging infectious disease outbreaks that on the surface appear to arise haphazardly but in reality are the result of decades of reaction and containment policies aimed at putting out fires, not addressing their cause.

Dr. Adalja noted that only when the threat of biological weapons became a reality in the mid-2000s was there a realization that economies of scale could be exploited by merging interests and efforts to develop health security medical countermeasures. For example, it encouraged governments to more closely integrate agencies like the Biomedical Advanced Research and Development Authority and infectious disease research organizations and individuals.

Still, while significant strides have been made in certain areas, the ongoing COVID-19 pandemic has revealed substantial weaknesses remaining in public and private health systems, as well as major gaps in infectious disease preparedness.

The role of spillover events

No matter whom you ask, scientists, public health and conservation experts, and infectious disease clinicians all point to one of the most important threats to human health. As Walt Kelly’s Pogo famously put it, “We have met the enemy, and he is us.”

“The reason why these outbreaks of novel infectious diseases are increasingly occurring is because of human-driven environmental change, particularly land use, unsafe practices when raising farmed animals, and commercial wildlife markets,” Neil M. Vora, MD, a physician specializing in pandemic prevention at Conservation International and a former Centers for Disease Control and Prevention epidemic intelligence officer, said in an interview.

In fact, more than 60% of emerging infections and diseases are due to these “spillover events” (zoonotic spillover) that occur when pathogens that commonly circulate in wildlife jump over to new, human hosts.

Several examples come to mind.

COVID-19 may have begun as an enzootic virus from two undetermined animals, using the Huanan Seafood Market as a possible intermediate reservoir, according to a July 26 preprint in the journal Science. 

Likewise, while the Ebola virus was originally attributed to deforestation efforts to create palm oil (which allowed fruit bat carriers to transfer the virus to humans), recent research suggests that bats dwelling in the walls of human dwellings and hospitals are responsible for the 2018 outbreak in the Democratic Republic of Congo. 

(Incidentally, just this week, a new Ebola case was confirmed in Eastern Congo, and it has been genetically linked to the previous outbreak, despite that outbreak having been declared over in early July.)

“When we clear forests, we create opportunities for humans to live alongside the forest edge and displace wildlife. There’s evidence that shows when [these] biodiverse areas are cleared, specialist species that evolved to live in the forests first start to disappear, whereas generalist species – rodents and bats – continue to survive and are able to carry pathogens that can be passed on to humans,” Dr. Vora explained.

So far, China’s outbreak of the novel Langya henipavirus is believed to have spread (either directly or indirectly) by rodents and shrews, according to reports from public health authorities like the European Centre for Disease Prevention and Control, which is currently monitoring the situation. 

Yet, an overreliance on surveillance and containment only perpetuates what Dr. Vora says are cycles of panic and neglect.

“We saw it with Ebola in 2015, in 2016 to 2017 with Zika, you see it with tuberculosis, with sexually transmitted infections, and with COVID. You have policymakers working on solutions, and once they think that they’ve fixed the problem, they’re going to move on to the next crisis.”

It’s also a question of equity.

Reports detailing the reemergence of monkeypox in Nigeria in 2017 were largely ignored, despite the fact that the United States assisted in diagnosing an early case in an 11-year-old boy. At the time, it was clear that the virus was spreading by human-to-human transmission versus animal-to-human transmission, something that had not been seen previously. 

“The current model [is] waiting for pathogens to spill over and then [continuing] to spread signals that rich countries are tolerant of these outbreaks so long as they don’t grow into epidemics or pandemics,” Dr. Vora said.

This model is clearly broken; roughly 5 years after Nigeria reported the resurgence of monkeypox, the United States has more than 14,000 confirmed cases, which represents more than a quarter of the total number of cases reported worldwide.

Public health on the brink

It’s difficult to imagine a future without more outbreaks and pandemics, and if experts are to be believed, we are ill-prepared.

“I think that we are in a situation where this is a major threat, and people have become complacent about it,” said Dr. Adalja, who noted that we should be asking ourselves if the “government is actually in a position to be able to respond in a way that we need them to or is [that response] tied up in bureaucracy and inefficiency?”

COVID-19 should have been seen as a wake-up call, and many of those deaths were preventable. “With monkeypox, they’re faltering; it should have been a layup, not a disaster,” he emphasized.

Ellen Eaton, MD, associate professor of infectious diseases at the University of Alabama at Birmingham, also pointed to the reality that by the time COVID-19 reached North America, the United States had already moved away from the model of the public health department as the epicenter of knowledge, education, awareness, and, ironically, public health.

“Thinking about my community, very few people knew the face and name of our local and state health officers,” she told this news organization.  

“There was just this inherent mistrust of these people. If you add in a lot of talking heads, a lot of politicians and messaging from non-experts that countered what was coming out of our public health agencies early, you had this huge disconnect; in the South, it was the perfect storm for vaccine hesitancy.”

At last count, this perfect storm has led to 1.46 million COVID cases and just over 20,000 deaths – many of which were preventable – in Alabama alone. 

“In certain parts of America, we were starting with a broken system with limited resources and few providers,” Dr. Eaton explained.

Dr. Eaton noted that many fields, not just medicine and public health, have finite resources that COVID, and now monkeypox, have stretched to capacity, and she wondered what comes next as autumn and influenza season approach. She also pointed to the tremendous implications of climate change for infectious diseases and community health and wellness.

“There’s a tremendous need to have the ability to survey not just humans but also how the disease burden in our environment that is fluctuating with climate change is going to impact communities in really important ways,” Dr. Eaton said. 
 

Upstream prevention

Dr. Vora said he could not agree more and believes that upstream prevention holds the key. 

“We have to make sure while there’s attention on this issue that the right solutions are implemented,” he said. 

In coming years, postspillover containment strategies – vaccine research and development and strengthened health care surveillance, for example – are likely to prove inadequate on their own.

“We saw it with COVID and we are seeing it again with monkeypox,” Dr. Vora said. “We also have to invest further upstream to prevent spillovers in the first place, for example, by addressing deforestation, commercial wildlife markets and trade, [and] infection control when raising farm animals.”

“The thing is, when you invest in those upstream solutions, you are also mitigating climate change and loss of biodiversity. I’m not saying that we should not invest in postspillover containment efforts; we’re never going to contain every spillover. But we also have to invest in prevention,” he added.

In a piece published in Nature, Dr. Vora and his coauthors acknowledge that several international bodies such as the World Health Organization and G7 have invested in initiatives to facilitate coordinated, global responses to climate change, pandemic preparedness, and response. But they point out that these efforts fail to “explicitly address the negative feedback cycle between environmental degradation, wildlife exploitation, and the emergence of pathogens.”

“Environmental conservation is no longer a left-wing fringe issue, it’s moving into public consciousness, and ... it is public health,” Dr. Vora said. “When we destroy nature, we’re destroying our own ability to survive.”

Dr. Adalja, Dr. Vora, and Dr. Eaton report no relevant financial relationships.

A version of this article first appeared on Medscape.com.





Pink Nodule Behind the Ear

Article Type
Changed
Mon, 01/09/2023 - 09:03

The Diagnosis: Acanthoma Fissuratum

Acanthoma fissuratum is a skin lesion that results from consistent pressure, typically from ill-fitting eyeglass frames.1 The chronic irritation leads to collagen deposition and inflammation that gradually creates the lesion. Many patients never seek care, making incidence figures undeterminable.2 It usually presents as a firm, tender, flesh-colored or pink nodule or plaque with a central indentation from where the frame rests. This indentation splits the lesion in half and classically gives the appearance of a coffee bean.1 The repeated minor trauma at this point of contact also may lead to centralized ulceration, which further blurs the diagnosis to include basal cell carcinoma (BCC).3,4 Although the postauricular groove is the most cited location, lesions also may occur at other contact points of the glasses, such as the lateral aspect of the bridge of the nose and the superior auricular sulcus.5 Acanthoma fissuratum is not limited to the external head. Other etiologies of local trauma and pressure have led to its diagnosis in the upper labioalveolar fold, posterior fourchette of the vulva, penis, and external auditory canal.6-9

The diagnosis of acanthoma fissuratum mainly is clinical; however, due to its similar appearance to BCC and other lesions, a biopsy can be taken to support the diagnosis; a biopsy was not performed in our patient. The main features seen on histopathology include acanthosis, hyperkeratosis, variable parakeratosis, and perivascular nonspecific inflammatory infiltration. The epidermis may reflect the macroscopic frame indentation with central attenuation of the epidermis, which potentially is filled with inflammatory cells or keratin.5

Treatment normally encompasses removing the ill-fitting frames or fixing the fit, which gradually leads to reduction of the lesion.4,5 This occurred in our patient, who changed eyeglasses and saw an 80% resolution of the lesion in 8 months. Such improvement after removal of a trauma-inducing stimulus would not be seen in malignancies (eg, BCC, squamous cell carcinoma [SCC]), keloids, or cylindromas. If the granulation tissue does not regress or recurs, other potential treatments include excision, intralesional corticosteroids, and electrosurgery.5

Basal cell carcinoma is a common nonmelanoma skin cancer that most often presents on the sun-exposed areas of the head and neck, especially the cheeks, nasolabial folds, and forehead. Although the nodular subtype may clinically appear similar to acanthoma fissuratum, it more typically presents as a pearly papule or nodule with a sharp border, small telangiectases, and potential ulceration.10 Squamous cell carcinoma is another common nonmelanoma skin cancer that often arises in sun-exposed areas, which can include the postauricular area. Although the lesion can be associated with chronic wounds and also can grow vertically, SCC typically has a scalier and more hyperkeratotic surface that can ulcerate.1 A cylindroma is a benign sweat gland tumor that most commonly presents on the head and neck (also known as the turban tumor), though it can develop on the ear. It appears as solitary or multiple nodules that often are flesh colored, red, or blue with a shiny surface.1 Cylindromas are not known to be associated with chronic local trauma or irritation,11 such as wearing ill-fitting eyeglasses. Unlike acanthoma fissuratum, the treatment of cylindromas, BCC, and SCC most often involves excision.1 A keloid presents as a flesh-colored, red, or purple exophytic plaque that is composed of dense dermal tissue and progressively forms after local trauma. Although keloids can spontaneously develop, they commonly form on the ears in susceptible individuals after skin excisions including prior keloid removal, piercings, repair of auricular trauma, or infections.1 The patient’s coffee bean–like lesion that coincided with wearing new eyeglasses better fits the diagnosis of acanthoma fissuratum than a keloid. Additionally, keloids typically do not regress without treatment. Keloid treatment consists of intralesional steroid injections, occlusive silicone dressings, compression, cryotherapy, radiation, and excisional surgery.1

References
  1. Sand M, Sand D, Brors D, et al. Cutaneous lesions of the external ear. Head Face Med. 2008;4:2. doi:10.1186/1746-160X-4-2
  2. Orengo I, Robbins K, Marsch A. Pathology of the ear. Semin Plast Surg. 2011;25:279-287. doi:10.1055/s-0031-1288920
  3. Ramroop S. Successful treatment of acanthoma fissuratum with intralesional triamcinolone acetonide. Clin Case Rep. 2020;8:702-703. doi:10.1002/ccr3.2708
  4. Delaney TJ, Stewart TW. Granuloma fissuratum. Br J Dermatol. 1971;84:373-375. doi:10.1111/j.1365-2133.1971.tb14235.x
  5. Deshpande NS, Sen A, Vasudevan B, et al. Acanthoma fissuratum: lest we forget. Indian Dermatol Online J. 2017;8:141-143. doi:10.4103/2229-5178.202267
  6. Sutton RL Jr. A fissured granulomatous lesion of the upper labioalveolar fold. Arch Dermatol Syph. 1932;26:425. doi:10.1001/archderm.1932.01450030423004
  7. Kennedy CM, Dewdney S, Galask RP. Vulvar granuloma fissuratum: a description of fissuring of the posterior fourchette and the repair. Obstet Gynecol. 2005;105:1018-1023. doi:10.1097/01.AOG.0000158863.70819.53
  8. Lee JL, Lee YB, Cho BK, et al. Acanthoma fissuratum on the penis. Int J Dermatol. 2013;52:382-384. doi:10.1111/j.1365-4632.2011.04903.x
  9. Gonzalez SA, Moore AGN. Acanthoma fissuratum of the outer auditory canal from a hearing aid. J Cutan Pathol. 1989;16:304.
  10. Fania L, Didona D, Morese R, et al. Basal cell carcinoma: from pathophysiology to novel therapeutic approaches. Biomedicines. 2020;8:449. doi:10.3390/biomedicines8110449
  11. Chauhan DS, Guruprasad Y. Dermal cylindroma of the scalp. Natl J Maxillofac Surg. 2012;3:59-61. doi:10.4103/0975-5950.102163
Author and Disclosure Information

From the Ohio University Heritage College of Osteopathic Medicine, Dublin. Dr. Trotter also is from the OhioHealth Dermatology Residency Program, Columbus, and is in private practice at Dermatologists of Central States, Canal Winchester, Ohio.

The authors report no conflict of interest.

Correspondence: Ryan Jay, DO, OhioHealth Riverside Methodist Hospital, 3535 Olentangy River Rd, Columbus, OH 43214 ([email protected]).

Issue
Cutis - 110(2)
Page Number
E35-E36

The Diagnosis: Acanthoma Fissuratum

Acanthoma fissuratum is a skin lesion that results from consistent pressure, typically from ill-fitting eyeglass frames.1 The chronic irritation leads to collagen deposition and inflammation that gradually creates the lesion. Many patients never seek care, making incidence figures undeterminable.2 It usually presents as a firm, tender, flesh-colored or pink nodule or plaque with a central indentation from where the frame rests. This indentation splits the lesion in half and classically gives the appearance of a coffee bean.1 The repeated minor trauma at this point of contact also may lead to centralized ulceration, which further blurs the diagnosis to include basal cell carcinoma (BCC).3,4 Although the postauricular groove is the most cited location, lesions also may occur at other contact points of the glasses, such as the lateral aspect of the bridge of the nose and the superior auricular sulcus.5 Acanthoma fissuratum is not limited to the external head. Other etiologies of local trauma and pressure have led to its diagnosis in the upper labioalveolar fold, posterior fourchette of the vulva, penis, and external auditory canal.6-9

The diagnosis of acanthoma fissuratum mainly is clinical; however, due to its similar appearance to BCC and other lesions, a biopsy can be taken to support the diagnosis; a biopsy was not performed in our patient. The main features seen on histopathology include acanthosis, hyperkeratosis, variable parakeratosis, and perivascular nonspecific inflammatory infiltration. The epidermis may reflect the macroscopic frame indentation with central attenuation of the epidermis, which potentially is filled with inflammatory cells or keratin.5

Treatment normally encompasses removing the ill-fitting frames or fixing the fit, which gradually leads to reduction of the lesion.4,5 This occurred in our patient, who changed eyeglasses and saw an 80% resolution of the lesion in 8 months. Such improvement after removal of a trauma-inducing stimulus would not be seen in malignancies (eg, BCC, squamous cell carcinoma [SCC]), keloids, or cylindromas. If the granulation tissue does not regress or recurs, other potential treatments include excision, intralesional corticosteroids, and electrosurgery.5

Basal cell carcinoma is a common nonmelanoma skin cancer that most often presents on the sun-exposed areas of the head and neck, especially the cheeks, nasolabial folds, and forehead. Although the nodular subtype may clinically appear similar to acanthoma fissuratum, it more typically presents as a pearly papule or nodule with a sharp border, small telangiectases, and potential ulceration.10 Squamous cell carcinoma is another common nonmelanoma skin cancer that often arises in sun-exposed areas, which can include the postauricular area. Although the lesion can be associated with chronic wounds and also can grow vertically, SCC typically has a scalier and more hyperkeratotic surface that can ulcerate.1 A cylindroma is a benign sweat gland tumor that most commonly presents on the head and neck (also known as the turban tumor), though it can develop on the ear. It appears as solitary or multiple nodules that often are flesh colored, red, or blue with a shiny surface.1 Cylindromas are not known to be associated with chronic local trauma or irritation,11 such as wearing ill-fitting eyeglasses. Unlike acanthoma fissuratum, the treatment of cylindromas, BCC, and SCC most often involves excision.1 A keloid presents as a flesh-colored, red, or purple exophytic plaque that is composed of dense dermal tissue and progressively forms after local trauma. Although keloids can spontaneously develop, they commonly form on the ears in susceptible individuals after skin excisions including prior keloid removal, piercings, repair of auricular trauma, or infections.1 The patient’s coffee bean–like lesion that coincided with wearing new eyeglasses better fits the diagnosis of acanthoma fissuratum than a keloid. Additionally, keloids typically do not regress without treatment. Keloid treatment consists of intralesional steroid injections, occlusive silicone dressings, compression, cryotherapy, radiation, and excisional surgery.1

References
  1. Sand M, Sand D, Brors D, et al. Cutaneous lesions of the external ear. Head Face Med. 2008;4. doi:10.1186/1746-160X-4-2
  2. Orengo I, Robbins K, Marsch A. Pathology of the ear. Semin Plast Surg. 2011;25:279-287. doi:10.1055/s-0031-1288920
  3. Ramroop S. Successful treatment of acanthoma fissuratum with intralesional triamcinolone acetonide. Clin Case Rep. 2020;8:702-703. doi:10.1002/ccr3.2708
  4. Delaney TJ, Stewart TW. Granuloma fissuratum. Br J Dermatol. 1971;84:373-375. doi:10.1111/j.1365-2133.1971.tb14235.x
  5. Deshpande NS, Sen A, Vasudevan B, et al. Acanthoma fissuratum: lest we forget. Indian Dermatol Online J. 2017;8:141-143. doi:10.4103/2229-5178.202267
  6. Sutton RL Jr. A fissured granulomatous lesion of the upper labioalveolar fold. Arch Dermatol Syph. 1932;26:425. doi:10.1001/archderm.1932.01450030423004
  7. Kennedy CM, Dewdney S, Galask RP. Vulvar granuloma fissuratum: a description of fissuring of the posterior fourchette and the repair. Obstet Gynecol. 2005;105:1018-1023. doi:10.1097/01.AOG.0000158863.70819.53
  8. Lee JL, Lee YB, Cho BK, et al. Acanthoma fissuratum on the penis. Int J Dermatol. 2013;52:382-384. doi:10.1111/j.1365-4632.2011.04903.x
  9. Gonzalez SA, Moore AGN. Acanthoma fissuratum of the outer auditory canal from a hearing aid. J Cutan Pathol. 1989;16:304.
  10. Fania L, Didona D, Morese R, et al. Basal cell carcinoma: from pathophysiology to novel therapeutic approaches. Biomedicines. 2020;8:449. doi:10.3390/biomedicines8110449
  11. Chauhan DS, Guruprasad Y. Dermal cylindroma of the scalp. Natl J Maxillofac Surg. 2012;3:59-61. doi:10.4103/0975-5950.102163
Issue
Cutis - 110(2)
Page Number
E35-E36
Display Headline
Pink Nodule Behind the Ear
Questionnaire Body

A 62-year-old man presented to the dermatology office with a 1.5-cm, pink, rubbery nodule behind the left ear that sometimes was tender. He stated that the lesion gradually grew in size over the last 2 years, and it developed after he was fitted for new glasses.

Pink nodule behind the ear


Parameters of Scratch Pleasurability in the Management of Pruritic Conditions

Article Type
Changed
Wed, 08/24/2022 - 10:06

To the Editor:

The itch-scratch cycle refers to the sequence created when a pruritic skin condition leads to scratching and skin barrier disruption, ultimately facilitating secondary skin changes and neural activation that prolongs pruritus. In patients with pruritic conditions, the itch-scratch cycle often can run unrestrained, with patients unaware of their scratching habits. Understanding what drives a patient to scratch, such as the pleasure gained from scratching, may be beneficial for dermatologists combating a patient’s scratching habits. The earliest documented attempts to understand the mechanism of an itch were made in Greece around the fifth century, but the pathophysiology of this sensation still is not fully understood. The Latin term pruritus refers to itching, irritation, or sexual excitement, while the Greek term knêsmos and related words also denote itch in an irritating or pleasurable sense.1 This paradoxical duality of irritation and pleasure is a phenomenon all too well understood by those affected with pruritic symptoms.

Although there are many measured characteristics of an itch, the pleasure granted from scratching an itch rarely is addressed. Understanding the factors influencing the pleasurability of scratching could help improve management and outcomes of patients’ pruritic conditions.

Pruritus is associated with a wide array of etiologies including dermatologic, infectious, metabolic, and autoimmune, but unanimously it evokes a strong desire to scratch. Scratching an itch often yields temporary relief from the irritation by dispensing a complex sensory concoction of pleasure and pain.2 The neurobiology behind this pleasure phenomenon is inconclusive. Some hypotheses point to how scratching-induced pleasure may be derived from the deactivation or inhibition of the unpleasant sensation of an itch in the central nervous system, the stimulation of the reward signals in the C-fiber system in the peripheral nervous system, the release of pruritus-inhibiting prostaglandin D2, or a combination of these pathways. Levels of sensation and pleasure induced from itch attenuation by scratching even vary based on anatomic location. One study demonstrated that, when compared to the forearms, the ankles and back perceived baseline-induced itch most intensely, but no significant difference in perceived itch intensity was found between the ankles and back. Additionally, scratching an itchy back or ankle notably induced more pleasure when compared to the forearms, but there was no significant difference in scratching pleasurability between the ankle and back.3

Although there are adequate questionnaires and scales (eg, ItchyQoL,4 Skindex-16, Skindex-29) to quantify the severity of pruritus and its effects on a patient’s quality of life, these measurements do not assess the pleasure yielded from scratching, the impact of scratch pleasure on the patient experience, or the effect of scratch pleasure on the disease state.4 It appears that there are inadequate assessment tools to define factors associated with the pleasurability of scratching. A PubMed search of articles indexed for MEDLINE using the terms scratching pleasure scale and pruritus pleasure questionnaire yielded scarce results measuring patient perspectives on scratching-associated pleasure. A pertinent study performed by O’Neill et al5 compared the differences in itch characteristics between patients with psoriasis and those with atopic dermatitis using a web-based questionnaire featuring a numerical pleasure scale (ranging from −5 [highly unpleasurable] to +5 [highly pleasurable]) on an 11-point Likert scale. The questionnaire sought to measure the effects of scratching during a typical episode of itch within the past 2 weeks. Scratching was found pleasurable in both groups of patients.5 Another web-based questionnaire that characterized pleasurability in scratching a typical episode of itch in individuals with atopic dermatitis using a −5 to +5 Likert scale (−5 [highly unpleasurable] to +5 [highly pleasurable]) found that most participants perceived scratching as pleasurable and that there was a positive correlation between itch intensity and scratch pleasurability.6 Both of these studies confirmed that scratching an itch is pleasurable, a finding that may not come as a surprise. The direct correlation between itch intensity and scratch pleasurability suggests that a more detailed analysis of this scratch pleasure could be beneficial in the management of pruritic conditions.

Treating the underlying cause of an itch is key to inhibiting the sensation; in some cases, anti-itch medications must be used. Current medications have limited effects on itch relief, but an expanding understanding of itch pathophysiology through clinical and laboratory research in the fields of dermatology, immunology, and neurology is paving the way for promising new therapeutic medications.7-11 In a review of the literature, Sanders and Akiyama12 elucidated the influence of stress and anxiety on scratching an itch and the way in which both pharmacologic and nonpharmacologic interventions (ie, psychological and educational approaches) may be used to help break the itch-scratch cycle. Possible techniques include habit-reversal training, relaxation therapy, and cognitive behavioral therapy.13 Understanding patient perspectives on the pleasure yielded from scratching an itch and the disease factors that influence this pleasure seeking are paramount to reducing patient scratching. In understanding the pleasurability of scratching in pruritic conditions, the itch-scratch cycle and its accompanying deleterious effects (eg, stress, anxiety, pain, infection, secondary skin changes) can be broken.

The pleasure yielded from scratching an itch is a component of patient scratching habits that should be analyzed and quantified to reduce itch in pruritic conditions, mitigate damaging consequences of scratching, and improve the quality of life of patients with pruritic conditions. Furthermore, this understanding may help guide clinicians in management, such as counseling patients on the itch-scratch cycle and deciding which forthcoming medications could ameliorate a patient’s pruritic symptoms.

References
  1. Weisshaar E, Grüll V, König A, et al. The symptom of itch in medical history: highlights through the centuries. Int J Dermatol. 2009;48:1385-1394.
  2. Lavery MJ, Kinney MO, Mochizuki H, et al. Pruritus: an overview. What drives people to scratch an itch? Ulster Med J. 2016;85:164-173.
  3. Bin Saif GA, Papoiu ADP, Banari L, et al. The pleasurability of scratching an itch: a psychophysical and topographical assessment. Br J Dermatol. 2012;166:981-985.
  4. Desai NS, Poindexter GB, Monthrope YM, et al. A pilot quality-of-life instrument for pruritus. J Am Acad Dermatol. 2008;59:234-244.
  5. O’Neill JL, Chan YH, Rapp SR, et al. Differences in itch characteristics between psoriasis and atopic dermatitis patients: results of a web-based questionnaire. Acta Derm Venereol. 2011;91:537-540.
  6. Dawn A, Papoiu ADP, Chan YH, et al. Itch characteristics in atopic dermatitis: results of a web-based questionnaire. Br J Dermatol. 2009;160:642-644.
  7. Yosipovitch G, Rosen JD, Hashimoto T. Itch: from mechanism to (novel) therapeutic approaches. J Allergy Clin Immunol. 2018;142:1375-1390.
  8. Yosipovitch G, Misery L, Proksch E, et al. Skin barrier damage and itch: review of mechanisms, topical management and future directions. Acta Derm Venereol. 2019;99:1201-1209.
  9. Dong X, Dong X. Peripheral and central mechanisms of itch. Neuron. 2018;98:482-494.
  10. Lerner EA. Pathophysiology of itch. Dermatol Clin. 2018;36:175-177.
  11. Cevikbas F, Lerner EA. Physiology and pathophysiology of itch. Physiol Rev. 2020;100:945-982.
  12. Sanders KM, Akiyama T. The vicious cycle of itch and anxiety. Neurosci Biobehav Rev. 2018;87:17-26.
  13. Sanders KM, Nattkemper LA, Yosipovitch G. Advances in understanding itching and scratching: a new era of targeted treatments [published online August 22, 2016]. F1000Res. doi:10.12688/f1000research.8659.
Author and Disclosure Information

Dr. LaCour and Ms. Rimmer are from the Louisiana State University Health Sciences Center, New Orleans. Dr. LaCour is from the Department of Dermatology, and Ms. Rimmer is from the School of Medicine. Dr. Kelly is from the Department of Dermatology, University of Texas Medical Branch, Galveston.

The authors report no conflict of interest.

Correspondence: Matthew LaCour, MD, 2020 Gravier St, New Orleans, LA 70112 ([email protected]).

Issue
Cutis - 110(2)
Page Number
E24-E25

To the Editor:

The itch-scratch cycle refers to the sequence created when a pruritic skin condition leads to scratching and skin barrier disruption, ultimately facilitating secondary skin changes and neural activation that prolongs pruritus. In patients with pruritic conditions, the itch-scratch cycle often can run unrestrained, with patients unaware of their scratching habits. Understanding what drives a patient to scratch, such as the pleasure gained from scratching, may be beneficial for dermatologists combating a patient’s scratching habits. The earliest documented attempts to understand the mechanism of an itch were made in Greece around the fifth century, but the pathophysiology of this sensation still is not fully understood. The Latin term pruritus refers to itching, irritation, or sexual excitement, while the Greek term knêsmos and related words also denote itch in an irritating or pleasurable sense.1 This paradoxical duality of irritation and pleasure is a phenomenon all too well understood by those affected with pruritic symptoms.

Although there are many measured characteristics of an itch, the pleasure granted from scratching an itch rarely is addressed. Understanding the factors influencing the pleasurability of scratching could help improve management and outcomes of patients’ pruritic conditions.

Pruritus is associated with a wide array of etiologies including dermatologic, infectious, metabolic, and autoimmune, but unanimously it evokes a strong desire to scratch. Scratching an itch often yields temporary relief from the irritation by dispensing a complex sensory concoction between pleasure and pain.2 The neurobiology behind this pleasure phenomenon is inconclusive. Some hypotheses point to how scratching-induced pleasure may be derived from the deactivation or inhibition of the unpleasant sensation of an itch in the central nervous system, the stimulation of the reward signals in the C-fiber system in the peripheral nervous system, the release of pruritis-inhibiting prostaglandin D2, or a combination of these pathways. Levels of sensation and pleasure induced from itch attenuation by scratching even vary based on anatomic location. One study demonstrated that, when compared to the forearms, the ankles and back perceived baseline induced itch most intensely, but no significant difference in perceived itch intensity was found between the ankles and back. Additionally, scratching an itchy back or ankle notably induced more pleasure when compared to the forearms, but there was no significant difference in scratching pleasurability between the ankle and back.3

Although there are adequate questionnaires and scales (eg, ItchyQoL,4 Skindex-16, Skindex-29) to quantify the severity of pruritus and its effects on a patient’s quality of life, these measurements do not assess the pleasure yielded from scratching, the impact of scratch pleasure on the patient experience, or the effect of scratch pleasure on the disease state.4 It appears that there are inadequate assessment tools to define factors associated with the pleasurability of scratching. A PubMed search of articles indexed for MEDLINE using the terms scratching pleasure scale and pruritus pleasure questionnaire yielded scarce results measuring patient perspectives on scratching-associated pleasure. A pertinent study performed by O’Neill et al5 compared the differences in itch characteristics between patients with psoriasis and those with atopic dermatitis using a web-based questionnaire featuring a numerical pleasure scale (ranging from 5 [highly unpleasurable] to +5 [highly pleasurable]) on an 11-point Likert scale. The questionnaire sought to measure the effects of scratching during a typical episode of itch within the past 2 weeks. Scratching was found pleasurable in both groups of patients.5 Another web-based questionnaire that characterized pleasurability in scratching a typical episode of itch in individuals with atopic dermatitis using a 5 to +5 Likert scale (5 [highly unpleasurable] to +5 [highly pleasurable]) found that most participants perceived scratching as pleasurable and that there was a positive correlation between itch intensity and scratch pleasurability.6 Both of these studies quantified that scratching an itch is pleasurable, a correlation that may not come as a surprise. This direct correlation suggests that a more detailed analysis of this scratch pleasure could be beneficial in the management of pruritic conditions.

Treating the underlying cause of an itch is key to inhibiting the sensation; in some cases, anti-itch medications must be used. Current medications have limited effects on itch relief, but an expanding understanding of itch pathophysiology through clinical and laboratory research in the fields of dermatology, immunology, and neurology is paving the way for promising new therapeutic medications.7-11 In a review of the literature, Sanders and Akiyama12 elucidated the influence of stress and anxiety in scratching an itch and the way in which both pharmacologic and nonpharmacologic (ie, psychological and educational interventions) may be used to help break the itch-scratch cycle. Possible techniques include habit-reversal training, relaxation therapy, and cognitive behavioral therapy.13 Understanding patient perspectives on the pleasure yielded from scratching an itch and the disease factors that influence this pleasure seeking are paramount to reducing patient scratching. In understanding the pleasurability of scratching in pruritic conditions, the itch-scratch cycle and its accompanying deleterious effects (eg, stress, anxiety, pain, infection, secondary skin changes) can be broken.

The pleasure yielded from scratching an itch is a component of patient scratching habits that should be analyzed and quantified to reduce itch in pruritic conditions, mitigate damaging consequences of scratching, and improve the quality of life of patients with pruritic conditions. Furthermore, this understanding may help guide clinicians in management, such as counseling patients on the itch-scratch cycle and deciding which forthcoming medications could ameliorate a patient’s pruritic symptoms.

To the Editor:

The itch-scratch cycle refers to the sequence created when a pruritic skin condition leads to scratching and skin barrier disruption, ultimately facilitating secondary skin changes and neural activation that prolongs pruritus. In patients with pruritic conditions, the itch-scratch cycle often can run unrestrained, with patients unaware of their scratching habits. Understanding what drives a patient to scratch, such as the pleasure gained from scratching, may be beneficial for dermatologists combating a patient’s scratching habits. The earliest documented attempts to understand the mechanism of an itch were made in Greece around the fifth century, but the pathophysiology of this sensation still is not fully understood. The Latin term pruritus refers to itching, irritation, or sexual excitement, while the Greek term knêsmos and related words also denote itch in an irritating or pleasurable sense.1 This paradoxical duality of irritation and pleasure is a phenomenon all too well understood by those affected with pruritic symptoms.

Although there are many measured characteristics of an itch, the pleasure granted from scratching an itch rarely is addressed. Understanding the factors influencing the pleasurability of scratching could help improve management and outcomes of patients’ pruritic conditions.

Pruritus is associated with a wide array of etiologies including dermatologic, infectious, metabolic, and autoimmune, but unanimously it evokes a strong desire to scratch. Scratching an itch often yields temporary relief from the irritation by dispensing a complex sensory concoction between pleasure and pain.2 The neurobiology behind this pleasure phenomenon is inconclusive. Some hypotheses point to how scratching-induced pleasure may be derived from the deactivation or inhibition of the unpleasant sensation of an itch in the central nervous system, the stimulation of the reward signals in the C-fiber system in the peripheral nervous system, the release of pruritis-inhibiting prostaglandin D2, or a combination of these pathways. Levels of sensation and pleasure induced from itch attenuation by scratching even vary based on anatomic location. One study demonstrated that, when compared to the forearms, the ankles and back perceived baseline induced itch most intensely, but no significant difference in perceived itch intensity was found between the ankles and back. Additionally, scratching an itchy back or ankle notably induced more pleasure when compared to the forearms, but there was no significant difference in scratching pleasurability between the ankle and back.3

Although there are adequate questionnaires and scales (eg, ItchyQoL,4 Skindex-16, Skindex-29) to quantify the severity of pruritus and its effects on a patient’s quality of life, these measurements do not assess the pleasure yielded from scratching, the impact of scratch pleasure on the patient experience, or the effect of scratch pleasure on the disease state.4 It appears that there are inadequate assessment tools to define factors associated with the pleasurability of scratching. A PubMed search of articles indexed for MEDLINE using the terms scratching pleasure scale and pruritus pleasure questionnaire yielded scarce results measuring patient perspectives on scratching-associated pleasure. A pertinent study performed by O’Neill et al5 compared the differences in itch characteristics between patients with psoriasis and those with atopic dermatitis using a web-based questionnaire featuring a numerical pleasure scale (ranging from 5 [highly unpleasurable] to +5 [highly pleasurable]) on an 11-point Likert scale. The questionnaire sought to measure the effects of scratching during a typical episode of itch within the past 2 weeks. Scratching was found pleasurable in both groups of patients.5 Another web-based questionnaire that characterized pleasurability in scratching a typical episode of itch in individuals with atopic dermatitis using a 5 to +5 Likert scale (5 [highly unpleasurable] to +5 [highly pleasurable]) found that most participants perceived scratching as pleasurable and that there was a positive correlation between itch intensity and scratch pleasurability.6 Both of these studies quantified that scratching an itch is pleasurable, a correlation that may not come as a surprise. This direct correlation suggests that a more detailed analysis of this scratch pleasure could be beneficial in the management of pruritic conditions.

Treating the underlying cause of an itch is key to inhibiting the sensation; in some cases, anti-itch medications must be used. Current medications have limited effects on itch relief, but an expanding understanding of itch pathophysiology through clinical and laboratory research in the fields of dermatology, immunology, and neurology is paving the way for promising new therapeutic medications.7-11 In a review of the literature, Sanders and Akiyama12 elucidated the influence of stress and anxiety on scratching an itch and described how both pharmacologic and nonpharmacologic approaches (ie, psychological and educational interventions) may be used to help break the itch-scratch cycle. Possible techniques include habit-reversal training, relaxation therapy, and cognitive behavioral therapy.13 Understanding patient perspectives on the pleasure yielded from scratching an itch, and the disease factors that influence this pleasure seeking, is paramount to reducing patient scratching. By understanding the pleasurability of scratching in pruritic conditions, the itch-scratch cycle and its accompanying deleterious effects (eg, stress, anxiety, pain, infection, secondary skin changes) can be broken.

The pleasure yielded from scratching an itch is a component of patient scratching habits that should be analyzed and quantified to reduce itch in pruritic conditions, mitigate damaging consequences of scratching, and improve the quality of life of patients with pruritic conditions. Furthermore, this understanding may help guide clinicians in management, such as counseling patients on the itch-scratch cycle and deciding which forthcoming medications could ameliorate a patient’s pruritic symptoms.

References
  1. Weisshaar E, Grüll V, König A, et al. The symptom of itch in medical history: highlights through the centuries. Int J Dermatol. 2009;48:1385-1394.
  2. Lavery MJ, Kinney MO, Mochizuki H, et al. Pruritus: an overview. what drives people to scratch an itch? Ulster Med J. 2016;85:164-173.
  3. Bin Saif GA, Papoiu ADP, Banari L, et al. The pleasurability of scratching an itch: a psychophysical and topographical assessment. Br J Dermatol. 2012;166:981-985.
  4. Desai NS, Poindexter GB, Monthrope YM, et al. A pilot quality-of-life instrument for pruritus. J Am Acad Dermatol. 2008;59:234-244.
  5. O’Neill JL, Chan YH, Rapp SR, et al. Differences in itch characteristics between psoriasis and atopic dermatitis patients: results of a web-based questionnaire. Acta Derm Venereol. 2011;91:537-540.
  6. Dawn A, Papoiu ADP, Chan YH, et al. Itch characteristics in atopic dermatitis: results of a web-based questionnaire. Br J Dermatol. 2009;160:642-644.
  7. Yosipovitch G, Rosen JD, Hashimoto T. Itch: from mechanism to (novel) therapeutic approaches. J Allergy Clin Immunol. 2018;142:1375-1390.
  8. Yosipovitch G, Misery L, Proksch E, et al. Skin barrier damage and itch: review of mechanisms, topical management and future directions. Acta Derm Venereol. 2019;99:1201-1209.
  9. Dong X, Dong X. Peripheral and central mechanisms of itch. Neuron. 2018;98:482-494.
  10. Lerner EA. Pathophysiology of itch. Dermatol Clin. 2018;36:175-177.
  11. Cevikbas F, Lerner EA. Physiology and pathophysiology of itch. Physiol Rev. 2020;100:945-982.
  12. Sanders KM, Akiyama T. The vicious cycle of itch and anxiety. Neurosci Biobehav Rev. 2018;87:17-26.
  13. Sanders KM, Nattkemper LA, Yosipovitch G. Advances in understanding itching and scratching: a new era of targeted treatments [published online August 22, 2016]. F1000Res. doi:10.12688/f1000research.8659.
Issue
Cutis - 110(2)
Page Number
E24-E25
Display Headline
Parameters of Scratch Pleasurability in the Management of Pruritic Conditions

Practice Points

  • In individuals with pruritic skin conditions, the itch-scratch cycle can have damaging consequences such as anxiety, infection, and secondary skin changes.
  • Understanding the pleasurability of scratching in pruritic skin conditions allows providers to help patients break the itch-scratch cycle and improve quality of life.