Persistent abdominal pain: Not always IBS
Persistent abdominal pain may be caused by a wide range of conditions, say French experts, who call for greater physician awareness so that earlier diagnosis and treatment can improve patient outcomes.
Benoit Coffin, MD, PhD, and Henri Duboc, MD, PhD, from Hôpital Louis Mourier, Colombes, France, conducted a literature review to identify rare and less well-known causes of persistent abdominal pain, identifying almost 50 across several categories.
“Some causes of persistent abdominal pain can be effectively treated using established approaches after a definitive diagnosis has been reached,” they wrote.
“Other causes are more complex and may benefit from a multidisciplinary approach involving gastroenterologists, pain specialists, allergists, immunologists, rheumatologists, psychologists, physiotherapists, dietitians, and primary care clinicians,” they wrote.
The research was published online in Alimentary Pharmacology and Therapeutics.
Frequent and frustrating symptoms
Although there is “no commonly accepted definition” for persistent abdominal pain, the authors said it may be defined as “continuous or intermittent abdominal discomfort that persists for at least 6 months and fails to respond to conventional therapeutic approaches.”
They highlight that it is “frequently encountered” by physicians, occurring at a rate of 22.9 per 1,000 person-years regardless of age group, ethnicity, or geographical region, with many patients experiencing pain for more than 5 years.
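For readers less familiar with person-time rates, that figure can be read as roughly 23 episodes per 1,000 people each followed for a year; the short sketch below illustrates the arithmetic with invented numbers, not data from the review.

```python
# Illustration of a "per 1,000 person-years" rate using invented numbers
# (not data from the review): events divided by total follow-up time,
# scaled to 1,000 person-years.
new_cases = 229
person_years = 10_000   # e.g., 10,000 people each followed for 1 year
rate_per_1000_py = new_cases / person_years * 1_000
print(rate_per_1000_py)  # 22.9
```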
The cause of persistent abdominal pain can be organic, with an identifiable abnormality, or functional, making diagnosis and management “challenging and frustrating for patients and physicians.”
“Clinicians not only need to recognize somatic abnormalities, but they must also perceive the patient’s cognitions and emotions related to the pain,” they added, suggesting that clinicians take time to “listen to the patient and perceive psychological factors.”
Dr. Coffin and Dr. Duboc write that the most common conditions associated with persistent abdominal pain are irritable bowel syndrome and functional dyspepsia, as well as inflammatory bowel disease, chronic pancreatitis, and gallstones.
To examine the diagnosis and management of its less well-known causes, the authors conducted a literature review, beginning with the diagnosis of persistent abdominal pain.
Diagnostic workup
“Given its chronicity, many patients will have already undergone extensive and redundant medical testing,” they wrote, emphasizing that clinicians should be on the lookout for any change in the description of persistent abdominal pain or new symptoms.
“Other ‘red-flag’ symptoms include fever, vomiting, diarrhea, acute change in bowel habit, obstipation, syncope, tachycardia, hypotension, concomitant chest or back pain, unintentional weight loss, night sweats, and acute gastrointestinal bleeding,” the authors said.
They stressed the need to determine whether the origin of the pain is organic or functional, as well as the importance of identifying a “triggering event, such as an adverse life event, infection, initiating a new medication, or surgical procedure.” They also recommend discussing the patient’s diet.
There are currently no specific algorithms for diagnostic workup of persistent abdominal pain, the authors said. Patients will have undergone repeated laboratory tests, “upper and lower endoscopic examinations, abdominal ultrasounds, and computed tomography scans of the abdominal/pelvic area.”
Consequently, “in the absence of alarm features, any additional tests should be ordered in a conservative and cost-effective manner,” they advised.
They suggested that, at a tertiary center, patients should be assessed in three steps:
- In-depth questioning of the symptoms and medical history
- Summary of all previous investigations and treatments and their effectiveness
- Determination of the complementary explorations to be performed
The authors went on to list 49 rare or less well-known potential causes of persistent abdominal pain, some linked to digestive disorders, such as eosinophilic gastroenteritis, mesenteric panniculitis, and chronic mesenteric ischemia, as well as endometriosis, chronic abdominal wall pain, and referred osteoarticular pain.
Systemic causes of persistent abdominal pain may include adrenal insufficiency and mast cell activation syndrome, while acute hepatic porphyrias and Ehlers-Danlos syndrome may be genetic causes.
There are also centrally mediated disorders that lead to persistent abdominal pain, the authors noted, including postural orthostatic tachycardia syndrome and narcotic bowel syndrome caused by opioid therapy, among others.
Writing support for the manuscript was funded by Alnylam Switzerland. Dr. Coffin has served as a speaker for Kyowa Kirin and Mayoly Spindler and as an advisory board member for Sanofi and Alnylam. Dr. Duboc reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM ALIMENTARY PHARMACOLOGY AND THERAPEUTICS
Impact of eliminating cost-sharing on follow-up colonoscopy mixed
Oregon and Kentucky recently enacted policies to eliminate financial disincentives that may have deterred people from undergoing a follow-up colonoscopy after a positive result on a noninvasive screening test for colorectal cancer (CRC).
A new analysis shows that the impact has been mixed. The policies led to significantly increased overall CRC screening and use of noninvasive testing in Oregon but not Kentucky.
The study was published online in JAMA Network Open.
The Affordable Care Act mandates that several CRC screening tests be covered without cost-sharing for people at average risk for CRC. However, lingering cost barriers remain for some people who have a positive initial screening test result and who need follow-up colonoscopy.
This led Kentucky in 2016 and Oregon in 2017 to enact policies that eliminate cost-sharing. Earlier this year, federal guidance eliminated cost-sharing for colonoscopies following noninvasive CRC screening tests for commercial insurers, and a similar policy is under consideration for Medicare.
For their study, Douglas Barthold, PhD, of the University of Washington, Seattle, and colleagues used claims data to evaluate CRC screening rates in Oregon and Kentucky, compared with rates in neighboring states that do not have cost-sharing policies.
The sample included more than 1.2 million individuals aged 45-64 living in Oregon, Kentucky, and nearby states from 2012 to 2019. Overall, about 15% of the cohort underwent any CRC screening; 8% underwent colonoscopy.
After the Oregon policy that eliminated cost-sharing went into effect, Oregonians had 6% higher odds of receiving any CRC screening (odds ratio [OR], 1.06; 95% confidence interval [CI], 1.00-1.06; P = .03) and 35% higher odds of undergoing an initial noninvasive test (OR, 0.65; 95% CI, 0.58-0.73; P < .001), compared with neighboring states that did not implement a similar policy.
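As a refresher on how an odds ratio maps onto a “percent higher odds” statement such as the 6% figure above, the sketch below works through the arithmetic with entirely hypothetical counts; it is not the study’s adjusted regression model.

```python
# Hypothetical screening counts (not the study's data) showing how an odds
# ratio corresponds to a "percent higher odds" statement such as "6% higher".
screened_policy, total_policy = 1_600, 10_000   # state with the policy
screened_comp, total_comp = 1_520, 10_000       # comparison states

odds_policy = screened_policy / (total_policy - screened_policy)
odds_comp = screened_comp / (total_comp - screened_comp)

odds_ratio = odds_policy / odds_comp
pct_higher = (odds_ratio - 1) * 100
print(f"OR = {odds_ratio:.2f} ({pct_higher:.0f}% higher odds)")  # OR = 1.06 (6% higher odds)
```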
But there were no significant differences in total CRC screening use in Kentucky after policy implementation compared with neighboring states.
The odds of receiving a colonoscopy conditional on undergoing noninvasive CRC screening were not statistically different in Oregon or Kentucky, compared with neighboring states.
“These findings suggest that the enactment of policies that remove financial barriers is merely one of many elements (e.g., health literacy, outreach, transportation, access to care) that may help to achieve desired cancer screening outcomes,” wrote Dr. Barthold and colleagues.
The study had no commercial funding. Dr. Barthold reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA NETWORK OPEN
Lawmakers argue for changes in prior authorization processes
Republican and Democratic members of the House called for changes in how insurer-run Medicare plans manage the prior authorization process, following testimony from a federal watchdog organization about improper denials of payment for care.
About 18% of payment denials in a sample examined by the Office of Inspector General (OIG) of the Department of Health and Human Services (HHS) either met Medicare coverage rules or the rules of the insurance plan.
As such, they should not have been denied, according to the OIG. That was the finding of an April OIG report, based on a sample of 2019 denials from large insurer-run Medicare plans.
Erin Bliss, an assistant inspector general with the OIG, appeared as a witness at a June 28 Energy and Commerce Subcommittee on Oversight and Investigations hearing to discuss this investigation and other issues with prior authorization and insurer-run Medicare, also known as the Advantage plans.
Most of these payment denials of appropriate services were due to human error during manual claims-processing reviews, such as overlooking a document, and to system processing errors, such as a Medicare insurance plan failing to program or update a system correctly, Ms. Bliss told the subcommittee.
In many cases, these denials were reversed, but patient care was still disrupted and clinicians lost time chasing clearances for services that plans already had covered, Ms. Bliss said in her testimony.
The April report was not the OIG’s first look into concerns about insurer-run plans inappropriately denying care through prior authorizations. The OIG in 2018 reported that insurer-run Medicare plans overturned 75% of their own denials during 2014-2016 when patients and clinicians appealed these decisions, overturning approximately 216,000 denials each year.
‘Numerous hoops’ unnecessary for doctors, patients
Lawmakers at the hearing supported the idea of the need for prior authorization as a screening tool to prevent unneeded care.
But they chided insurance companies for their execution of this process, with clinicians and patients often frustrated by the complex steps required. Medicare Advantage plans sometimes require prior authorization for “relatively standard medical services,” said Subcommittee on Oversight and Investigations Chair Diana DeGette (D-Colo.).
“Our seniors and their doctors should not be required to jump through numerous hoops to ensure coverage for straightforward and medically necessary procedures,” Rep. DeGette said.
Several lawmakers spoke at the hearing about the need for changes to prior authorization, including calling for action on a pending bill intended to compel insurers to streamline the review process. The Improving Seniors’ Timely Access to Care Act of 2021 already has attracted more than 300 bipartisan sponsors. A companion Senate bill has more than 30 sponsors.
The bill’s aim is to shift this process away from faxes and phone calls while also encouraging plans to adhere to evidence-based medical guidelines in consultation with physicians. The bill calls for the establishment of an electronic prior authorization program that could issue real-time decisions.
“The result will be less administrative burden for providers and more information in the hands of patients. It will allow more patients to receive care when they need it, reducing the likelihood of additional, often more severe complications,” said Rep. Larry Bucshon, MD (R-Ind.), who is among the active sponsors of the bill.
“In the long term, I believe it would also result in cost savings for the health care system at large by identifying problems earlier and getting them treated before their patients have more complications,” Rep. Bucshon added.
Finding ‘room for improvement’ for prior authorizations
There’s strong bipartisan support in Congress for insurer-run Medicare, which has grown by 10% per year over the last several years and has doubled since 2010, according to the Medicare Payment Advisory Commission (MedPAC). About 27 million people are now enrolled in these plans.
But for that reason, insurer-run Medicare may also need more careful watching, lawmakers made clear at the hearing.
“We’ve heard quite a bit of evidence today that there is room for improvement,” said Rep. Bucshon, a strong supporter of insurer-run Medicare, which can offer patients added benefits such as dental coverage.
Rep. Ann Kuster (D-N.H.) said simplifying prior authorization would reduce stress on clinicians already dealing with burnout.
“They’re just so tired of all this paperwork and red tape,” Rep. Kuster said. “In 2022 can’t we at least consider electronic prior authorization?”
At the hearing, Rep. Michael C. Burgess, MD (R-Tex.), noted that his home state has already taken a step toward reducing the burden of prior authorization with its “gold card” program.
In 2021, a new Texas law called on the state department of insurance to develop rules to require health plans to provide an exemption from preauthorization requirements for a particular health care service if the issuer has approved, or would have approved, at least 90% of the preauthorization requests submitted by the physician or provider for that service. The law also mandates that a physician participating in a peer-to-peer review on behalf of a health benefit plan issuer must be a Texas-licensed physician who has the same or similar specialty as the physician or clinician requesting the service, according to the state insurance department.
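As a rough illustration of how such a threshold rule could be checked, the sketch below applies a simplified 90% approval test to hypothetical request counts; it is not the Texas rule’s actual implementation, review window, or data source.

```python
# Simplified "gold card"-style eligibility check: exempt a clinician from
# prior authorization for a service if at least 90% of their recent requests
# for that service were approved (or would have been approved).
# The 90% threshold follows the rule described above; counts are hypothetical.
THRESHOLD = 0.90

def gold_card_exempt(approved: int, total: int) -> bool:
    if total == 0:
        return False  # no request history to evaluate
    return approved / total >= THRESHOLD

print(gold_card_exempt(approved=19, total=20))  # True  (95% approval rate)
print(gold_card_exempt(approved=17, total=20))  # False (85% approval rate)
```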
Separately, Rep. Suzan DelBene (D-Wash.), the sponsor of the Improving Seniors’ Timely Access to Care Act, told the American Medical Association in a recent interview that she expects the House Ways and Means Committee, on which she serves, to mark up her bill in July. (A mark-up is the process by which a House or Senate committee considers and often amends a bill and then sends it to the chamber’s leadership for a floor vote.)
In a statement issued about the hearing, America’s Health Insurance Plans (AHIP) noted that there has been work in recent years toward streamlining prior authorization. AHIP said it launched the Fast Prior Authorization Technology Highway (Fast PATH) initiative in 2020 to study electronic procedures for handling these reviews.
“The findings of this study showed that ePA delivered improvements with a strong majority of experienced providers reporting faster time to patient care, fewer phone calls and faxes, better understanding of [prior authorization] requirements, and faster time to decisions,” AHIP said.
A version of this article first appeared on Medscape.com.
More reflux after sleeve gastrectomy vs. gastric bypass at 10 years
Sleeve gastrectomy (SG) and Roux-en-Y gastric bypass (RYGB) each led to good and sustainable weight loss 10 years later, although reflux was more prevalent after SG, according to the Sleeve vs. Bypass (SLEEVEPASS) randomized clinical trial.
At 10 years, there were no statistically significant between-procedure differences in type 2 diabetes remission, dyslipidemia, or obstructive sleep apnea, but hypertension remission was greater with RYGB.
However, importantly, the cumulative incidence of Barrett’s esophagus was similar after both procedures (4%) and markedly lower than reported in previous trials (14%-17%).
To their knowledge, this is the largest randomized controlled trial with the longest follow-up comparing these two laparoscopic bariatric surgeries, Paulina Salminen, MD, PhD, and colleagues write in their study published online in JAMA Surgery.
They aimed to clarify the “controversial issues” of long-term gastroesophageal reflux disease (GERD) symptoms, endoscopic esophagitis, and Barrett’s esophagus after SG vs. RYGB.
The findings showed that “there was no difference in the prevalence of Barrett’s esophagus, contrary to previous reports of alarming rates of Barrett’s [esophagus] after sleeve gastrectomy,” Dr. Salminen, from Turku (Finland) University Hospital, told this news organization in an email.
“However, our results also show that esophagitis and GERD symptoms are significantly more prevalent after sleeve [gastrectomy], and GERD is an important factor to be considered in the preoperative assessment of bariatric surgery and procedure choice,” she said.
The takeaway is that “we have two good procedures providing good and sustainable 10-year results for both weight loss and remission of comorbidities” for severe obesity, a major health risk, Dr. Salminen summarized.
10-year data analysis
Long-term outcomes from randomized clinical trials of laparoscopic SG vs. RYGB are limited, and recent studies have shown a high incidence of worsening or de novo GERD, esophagitis, and Barrett’s esophagus after laparoscopic SG, Dr. Salminen and colleagues write.
To investigate, they analyzed 10-year data from SLEEVEPASS, which had randomized 240 adult patients with severe obesity to either SG or RYGB at three hospitals in Finland during 2008-2010.
At baseline, 121 patients were randomized to SG and 119 to RYGB. They had a mean age of 48 years, a mean body mass index of 45.9 kg/m2, and 70% were women.
Two patients never had the surgery, and at 10 years, 10 patients had died of causes unrelated to bariatric surgery.
At 10 years, 193 of the 228 remaining patients (85%) completed the follow-up for weight loss and other comorbidity outcomes, and 176 of 228 (77%) underwent gastroscopy.
The primary study endpoint of the trial was percent excess weight loss (%EWL). At 10 years, the median %EWL was 43.5% after SG vs. 50.7% after RYGB, with a wide range for both procedures (roughly 2%-110% excess weight loss). The estimated mean difference in %EWL of 8.4 percentage points favored RYGB, and the two procedures did not meet the prespecified criteria for equivalence.
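%EWL expresses weight lost as a share of a patient’s excess weight above an ideal weight; the sketch below uses a common BMI-25 ideal-weight convention and invented values, not the trial’s exact calculation.

```python
# Generic percent excess weight loss (%EWL) calculation. The BMI-25
# ideal-weight convention and the example values are assumptions for
# illustration, not figures from the SLEEVEPASS trial.
def percent_ewl(baseline_kg: float, current_kg: float, height_m: float,
                ideal_bmi: float = 25.0) -> float:
    ideal_kg = ideal_bmi * height_m ** 2      # "ideal" weight for this height
    excess_kg = baseline_kg - ideal_kg        # excess weight at baseline
    return (baseline_kg - current_kg) / excess_kg * 100

# e.g., a patient 1.70 m tall going from 130 kg to 95 kg
print(round(percent_ewl(130, 95, 1.70), 1))  # 60.6
```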
After SG and RYGB, there were no statistically significant differences in type 2 diabetes remission (26% and 33%, respectively), dyslipidemia (19% and 35%, respectively), or obstructive sleep apnea (16% and 31%, respectively).
Hypertension remission was greater after RYGB than after SG (24% vs. 8%; P = .04).
Esophagitis was more prevalent after SG than after RYGB (31% vs. 7%; P < .001).
‘Very important study’
“This is a very important study, the first to report 10-year results of a randomized controlled trial comparing the two most frequently used bariatric operations, SG and RYGB,” Beat Peter Müller, MD, MBA, and Adrian Billeter, MD, PhD, who were not involved with this research, told this news organization in an email.
“The results will have a major impact on the future of bariatric surgery,” according to Dr. Müller and Dr. Billeter, from Heidelberg (Germany) University.
The most relevant findings are the GERD outcomes, they said. Because of the high rate of upper endoscopies at 10 years (73%), the study allowed a good assessment of this.
“While this study confirms that SG is a GERD-prone procedure, it clearly demonstrates that GERD after SG does not induce severe esophagitis and Barrett’s esophagus,” they said.
Most importantly, the rate of Barrett’s esophagus, the precursor lesion of adenocarcinomas of the esophago-gastric junction, is similar (4%) after both operations, and there was no dysplasia in either group, they stressed.
“The main problem after SG remains new-onset GERD, for which still no predictive parameter exists,” according to Dr. Müller and Dr. Billeter.
“The take home message … is that GERD after SG is generally mild and the risk of Barrett’s esophagus is equally higher after SG and RYGB,” they said. “Therefore, all patients after any bariatric operations should undergo regular upper endoscopies.”
However, “RYGB still leads to an increase in proton-pump inhibitor use, despite RYGB being one of the most effective antireflux procedures,” they said. “This finding needs further investigation.”
Furthermore, “a 4% Barrett esophagus rate 10 years after RYGB is troublesome, and the reasons should be investigated,” they added.
“Another relevant finding is that after 10 years, RYGB has a statistically better weight loss, which reaches the primary endpoint of the SLEEVEPASS trial for the first time,” they noted, yet the clinical relevance of this is not clear, since there was no difference in resolution of comorbidities, except for hypertension.
Gyanprakash A. Ketwaroo, MD, of Baylor College of Medicine, Houston, who was not involved with this research, agreed that “the study shows durable and good weight loss for either type of laparoscopic surgery with important metabolic effects and confirms the long-term benefits of weight-loss surgery.”
“What is somewhat new is the lower levels of Barrett’s esophagus after sleeve gastrectomy compared with several earlier studies,” he told this news organization in an email.
“This is somewhat incongruent with the relatively high incidence of postsleeve esophagitis noted in the study, which is an accepted risk factor for Barrett’s esophagus,” he continued. “Thus, I believe concern will still remain about GERD-related complications, including Barrett’s [esophagus], after sleeve gastrectomy.”
“This paper highlights the need for larger prospective studies, especially those that include diverse, older populations with multiple risk factors for Barrett’s esophagus,” Dr. Ketwaroo said.
Looking ahead
Analyzing a large data set, such as that from SLEEVEPASS, possibly combined with data from the SM-BOSS and BariSurg trials, with machine learning and other sophisticated analyses might identify parameters that could be used to choose the best operation for an individual patient, Dr. Salminen speculated.
“I think what we have learned from these long-term follow-up results is that GERD assessment should be a part of the preoperative assessment, and for patients who have preoperative GERD symptoms and GERD-related endoscopic findings (e.g., hiatal hernia), gastric bypass would be a more optimal procedure choice, if there are no contraindications for it,” she said.
Patient discussions should also cover “long-term symptoms, for example, abdominal pain after RYGB,” she added.
“I am looking forward to our future 20-year follow-up results,” Dr. Salminen said, “which will shed more light on this topic of postoperative [endoscopic] surveillance.”
In the meantime, “preoperative gastroscopy is necessary and beneficial, at least when considering sleeve gastrectomy,” she said.
The SLEEVEPASS trial was supported by the Mary and Georg C. Ehrnrooth Foundation, the Government Research Foundation (in a grant awarded to Turku University Hospital), the Orion Research Foundation, the Paulo Foundation, and the Gastroenterological Research Foundation. Dr. Salminen reported receiving grants from the Government Research Foundation awarded to Turku University Hospital and the Mary and Georg C. Ehrnrooth Foundation. Another coauthor received grants from the Orion Research Foundation, the Paulo Foundation, and the Gastroenterological Research Foundation during the study. No other disclosures were reported.
A version of this article first appeared on Medscape.com.
Sleeve gastrectomy (SG) and Roux-en-Y gastric bypass (RYGB) each led to good and sustainable weight loss 10 years later, although reflux was more prevalent after SG, according to the Sleeve vs. Bypass (SLEEVEPASS) randomized clinical trial.
At 10 years, there were no statistically significant between-procedure differences in type 2 diabetes remission, dyslipidemia, or obstructive sleep apnea, but hypertension remission was greater with RYGB.
However, importantly, the cumulative incidence of Barrett’s esophagus was similar after both procedures (4%) and markedly lower than reported in previous trials (14%-17%).
To their knowledge, this is the largest randomized controlled trial with the longest follow-up comparing these two laparoscopic bariatric surgeries, Paulina Salminen, MD, PhD, and colleagues write in their study published online in JAMA Surgery.
They aimed to clarify the “controversial issues” of long-term gastroesophageal reflux disease (GERD) symptoms, endoscopic esophagitis, and Barrett’s esophagus after SG vs. RYGB.
The findings showed that “there was no difference in the prevalence of Barrett’s esophagus, contrary to previous reports of alarming rates of Barrett’s [esophagus] after sleeve gastrectomy,” Dr. Salminen from Turku (Finland) University Hospital, told this news organization in an email.
“However, our results also show that esophagitis and GERD symptoms are significantly more prevalent after sleeve [gastrectomy], and GERD is an important factor to be considered in the preoperative assessment of bariatric surgery and procedure choice,” she said.
The takeaway is that “we have two good procedures providing good and sustainable 10-year results for both weight loss and remission of comorbidities” for severe obesity, a major health risk, Dr. Salminen summarized.
10-year data analysis
Long-term outcomes from randomized clinical trials of laparoscopic SG vs. RYGB are limited, and recent studies have shown a high incidence of worsening of de novo GERD, esophagitis, and Barrett’s esophagus, after laparoscopic SG, Dr. Salminen and colleagues write.
To investigate, they analyzed 10-year data from SLEEVEPASS, which had randomized 240 adult patients with severe obesity to either SG or RYGB at three hospitals in Finland during 2008-2010.
At baseline, 121 patients were randomized to SG and 119 to RYGB. They had a mean age of 48 years, a mean body mass index of 45.9 kg/m2, and 70% were women.
Two patients never had the surgery, and at 10 years, 10 patients had died of causes unrelated to bariatric surgery.
At 10 years, 193 of the 288 remaining patients (85%) completed the follow-up for weight loss and other comorbidity outcomes, and 176 of 228 (77%) underwent gastroscopy.
The primary study endpoint of the trial was percent excess weight loss (%EWL). At 10 years, the median %EWL was 43.5% after SG vs. 50.7% after RYGB, with a wide range for both procedures (roughly 2%-110% excess weight loss). Mean estimate %EWL was not equivalent, with it being 8.4% in favor of RYGB.
After SG and RYGB, there were no statistically significant differences in type 2 diabetes remission (26% and 33%, respectively), dyslipidemia (19% and 35%, respectively), or obstructive sleep apnea (16% and 31%, respectively).
Hypertension remission was superior after RYGB (8% vs. 24%; P = .04).
Esophagitis was more prevalent after SG (31% vs. 7%; P < .001).
‘Very important study’
“This is a very important study, the first to report 10-year results of a randomized controlled trial comparing the two most frequently used bariatric operations, SG and RYGB,” Beat Peter Müller, MD, MBA, and Adrian Billeter, MD, PhD, who were not involved with this research, told this news organization in an email.
“The results will have a major impact on the future of bariatric surgery,” according to Dr. Müller and Dr. Billeter, from Heidelberg (Germany) University.
The most relevant findings are the GERD outcomes, they said. Because of the high rate of upper endoscopies at 10 years (73%), the study allowed a good assessment of this.
“While this study confirms that SG is a GERD-prone procedure, it clearly demonstrates that GERD after SG does not induce severe esophagitis and Barrett’s esophagus,” they said.
Most importantly, the rate of Barrett’s esophagus, the precursor lesion of adenocarcinomas of the esophago-gastric junction is similar (4%) after both operations and there was no dysplasia in either group, they stressed.
“The main problem after SG remains new-onset GERD, for which still no predictive parameter exists,” according to Dr. Müller and Dr. Billeter.
“The take home message … is that GERD after SG is generally mild and the risk of Barrett’s esophagus is equally higher after SG and RYGB,” they said. “Therefore, all patients after any bariatric operations should undergo regular upper endoscopies.”
However, “RYGB still leads to an increase in proton-pump inhibitor use, despite RYGB being one of the most effective antireflux procedures,” they said. “This finding needs further investigation.”
Furthermore, “a 4% Barrett esophagus rate 10 years after RYGB is troublesome, and the reasons should be investigated,” they added.
“Another relevant finding is that after 10 years, RYGB has a statistically better weight loss, which reaches the primary endpoint of the SLEEVEPASS trial for the first time,” they noted, yet the clinical relevance of this is not clear, since there was no difference in resolution of comorbidities, except for hypertension.
Gyanprakash A. Ketwaroo, MD, of Baylor College of Medicine, Houston, who was not involved with this research, agreed that “the study shows durable and good weight loss for either type of laparoscopic surgery with important metabolic effects and confirms the long-term benefits of weight-loss surgery.”
“What is somewhat new is the lower levels of Barrett’s esophagus after sleeve gastrectomy compared with several earlier studies,” he told this news organization in an email.
“This is somewhat incongruent with the relatively high incidence of postsleeve esophagitis noted in the study, which is an accepted risk factor for Barrett’s esophagus,” he continued. “Thus, I believe concern will still remain about GERD-related complications, including Barrett’s [esophagus], after sleeve gastrectomy.”
“This paper highlights the need for larger prospective studies, especially those that include diverse, older populations with multiple risk factors for Barrett’s esophagus,” Dr. Ketwaroo said.
Looking ahead
Analyzing a large data set, such as that from SLEEVEPASS, possibly combined with data from the SM-BOSS and BariSurg trials, with machine learning and other sophisticated analyses might identify parameters that could be used to choose the best operation for an individual patient, Dr. Salminen speculated.
“I think what we have learned from these long-term follow-up results is that GERD assessment should be a part of the preoperative assessment, and for patients who have preoperative GERD symptoms and GERD-related endoscopic findings (e.g., hiatal hernia), gastric bypass would be a more optimal procedure choice, if there are no contraindications for it,” she said.
Patient discussions should also cover “long-term symptoms, for example, abdominal pain after RYGB,” she added.
“I am looking forward to our future 20-year follow-up results,” Dr. Salminen said, “which will shed more light on this topic of postoperative [endoscopic] surveillance.”
In the meantime, “preoperative gastroscopy is necessary and beneficial, at least when considering sleeve gastrectomy,” she said.
The SLEEVEPASS trial was supported by the Mary and Georg C. Ehrnrooth Foundation, the Government Research Foundation (in a grant awarded to Turku University Hospital), the Orion Research Foundation, the Paulo Foundation, and the Gastroenterological Research Foundation. Dr. Salminen reported receiving grants from the Government Research Foundation awarded to Turku University Hospital and the Mary and Georg C. Ehrnrooth Foundation. Another coauthor received grants from the Orion Research Foundation, the Paulo Foundation, and the Gastroenterological Research Foundation during the study. No other disclosures were reported.
A version of this article first appeared on Medscape.com.
FROM JAMA SURGERY
Best strategy to prevent schizophrenia relapse yields unexpected results
A large meta-analysis sheds light on the best antipsychotic maintenance strategy to prevent relapse in clinically stable schizophrenia – with some unexpected results that have potential implications for changes to current guidelines.
Consistent with the researchers’ hypothesis, continuing antipsychotic treatment at the standard dose, switching to another antipsychotic, and reducing the dose were all significantly more effective than stopping antipsychotic treatment in preventing relapse.
However, contrary to the researchers’ hypothesis, which was based on current literature, switching to another antipsychotic was just as effective as continuing an antipsychotic at the standard dose.
Switching to another antipsychotic “does not increase the risk of relapse. This result was not expected, as previous literature suggested otherwise,” Giovanni Ostuzzi, MD, PhD, with the University of Verona (Italy), said in an interview.
“On the other hand, reducing the dose below the standard range used in the acute phase carries a tangible risk of relapse, and should be limited to selected cases, for example those where the risk of withdrawing the treatment altogether is particularly high,” Dr. Ostuzzi said.
“These results should inform evidence-based guidelines, considering that clinical practices for relapse prevention are still heterogeneous and too often guided by clinical common sense only,” he added.
The study was published online in Lancet Psychiatry.
Guideline update warranted
The researchers evaluated the effect of different antipsychotic treatment strategies on risk for relapse in a network meta-analysis of 98 randomized controlled trials (RCTs) involving nearly 14,000 patients.
Compared to stopping the antipsychotic, all continuation strategies were effective in preventing relapse.
The risk for relapse was largely (and similarly) reduced when continuing the antipsychotic at the standard dose or switching to a different antipsychotic (relative risk [RR], 0.37 and 0.44, respectively), the researchers found.
Both strategies outperformed reducing the antipsychotic dose below the standard range (RR, 0.68).
For every three patients continuing an antipsychotic at standard doses, one additional patient will avoid relapse, compared with patients stopping an antipsychotic, “which can be regarded as a large-effect magnitude according to commonly used thresholds and results from RCTs in acute schizophrenia,” the researchers write.
The number needed to treat (NNT) slightly increased to about 3.5 for patients who switched antipsychotic treatment – “still regarded as a large-effect magnitude,” they note.
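As a back-of-the-envelope illustration (not a figure reported by the investigators), the NNT is the reciprocal of the absolute risk reduction (ARR), so an NNT of roughly 3 implies an absolute difference in relapse risk of about one-third between stopping and continuing treatment; the probabilities below are illustrative notation only.

\[
\text{NNT} = \frac{1}{\text{ARR}} = \frac{1}{p_{\text{relapse, stop}} - p_{\text{relapse, continue}}}, \qquad \text{NNT} \approx 3 \;\Rightarrow\; \text{ARR} \approx 0.33
\]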
“Currently, most psychiatrists are aware of the benefits of continuing antipsychotics in clinically stable individuals. However, they might face the necessity of changing the ongoing treatment strategy, generally because of burdening side effects, poor adherence, or both,” said Dr. Ostuzzi.
More to the story
In an accompanying editorial, Marieke J.H. Begemann, PhD, University Medical Center Groningen (the Netherlands), and colleagues note that the large number of patients included in the analysis provides “great credibility” to the findings, which are “trustworthy and important, yet only tell part of the story.”
They note that, while tapering information was often missing, antipsychotic discontinuation was probably abrupt for about two-thirds of the included studies.
“The issue of slow versus swift tapering is not yet settled, as there is a scarcity of RCTs that provide very gradual tapering over several months,” the editorialists write.
To fill this gap, several randomized trials are now in progress to specifically address the effects of gradual tapering or discontinuation vs. antipsychotic maintenance treatment in clinically stable schizophrenia.
“Time is pressing, as patients, their families, and clinicians need evidence-based data to weigh up the risks and benefits of maintaining, switching, or reducing medication with respect to a range of outcomes that are important to them, including social functioning, cognition, physical health, sexual health, and quality of life, thus going well beyond relapse prevention,” the editorialists note.
“Schizophrenia-spectrum disorders are heterogeneous with a largely unpredictable course, and we have known for a long time that a substantial proportion of patients who experienced a first psychosis can manage without antipsychotic medication. The challenge for future research is therefore to identify this subgroup on the basis of individual characteristics and guide them in tapering medication safely,” they add.
The study had no funding source. Dr. Ostuzzi reports no relevant financial relationships. A complete list of author disclosures is available with the original article. The editorialists have reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM THE LANCET PSYCHIATRY
Lifestyle medicine eases anxiety symptoms
Despite the availability of effective treatment strategies, including pharmacotherapy, psychotherapy, and combination therapy, the prevalence of anxiety continues to increase, especially in low-income and conflict-ridden countries, Vincent Wing-Hei Wong, a PhD student at The Chinese University of Hong Kong, and colleagues wrote.
Previous studies have shown that lifestyle factors including diet, sleep, and sedentary behavior are involved in the development of anxiety symptoms, but the impact of lifestyle medicine (LM) as a treatment for anxiety has not been well studied, they wrote.
In a meta-analysis published in the Journal of Affective Disorders, the researchers identified 53 randomized, controlled trials with a total of 18,894 participants. Anxiety symptoms were measured using self-report questionnaires including the Hospital Anxiety and Depression Scale, the Depression Anxiety and Stress Scale, and the Generalized Anxiety Disorder–7. Random-effects models were used to assess the effect of the intervention immediately post treatment and at short-term follow-up (1-3 months post treatment), medium-term follow-up (4-6 months), and long-term follow-up (7 months or more).
The studies included various combinations of LM intervention involving exercise, stress management, and sleep management. The interventions ranged from 1 month to 4 years, with an average duration of 6.3 months.
Overall, patients randomized to multicomponent LM interventions showed significantly improved symptoms compared to controls immediately after treatment and at short-term follow-up (P < .001 for both).
However, no significant differences were noted between the multicomponent LM intervention and control groups at medium-term follow-up, the researchers said. Only one study included data on long-term effects, so these effects were not evaluated in a meta-analysis, and more research is needed.
In a subgroup analysis, the effect was greatest for individuals with moderate anxiety symptoms at baseline (P < .05). “Our results could perhaps be explained by the occurrence of floor effect; those with higher baseline anxiety symptoms have greater room for improvement relative to those with fewer symptoms,” the researchers wrote.
The study findings were limited by several factors including the overall risk of bias and publication bias for the selected studies, as well as the limited degree of improvement because most patients had minimal anxiety symptoms at baseline, the researchers noted. Other limitations included the small number of studies for subgroup comparisons and the use of self-reports.
However, the results were strengthened by the use of broad search terms to capture multiple lifestyle determinants, and the diverse study populations and backgrounds from individuals in 19 countries.
The results support findings from previous studies and underscore the value of multicomponent LM interventions for patients with anxiety immediately after treatment and in the short term, the researchers emphasized.
“The LM approach, which leverages a range of universal lifestyle measures to manage anxiety and other common mental disorders such as depression, may be a viable solution to address the huge mental health burden through empowering individuals to practice self-management,” they concluded.
However, the researchers acknowledged the need for more randomized, controlled trials targeting patients with higher baseline anxiety levels or anxiety disorders, and using technology to improve treatment adherence.
The study received no outside funding. The researchers had no financial conflicts to disclose.
FROM THE JOURNAL OF AFFECTIVE DISORDERS
Fertility rates lower in disadvantaged neighborhoods
A new study ties the odds of conception to the advantages of the neighborhood a woman lives in.
In a cohort of more than 6,000 women who were trying to get pregnant without fertility treatments, the probability of conception was reduced 21%-23% per menstrual cycle when comparing the most disadvantaged neighborhoods with the least disadvantaged.
“When disadvantaged neighborhood status was categorized within each state (as opposed to nationally), the results were slightly larger in magnitude,” wrote authors of the study published online in JAMA Network Open.
Among 6,356 participants, 3,725 pregnancies were observed for 27,427 menstrual cycles of follow-up. Average age was 30, and most participants were non-Hispanic White (5,297 [83.3%]) and had not previously given birth (4,179 [65.7%]).
When the researchers compared the top and bottom deciles of disadvantaged neighborhood status, adjusted fecundability ratios (the per-cycle probability of conception) were 0.79 (95% confidence interval [CI], 0.66-0.96) for national-level area deprivation index (ADI) rankings and 0.77 (95% CI, 0.65-0.92) for within-state ADI rankings. ADI score includes population indicators related to educational attainment, housing, employment, and poverty.
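To connect the two figures above, a quick sketch of the arithmetic (not additional study data): the per-cycle reduction in the probability of conception is simply one minus the fecundability ratio.

\[
1 - 0.79 = 0.21 \ (21\%\ \text{nationally}), \qquad 1 - 0.77 = 0.23 \ (23\%\ \text{within state})
\]

which corresponds to the 21%-23% reduction cited earlier.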
“These findings suggest that investments in disadvantaged neighborhoods may yield positive cobenefits for fertility,” the authors wrote.
The researchers used the Pregnancy Study Online, for which baseline data were collected from women in the United States from June 19, 2013, through April 12, 2019.
In the United States, 10%-15% of reproductive-aged couples experience infertility, defined as the inability to conceive after a year of unprotected intercourse.
Reason behind the numbers unclear
Mark Hornstein, MD, director in the reproductive endocrinology division of Brigham and Women’s Hospital and professor at Harvard Medical School, both in Boston, said in an interview that this study gives the “what” but the “why” is harder to pinpoint.
What is not known, he said, is what kind of access the women had to fertility counseling or treatment.
The association between fertility and neighborhood advantage status is very plausible given the well-established links between disadvantaged regions and poorer health outcomes, he said, adding that the authors make a good case for their conclusions in the paper.
The authors ruled out many potential confounders, such as age of the women, reproductive history, multivitamin use, education level, household income, and frequency of intercourse, and still there was a difference between disadvantaged and advantaged neighborhoods, he noted.
Dr. Hornstein said his own research team has found that lack of knowledge about insurance coverage regarding infertility services may keep women from seeking the services.
“One of the things I worry about is access,” he said. “[The study authors] didn’t really look at that. They just looked at what the chances were that they got pregnant. But they didn’t say how many of those women had a workup, an evaluation, for why they were having difficulty, if they were, or had treatment. So I don’t know if some or all or none of that difference that they saw from the highest neighborhood health score to the most disadvantaged – if that was from inherent problems in the area, access to the best health care, or some combination.”
Discussions have focused on changing personal behaviors
Discussions on improving fertility often center on changing personal behaviors, the authors noted. “However, structural, political, and environmental factors may also play a substantial role,” they wrote.
The findings are in line with previous research on the effect of stress on in vitro fertilization outcomes, they pointed out. “Perceived stress has been associated with poorer in vitro fertilization outcomes and reduced fecundability among couples attempting spontaneous conception,” the authors noted.
Studies also have shown that living in a disadvantaged neighborhood is linked with comorbidities during pregnancy, such as increased risks of gestational hypertension (risk ratio for lowest vs. highest quartile: 1.24 [95% CI, 1.14-1.35]) and poor gestational weight gain (relative risk for lowest vs. highest quartile: 1.1 [95% CI, 1.1-1.2]).
In addition, policies such as those that support civil rights, protect the environment, and invest in underresourced communities have been shown to improve health markers such as life expectancy.
Policy decisions can also perpetuate a cycle of stress, they wrote. Disadvantaged communities may have more air pollution, which has been shown to have negative effects on fertility. Unemployment has been linked with decreased population-level fertility rates. Lack of green space may result in fewer areas to reduce stress.
A study coauthor reported grants from the National Institutes of Health during the conduct of the study; nonfinancial support from Swiss Precision Diagnostics GmbH, Labcorp, Kindara.com, and FertilityFriend.com; and consulting for AbbVie outside the submitted work. No other author disclosures were reported. Dr. Hornstein reported no relevant financial relationships.
Will ESD replace EMR for large colorectal polyps?
Dear colleagues,
Resection of polyps is the bread and butter of endoscopy. Advances in technology have enabled us to tackle larger and more complex lesions throughout the gastrointestinal tract, especially through endoscopic mucosal resection (EMR). Endoscopic submucosal dissection (ESD) is another technique that offers much promise for complex colorectal polyps and is being rapidly adopted in the West. But do its benefits outweigh the costs in time, money, and additional training needed for successful ESD? How can we justify higher recurrence rates with EMR when ESD is available? Will reimbursement continue to favor EMR? In this issue of Perspectives, Dr. Alexis Bayudan and Dr. Craig A. Munroe make the case for adopting ESD, while Dr. Sumeet Tewani highlights all the benefits of EMR. I invite you to a great debate and look forward to hearing your own thoughts on Twitter @AGA_GIHN and by email at [email protected].
Gyanprakash A. Ketwaroo, MD, MSc, is assistant professor of medicine at Baylor College of Medicine, Houston. He is an associate editor for GI & Hepatology News.
The future standard of care
BY ALEXIS BAYUDAN, MD, AND CRAIG A. MUNROE, MD
Endoscopic submucosal dissection (ESD) is a minimally invasive, organ-sparing, flexible endoscopic technique used to treat advanced neoplasia of the digestive tract, with the goal of en bloc resection for accurate histologic assessment. ESD was introduced over 25 years ago in Japan by a small group of innovative endoscopists.1 After its initial adoption and success with removing gastric lesions, ESD later evolved as a technique used for complete resection of lesions throughout the gastrointestinal tract.
The intent of ESD is to achieve clear pathologic evaluation of deep and lateral margins, which is generally lost when piecemeal EMR (pEMR) is performed on lesions larger than 2 cm. With growing global experience, the evidence is clear that ESD is advantageous when compared to pEMR in the resection of large colorectal lesions en bloc, leading to improved curative resection rates and less local recurrence.
From our own experience, and from the results of many studies, we know that although procedure time in ESD can be longer, the rate of complete resection is far superior. ESD was previously cited as having a 10% risk of perforation in the 1990s and early 2000s, but current rates are closer to 4.5%, as noted by Niimi et al., with nearly complete successful treatment with endoscopic closure.1 In a 2021 meta-analysis reviewing a total of 21 studies, Lim et al. demonstrated that, although there is an increased risk of perforation with ESD compared to EMR (risk ratio, 7.597; 95% confidence interval, 4.281-13.479; P < .001), there is no significant difference in bleeding risk between the two techniques.2
Since its inception, many refinements of the ESD technique have occurred through technology, and better understanding of anatomy and disease states. These include, but are not limited to, improvements in hemostatic and closure techniques, electrosurgical equipment, resection and traction devices, the use of carbon dioxide, the ability to perform full-thickness endoscopic surgery, and submucosal lifting.1 The realm of endoscopic innovation is moving at a rapid pace within commercial and noncommercial entities, and advancements in ESD devices will allow for further improvements in procedure times and decreased procedural complications. Conversely, there have been few advancements in EMR technique in decades.
Further developments in ESD will continue to democratize this intervention, so that it can be practiced in all medical centers, not just expert centers. However, for ESD to become standard of care in the Western world, it will require more exposure and training. ESD has rapidly spread throughout Japan because of the master-mentor relationship that fosters safe learning, in addition to an abundance of highly skilled EMR-experienced physicians who went on to acquire their skills under the supervision of ESD experts. Current methods of teaching ESD, such as using pig models to practice specific steps of the procedure, can be implemented in Western gastroenterology training programs and through GI and surgical society training programs to learn safe operation in the third space. Mentorship and proctorship are also mandatory. The incorporation of ESD into standard practice over pEMR is very akin to laparoscopic cholecystectomy revolutionizing gallbladder surgery, even though open cholecystectomy was known to be effective.
A major limitation in the adoption of ESD in the West is reimbursement. Despite mounting evidence of the superiority of ESD in well-trained hands, and the additional training needed to safely perform these procedures, there had not been a pathway for payment commensurate with the increased requirements needed to perform these procedures safely.3 This has led more endoscopists in the West to perform pEMR, which is anti-innovative. In October 2021, the Centers for Medicare and Medicaid Services expanded the reimbursement for ESD (Healthcare Common Procedure Coding System C9779). The availability of billing codes paves the way for increasing patient access to these therapies. Hopefully, additional codes will follow.
With the mounting evidence demonstrating that ESD is superior to pEMR in terms of curative resection and recurrence rates, we think it is time for ESD to be incorporated widely into Western practice. ESD is still evolving and improving, and it will become both safer and more effective. ESD has revolutionized endoscopic resection, and we are just beginning to see the possibilities and value of these techniques.
Dr. Bayudan is a second-year fellow, and Dr. Munroe is an associate professor, both at the University of California, San Francisco. They have no relevant conflicts of interest.
References
1. Ferreira J et al. Clin Colon Rectal Surg. 2015 Sep;28(3):146-51.
2. Lim X et al. World J Gastroenterol. 2021 Jul 7;27(25):3925-39.
3. Iqbal S et al. World J Gastrointest Endosc. 2020 Jan 16;12(1):49-52.
More investment than payoff
Most large colorectal polyps are best managed by endoscopic mucosal resection (EMR) and do not require endoscopic submucosal dissection (ESD). EMR can provide complete, safe, and effective removal, preventing colorectal cancer while avoiding the risks of surgery or ESD. EMR has several advantages over ESD. It is minimally invasive, low cost, well tolerated, and allows excellent histopathologic examination. It is performed during colonoscopy in an outpatient endoscopy lab or ambulatory surgery center. Several techniques can be performed safely and efficiently using accessories that are readily available. EMR is easier to learn and perform, carries lower risks, and requires fewer resources. Endoscopists can effectively integrate EMR into a busy practice without making significant additional investments.
EMR of large adenomas is associated with lower morbidity, mortality, and cost compared with surgery.1-3 I first carefully inspect the lesion to plan the approach and to exclude submucosal invasion; lesions with suspected invasion should instead be referred for ESD or surgery. This includes assessing the size, location, morphology, and surface characteristics, using high-definition and narrow-band imaging or Fujinon intelligent chromoendoscopy. Conventional EMR utilizes submucosal injection to lift the polyp away from the underlying muscle layer before hot snare resection. Injection needles and snares of various shapes, sizes, and stiffness are available in most endoscopy labs. The goal is en bloc resection to achieve potential cure with complete histological assessment and a low rate of recurrence. This can be achieved for lesions up to 2 cm in size, although larger lesions require piecemeal resection, which limits accurate histopathology and carries a recurrence rate of up to 25%.1 Thermal ablation of the resection margins with argon plasma coagulation or snare-tip soft coagulation can reduce the rate of recurrence. Additionally, most recurrences are identified during early surveillance within 6 months and managed endoscopically. The rates of adverse events, including bleeding (6%-15%), perforation (1%-2%), and postpolypectomy syndrome (< 1%), remain acceptably low.1,4
For many polyps, saline injection is safe, effective, and inexpensive, but it dissipates rapidly with limited duration of effect. Alternative agents can improve the lift, at additional cost.4 I prefer adding dye, such as methylene blue, to differentiate the submucosa from the muscularis, demarcate the lesion margins, and allow easier inspection of the defect. Dilute epinephrine can also be added to reduce intraprocedural bleeding and maintain a clean resection field. I reserve this for the duodenum, but it can be an important adjunct for some colorectal polyps. Submucosal injection also allows assessment for a “nonlifting sign,” which raises suspicion for invasive carcinoma but can also occur with benign submucosal fibrosis from previous biopsy, partial resection, or adjacent tattoo. In these cases, effective management can still be achieved using EMR in combination with avulsion and thermal ablation techniques.
Alternative techniques include cold EMR and underwater EMR.1,4 These are gaining popularity because of their excellent safety profile and favorable outcomes. Cold EMR involves submucosal injection followed by cold-snare resection, eliminating the use of cautery and its associated risks. Cold EMR is very safe and effective for small polyps, and we use it for progressively larger polyps given the low complication rate. Despite the need for piecemeal resection of polyps larger than 10 mm, local recurrence rates are comparable to those of conventional EMR. Sessile serrated polyps are especially well suited to piecemeal cold EMR. Meanwhile, underwater EMR eliminates the need for submucosal injection by utilizing water immersion, which elevates the mucosal lesion away from the muscularis layer. Either hot or cold snare resection can be performed. Benefits include reduced procedure time and cost, and relatively low complication and recurrence rates, compared with conventional EMR. I find this to be a nice option for laterally spreading polyps, with potentially higher rates of en bloc resection.1,4
ESD involves similar techniques but includes careful dissection of the submucosal layer beneath the lesion. In addition to the tools for EMR, a specialized electrosurgical knife is necessary, as well as dedicated training and mentorship that can be difficult to accommodate for an active endoscopist in practice. The primary advantage of ESD is higher en bloc resection rates for larger and potentially deeper lesions, with accurate histologic assessment and staging, and very low recurrence rates.1,4,5 However, ESD is more complex, technically challenging, and time and resource intensive, with a higher risk of complications. Intraprocedural bleeding is common and requires immediate management. Additional risks include a 2% risk of delayed bleeding and a 5% risk of perforation.1,5 ESD typically requires an operating room, longer procedure times, and higher costs, including surgical, anesthesia, and nursing costs. Some of this may be balanced by a reduced frequency of surveillance and therapeutic procedures. While both EMR and ESD offer significant cost savings compared with surgery, ESD is additionally disadvantaged by a lack of reimbursement.
Regardless of the technique, EMR is easier to learn and perform than ESD, uses a limited number of readily available devices, and carries a lower cost burden. EMR is successful for most colorectal polyps, with the primary disadvantage being piecemeal resection of larger polyps. The rates of adverse events are lower, and appropriate surveillance is essential to confirm complete resection and address any recurrence. Japanese and European guidelines endorse ESD for lesions with a high likelihood of cancer invading the submucosa and for lesions that cannot be removed by EMR because of submucosal fibrosis. Ultimately, each patient needs to be treated individually with the most appropriate technique.
Dr. Tewani of Rockford Gastroenterology Associates is clinical assistant professor of medicine at the University of Illinois, Rockford. He has no relevant conflicts of interest to disclose.
References
1. Rashid MU et al. Surg Oncol. 2022 Mar 18;101742.
2. Law R et al. Gastrointest Endosc. 2016 Jun;83(6):1248-57.
3. Backes Y et al. BMC Gastroenterol. 2016 May 26;16(1):56.
4. Thiruvengadam SS et al. Gastroenterol Hepatol. 2022 Mar;18(3):133-44.
5. Wang J et al. World J Gastroenterol. 2014 Jul 7;20(25):8282-7.
Commentary: Evaluating New Treatments and Cardiovascular Risk in PsA, July 2022
Advanced targeted therapies have demonstrated safety and efficacy superior to conventional therapies, often dramatically improving signs and symptoms. However, it is also desirable that such expensive therapies show benefit in other outcomes, such as work productivity and quality of life. To evaluate work productivity, daily activity impairment, and health-related quality of life in patients with inflammatory arthritis (rheumatoid arthritis, n = 95; PsA, n = 69; and axial spondyloarthritis, n = 95) treated with golimumab, Dejaco and colleagues conducted a prospective, multicenter study in Austria. A total of 110 of these patients were followed for 24 months. At 24 months after golimumab initiation, there was significant improvement in total work productivity, presenteeism, activity impairment, and quality-of-life scores. Thus, golimumab, in addition to reducing disease activity, improved work productivity, activity, and health-related quality of life in patients with inflammatory arthritis, including PsA.
Cardiovascular disease (CVD) remains a major comorbidity in patients with PsA. This observation was once again confirmed in an observational, cross-sectional, case-control study including 207 patients with PsA and 414 matched controls from France. Degboe and colleagues demonstrated that patients with PsA had a higher prevalence of cardiovascular events and cardiovascular risk factors, such as high body mass index, triglyceride level, and hypertension, compared with controls. The proportion of patients with PsA estimated to be at very high cardiovascular risk (≥ 10%) increased when the SCORE (European Society of Cardiology Systematic Coronary Risk Evaluation) and QRISK2 (British Heart Foundation) equations accounted for the additional risk attributable to PsA. However, risk prediction scores such as SCORE and QRISK2 perform poorly in patients with PsA. To identify novel inflammatory and metabolic parameters associated with cardiovascular disease, Schwartz and colleagues examined 18F-fluorodeoxyglucose (FDG) PET-CT uptake in a cross-sectional analysis of a prospective study including 39 patients with biologic-treatment-naive PsA and 56 age- and sex-matched controls without PsA. They found that coronary artery disease (CAD) was significantly associated with visceral adiposity and with FDG uptake in the bone marrow, liver, spleen, and subcutaneous adipose tissue. Thus, inflammatory and metabolic parameters, including visceral adiposity, potentially contribute to subclinical CAD in patients with PsA and may in the future be used to refine CVD risk estimates and serve as targets for CAD preventive treatments.
ASCO outlines optimal treatments for patients with metastatic clear cell renal cell carcinoma
This year in the U.S., an estimated 79,000 men and women will be diagnosed with kidney cancer. Clear cell renal cell carcinoma (ccRCC) is the most common subtype and is a leading source of morbidity and mortality. It is also a common setting for studying new targeted molecular therapies. The resulting influx of phase III trial results has transformed ccRCC care. “We have an array of different treatment options, and the structure in which these treatments would be applied needed to have expert input. It is an exciting time for kidney cancer, and optimizing therapy now has an immediate and meaningful impact on patients’ longevity,” guideline lead author W. Kimryn Rathmell, MD, PhD, said in an interview.
“The key developments are the emergence of targeted therapies in addition to immune therapies as well as combinations. The order of treatments matters and different patients will have different needs,” said Dr. Rathmell, professor of medicine at Vanderbilt University, Nashville, Tenn.
Dr. Rathmell highlighted the section of the guideline that discusses the need for robust tissue-based diagnosis, as well as active surveillance or cytoreductive nephrectomy. She also emphasized sections on differentiating courses of treatment and on planning therapy using the International Metastatic RCC Database Consortium (IMDC) risk model.
Cytoreductive nephrectomy is an option for patients with one risk factor in whom a significant majority of the tumor burden is within the kidney. These patients should also have good Eastern Cooperative Oncology Group (ECOG) performance status and no brain, bone, or liver metastases.
Some patients with metastatic ccRCC can be offered active surveillance. Defining characteristics include favorable or intermediate risk, few or no symptoms, a favorable histologic profile, a long interval between nephrectomy and onset of metastasis, and a low burden of metastatic disease.
The guidelines also discuss the need to stratify patients within risk groups. Patients rated as intermediate or poor risk in the first-line setting should be treated with two immune checkpoint inhibitors (ICIs) or an ICI combined with a vascular endothelial growth factor receptor tyrosine kinase inhibitor (VEGFR TKI). An ICI combined with a VEGFR TKI can also be appropriate for patients with favorable risk who require systemic therapy. Those with favorable-risk disease, or those for whom combination therapy is inappropriate because of another medical condition, can be offered monotherapy with a VEGFR TKI or an ICI. Another first-line option is high-dose interleukin-2, but there are no established criteria for determining which patients are most likely to benefit.
The guideline discusses second- or later-line therapy options, metastasis-directed therapies, and metastatic subsets including bone, brain, and sarcomatoid carcinomas.
The authors pointed out that significant disparity exists among patients with ccRCC, with some patients having much less access to health care because of racial, geographic, or socioeconomic inequities. There are also known biases within ccRCC care: females are less likely than males to receive systemic therapy and more likely to receive no treatment; African Americans are less likely to receive systemic therapy; and non-Hispanic African Americans and Hispanics less often undergo cytoreductive nephrectomy. African Americans with ccRCC also have a pattern of worse survival outcomes than do Whites.
The recommendations cannot be applied to renal cell carcinoma with non-clear cell histology.
“It is important to be comfortable with all of the treatment options for ccRCC, because applying them in the best order, and with the most informed ability to determine efficacy, will have a real impact on patient survival. We are near a goal to offer cure to an increasing number of patients, so choosing therapies that offer that option when it may be possible is important, and when cure is not on the table, we can rationally select therapies that allow patients to have more time with their families, with side effects that are manageable,” Dr. Rathmell said.
The IMDC risk stratification methodology needs to be more widely used in routine practice, Dr. Rathmell said.
“The impact was not significant in patient care until we reached a point of having multiple competing options for treatment. The stakes are higher now, so using this resource is important until we get to the next level with biological classifications,” she said.
Similarly, since the stakes are so high, having an accurate diagnosis is important. Even experts in the field can be fooled by imaging findings, and over- or undertreatment of patients has a major impact on outcomes. “This is a message that we need to share for establishing best practices,” she said.
“Just because we have agents, the time to use them is as important as the selection of agent. Similarly, for the cytoreductive nephrectomy issue, new data both clarified and caused some confusion. Not every patient has the luxury of a comprehensive and multidisciplinary tumor board, so we felt it was important to provide some guidance that helps in making those complex decisions,” Dr. Rathmell said.
Dr. Rathmell has no relevant financial disclosures.
FROM THE JOURNAL OF CLINICAL ONCOLOGY