Hospital safety culture may influence surgical outcomes
SAN DIEGO – Hospital safety culture may positively influence certain surgical patient outcomes, results from a study of 56 Illinois hospitals demonstrated.
“Efforts to improve awareness of safety and quality improvement principles should be encouraged at both the surgical system and hospital levels,” David D. Odell, MD, said at the American College of Surgeons/National Surgical Quality Improvement Program National Conference. “Safety culture itself is a concept [that] is increasingly viewed as important in the delivery of high-quality care. Yet in the surgical world, very little is known about how hospital culture actually influences outcomes for our patients.”
Dr. Odell, a thoracic surgeon at Northwestern Memorial Hospital, Chicago, discussed results from a study by the Illinois Surgical Quality Improvement Collaborative, a group of Illinois hospitals working together to improve the quality of surgical care in the state. The Collaborative comprises 56 hospitals, including all academic medical centers in the state and 11 rural hospitals. Combined, these facilities perform 60% of general surgery operations and 80% of all complex operations in the state, affecting more than 600,000 patients each year.
To evaluate the relationship between hospital safety culture and surgical patient outcomes, Dr. Odell and his associates invited staff of Collaborative members to complete the Safety Attitudes Questionnaire (SAQ), a 56-item validated tool for assessing hospital culture. The domains assessed included teamwork, communication, engagement, and leadership. The SAQ was given to administrators, staff, and front-line providers “to measure safety culture across all levels of the hospital,” Dr. Odell said. The percentage of positive responses was calculated at the hospital level for each of the eight domains and combined into a composite measure of safety culture, and the researchers measured the impact of safety culture through these positive SAQ response rates. Outcome variables of interest were morbidity, mortality, death or serious morbidity, and readmission. Hospital-level risk-adjusted event rates and linear regression models were used to assess the impact of safety culture while controlling for teaching status, rural location, trauma center designation, hospital control (management), and annual surgical volume.
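For readers who want to see the shape of such an analysis, here is a minimal sketch of a hospital-level percent-positive calculation followed by a covariate-adjusted linear regression. It is illustrative only: the file names, column names, and model specification are assumptions, not the study's actual data dictionary or code.

```python
# Hypothetical sketch of the hospital-level analysis described above.
# Column names (hospital_id, domain, positive, morbidity_rate, etc.) are
# illustrative assumptions, not the study's actual variable names.
import pandas as pd
import statsmodels.formula.api as smf

# One row per survey item response: 1 = positive, 0 = not positive.
responses = pd.read_csv("saq_responses.csv")   # hospital_id, domain, positive

# Percent-positive per hospital per SAQ domain, then a composite score,
# taken here as the mean across the eight domains.
domain_scores = (responses
                 .groupby(["hospital_id", "domain"])["positive"]
                 .mean()
                 .unstack("domain"))
domain_scores["composite"] = domain_scores.mean(axis=1)

# Hospital-level risk-adjusted outcome rates and structural covariates.
hospitals = pd.read_csv("hospitals.csv").set_index("hospital_id")
df = hospitals.join(domain_scores["composite"])

# Linear regression of the risk-adjusted morbidity rate on safety culture,
# controlling for the hospital characteristics named in the article.
model = smf.ols(
    "morbidity_rate ~ composite + teaching + rural + trauma_center"
    " + control_type + surgical_volume",
    data=df,
).fit()
print(model.summary())
```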
Of the 49 participating hospitals represented in the survey responses, 49% had an Accreditation Council for Graduate Medical Education (ACGME)-accredited residency program, 12% were rural, 61% provided trauma care, 35% had a religious affiliation, 57% were “other” not-for-profit, and the mean total surgical volume was 11,412 cases.
Dr. Odell reported that by domain, SAQ responses were most positive for operating room safety and lowest for hospital management. “That doesn’t necessarily reflect the management’s outcomes only, but the views of those who took the survey toward management,” he said.
When the researchers evaluated the impact of a more-positive safety culture on the risk-adjusted outcome measures, they observed a statistically significant impact on morbidity following surgery (P = .02). The trend was similar, although not statistically significant, for death/serious morbidity (P = .08), mortality (P = .20), and readmission (P = .68).
Dr. Odell acknowledged certain limitations of the study, including its retrospective design and the fact that the SAQ is a subjective assessment tool. “Not all [staff invited] were surveyed,” he added. “We sent out just under 1,400 surveys and we had a response rate of 44%.”
Staff from participating institutions of the Collaborative meet semiannually to share ideas, celebrate successes, and learn from each other’s experiences, Dr. Odell said. Ongoing efforts to improve safety culture include fostering mentorship in quality improvement and process improvement endeavors, as well as providing educational materials targeted at all levels of hospital staff “so that we can get everyone thinking and speaking the same language when it comes to quality improvement,” he said.
The Collaborative is funded by Blue Cross Blue Shield of Illinois. Dr. Odell reported having no financial disclosures.
EXPERT ANALYSIS AT THE ACS NSQIP NATIONAL CONFERENCE
Key clinical point: A positive hospital safety culture significantly impacted morbidity following surgery.
Major finding: When the researchers evaluated the impact of a positive safety culture on risk-adjusted outcome measures, they observed a statistically significant impact on morbidity following surgery (P = .02).
Data source: A retrospective study by the Illinois Surgical Quality Improvement Collaborative, a group of 56 hospitals in the state.
Disclosures: The Collaborative is funded by Blue Cross Blue Shield of Illinois. Dr. Odell reported having no financial disclosures.
Revised axial spondyloarthritis classification criteria remain elusive
DENVER – The revised axial spondyloarthritis classification criteria that U.S. rheumatologists say they desperately need remain elusive, with no firm path to creation.
The 2009 classification criteria for axial spondyloarthritis (SpA), developed by the Assessment of SpondyloArthritis international Society (ASAS), were a landmark in creating a definition of axial SpA for enrolling patients into clinical trials and, potentially, for diagnosis.
But the 2009 classification criteria also have several shortcomings, especially from a U.S. perspective, that have limited the utility of the criteria in U.S. practice and studies.
Several years of discussions among North American rheumatologists have produced consensus about what is wrong with the classification criteria: The clinical elements are not specific enough, with sensitivity and specificity each only about 80%-85%; the imaging component is not fully up to date, considering, for example, only osteitis and not other MRI abnormalities such as fatty metaplasia and T1 structural lesions; the criteria were based entirely on studies of European patients, making their applicability to other populations uncertain; and some of the terminology is confusing, Liron Caplan, MD, said during a discussion of the criteria at the annual meeting of the Spondyloarthritis Research and Treatment Network (SPARTAN).
But while the SpA experts who gathered at the meeting, and who have been discussing this issue since 2013, are of one mind on the problems, they have yet to agree on exactly how to solve them.
“Everything is still on the table,” said Dr. Caplan, a rheumatologist and director of the SpA program at the University of Colorado in Aurora.
A year ago, at SPARTAN’s prior annual meeting, circumstances looked on track for a quicker resolution. In a vote of the membership at the 2015 gathering, 64% backed creation of new classification criteria, and Dr. Caplan proposed quickly preparing a research proposal to present in early 2016 to the American College of Rheumatology and the European League Against Rheumatism for funding.
But in the ensuing year, the effort got mired in discussions on how to perform the study needed to gather the evidence base for criteria revisions. Some progress occurred, most notably an agreement between SPARTAN officials and the ASAS leadership to work together on the revision, said Atul A. Deodhar, MD, professor of medicine and medical director of the rheumatology clinics at the Oregon Health & Science University in Portland.
The SPARTAN and ASAS representatives also agreed on several specifics of what’s needed, including a weighting formula for the 10 clinical criteria in the SpA classification. Currently, in individuals with low back pain lasting 3 months or longer and an age of onset younger than 45 years, a case is classified as SpA either when any one of the 10 clinical criteria is combined with MRI or radiographic evidence of sacroiliitis, or, without any imaging, when the individual is HLA-B27 positive and fulfills two items from the 10-item list of clinical criteria.
It’s the lack of radiographic imaging confirmation, so-called “nonradiographic SpA,” that generates the greatest concern about below-par specificity and that may be helped by a weighting adjustment that varies the classification power of each of the 10 clinical items. A more ideal balance might be criteria with sensitivity reduced from the current level to about 70% and with specificity rising to about 90%, Dr. Caplan suggested.
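To make the discussion concrete, here is a minimal sketch of the current two-arm decision rule alongside a weighted variant of the kind under discussion. The weighting function is purely hypothetical: no weights or threshold have been agreed on, and the feature names are placeholders.

```python
# Sketch of the current ASAS 2009 decision rule and a hypothetical weighted
# variant. The weights and feature names are invented placeholders; no
# actual weighting formula has been agreed on by SPARTAN or ASAS.
from typing import Mapping

def asas_2009(imaging_sacroiliitis: bool, hla_b27: bool,
              n_clinical_features: int) -> bool:
    """Current rule, for patients with >=3 months of back pain, onset <45 y."""
    if imaging_sacroiliitis and n_clinical_features >= 1:
        return True                      # imaging arm
    if hla_b27 and n_clinical_features >= 2:
        return True                      # clinical arm ("nonradiographic")
    return False

def weighted_rule(features: Mapping[str, bool],
                  weights: Mapping[str, float],
                  threshold: float) -> bool:
    """Hypothetical weighted variant: each clinical item contributes a
    weight rather than counting equally, and classification requires the
    weighted sum to clear a threshold tuned toward higher specificity."""
    score = sum(weights[name] for name, present in features.items() if present)
    return score >= threshold
```

Raising the threshold in such a scheme trades sensitivity for specificity, which is the direction Dr. Caplan suggested (roughly 70% sensitivity and 90% specificity).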
The keystone of any revision will be a large, new study with SpA patients from various locations. Dr. Caplan emphasized that “no firm decisions have yet been made” regarding the study's name or research focus.
On Twitter @mitchelzoler
This article was updated August 11, 2016.
EXPERT ANALYSIS FROM THE 2016 SPARTAN ANNUAL MEETING
Salvage RT may reduce risk of prostate cancer metastasis even at low PSA levels
Increasing prostate-specific antigen (PSA) levels prior to salvage radiotherapy were significantly associated with an increased risk of distant metastasis and cause-specific mortality but not overall survival among men with detectable PSA following radical prostatectomy, according to a report published in the Journal of Clinical Oncology.
Investigators retrospectively studied 1,106 consecutive patients with surgically staged prostate cancer who received salvage radiotherapy (SRT) and who had a documented PSA of 0.1 ng/mL or greater. A total of 208 patients developed distant metastasis during median follow-up of 8.9 years.
In multivariate analysis, each doubling of pre-SRT PSA was associated with a 32% increased risk of distant metastasis (P less than .001), reported Bradley J. Stish, MD, and his associates at the Mayo Clinic (J Clin Oncol. 2016 Aug. doi: 10.1200/JCO.2016.68.3425).
Each pre-SRT PSA doubling also significantly increased the relative risk of cause-specific mortality (HR, 1.40; P less than .001) but not overall survival (HR, 1.12; P = .02).
More advanced tumor stages, higher Gleason scores, and higher PSA levels prior to salvage radiotherapy were significantly associated with an increased risk of biochemical recurrence. Salvage radiotherapy dose greater than 68 Gy and use of androgen suppression were associated with reduced risk of biochemical recurrence. Overall survival was 92.9% (95% confidence interval, 91.3%-94.5%) at 5 years and 77.3% (95% CI, 74.2%-80.5%) at 10 years. The cause-specific mortality rates were 3.0% (95% CI, 1.9%-4.0%) at 5 years and 10.4% (95% CI, 8.0%-12.6%) at 10 years.
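The “per doubling” framing comes from entering PSA into the hazard model on a logarithmic (base-2) scale, so the hazard ratio compounds with each doubling of PSA. A minimal sketch, assuming illustrative column names and the lifelines library rather than the investigators’ actual code:

```python
# Minimal sketch of a per-doubling hazard model; file and column names are
# illustrative assumptions, not the study's actual variable set.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("srt_cohort.csv")        # psa, time_to_met, met_event, ...
df["log2_psa"] = np.log2(df["psa"])       # HR for log2_psa = risk per doubling

cph = CoxPHFitter()
cph.fit(df[["log2_psa", "time_to_met", "met_event"]],
        duration_col="time_to_met", event_col="met_event")
cph.print_summary()

# Interpreting an HR of 1.32 per doubling: starting salvage radiotherapy at
# a PSA of 0.4 ng/mL instead of 0.1 ng/mL is two doublings, so the modeled
# relative risk of distant metastasis is about 1.32 ** 2, roughly 1.74.
```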
“When taken together, these findings provide strong evidence supporting the clinical benefits attributable to early [salvage radiotherapy] in men with detectable PSA after [radical prostatectomy],” investigators wrote.
Beginning salvage radiotherapy treatment at the lowest PSA level is “most beneficial for long-term therapeutic efficacy,” they added.
On Twitter @jessnicolecraig
FROM THE JOURNAL OF CLINICAL ONCOLOGY
Key clinical point: Salvage radiotherapy should not be delayed by prolonged monitoring of detectable PSA levels after radical prostatectomy.
Major finding: A multivariate analysis revealed that each doubling of prostate-specific antigen prior to salvage radiotherapy was significantly associated with an increased risk of distant metastasis (hazard ratio, 1.32; 95% CI, 1.19-1.46; P less than .001) and cause-specific mortality (HR, 1.40; 95% CI, 1.21-1.63; P less than .001).
Data source: A retrospective study of 1,106 men with detectable prostate-specific antigen levels following radical prostatectomy for prostate cancer.
Disclosures: The Mayo Clinic funded this study. Six of the investigators had no disclosures to report; one investigator reported holding stock or ownership interest in Pfizer.
ACS NSQIP Geriatric Surgery Pilot Collaborative: Evaluating variables for inclusion
SAN DIEGO – A project to assess geriatric-specific variables for inclusion in the American College of Surgeons/National Surgical Quality Improvement Program (ACS NSQIP) is underway, Thomas N. Robinson, MD, reported at the ACS NSQIP National Conference.
Dr. Robinson discussed results from the ACS NSQIP Geriatric Surgery Pilot Collaborative, an effort launched in January 2014 with the ultimate goal of evaluating specific geriatric variables for incorporation into the ACS NSQIP set of essential variables collected by all participating hospitals.
Dr. Robinson, professor of surgery at the University of Colorado, Denver, said that 23 clinical sites in the United States are currently studying the following geriatric-specific variables in surgery patients aged 65 and older (a sketch of one possible encoding follows the list):
• Origin from home with support (to determine baseline functional status: lives alone at home, lives with support in home, origin status not from home).
• Discharge functional health status (ability to perform activities of daily living).
• Discharge with/without services (to capture care needs upon discharge).
• Preoperative use of a mobility aid.
• Preoperative history of prior falls.
• Postoperative history of pressure ulcer.
• Fall risk on discharge.
• New mobility aid on discharge.
• History of dementia.
• Competency status on admission.
• Postoperative delirium (yes or no).
• Hospice care on admission (yes or no).
• Do Not Resuscitate (DNR) order in place on admission (yes or no).
• DNR order during hospitalization (yes or no).
• Setting where DNR order was placed.
• Postoperative palliative care consult (yes or no).
• 30-day postoperative outcomes: functional health status (ability to perform activities of daily living), physical function compared with baseline, and living location.
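As a rough illustration of the list above, the pilot’s variables could be encoded as a simple record type. This is a hypothetical sketch: the field names, types, and enumeration values are assumptions, not the collaborative’s actual data dictionary.

```python
# Hypothetical encoding of the pilot's geriatric variables as a record type.
# Field names and types are illustrative assumptions only.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Origin(Enum):
    HOME_ALONE = "lives alone at home"
    HOME_WITH_SUPPORT = "lives with support in home"
    NOT_FROM_HOME = "origin status not from home"

@dataclass
class GeriatricSurgeryRecord:
    origin: Origin                           # baseline functional status
    discharge_functional_status: str         # ability to perform ADLs
    discharged_with_services: bool           # care needs upon discharge
    preop_mobility_aid: bool
    preop_falls_history: bool
    postop_pressure_ulcer: bool
    fall_risk_on_discharge: bool
    new_mobility_aid_on_discharge: bool
    dementia_history: bool
    competency_on_admission: bool
    postop_delirium: bool
    hospice_on_admission: bool
    dnr_on_admission: bool
    dnr_during_stay: bool
    dnr_setting: Optional[str]               # where the DNR order was placed
    postop_palliative_consult: bool
    functional_status_30d: Optional[str]     # 30-day postoperative outcomes
    physical_function_vs_baseline_30d: Optional[str]
    living_location_30d: Optional[str]
```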
The number of surgery cases in the collaborative grew from 7,235 in the first 6 months of 2014 to 24,835 cases in the last 6 months of 2015. The top 10 operations were total joint arthroplasty (29%), colectomy (12%), spine (8%), hip fracture (7%), carotid endarterectomy (4%), hysterectomy (4%), lung resection (2%), open lower extremity bypass (2%), laparoscopic cholecystectomy (2%), and pancreatectomy (2%).
Dr. Robinson reported that the rate of preoperative dementia among cases studied in the collaborative was 10%. “The incorporation of dementia into a surgical dataset represents an important step forward in providing quality surgical care for the elderly,” he said. “Dementia is a global public health concern.” He went on to note that patients with dementia have a 2.5-fold increased risk of developing postoperative delirium, making it “the perfect place to start a quality project. One in three cases of delirium is preventable. In our data set, delirium is associated with a hospital stay that’s 4 days longer, an increased chance of requiring discharge to an institutional care facility, an increased chance of a serious complication, and a higher 30-day mortality.”
Simple, low-tech bedside interventions such as ambulating in the hall three times a day, orienting the person, having the person sleep at night rather than sleep during the day, and avoiding medications with high risk for adverse events in older adults can prevent postoperative delirium, Dr. Robinson said.
One way that the Geriatric Surgery Pilot Collaborative can improve the surgical care of older adults is by fostering quality programs initiated at the participating local hospitals. “Preserving function after hospital stays is a first major goal,” he said. Another strategy involves creating a multidisciplinary frailty assessment to aid with decision making and risk assessment. “This takes into consideration NSQIP variables such as function, nutrition, comorbidity burden, cognition, social vulnerability, and mobility,” he said. The final and ultimate goal of the geriatric surgery collaborative is to establish a foundation of quality measurement for the Coalition for Quality in Geriatric Surgery, a project initiated by the American College of Surgeons to systematically improve the surgical care of older adults.
Dr. Robinson reported having no relevant financial disclosures.
EXPERT ANALYSIS AT THE ACS NSQIP NATIONAL CONFERENCE
ESR1 mutations found prognostic but not predictive in metastatic breast cancer
CHICAGO – Mutation of the estrogen receptor 1 (ESR1) gene, one of the mechanisms whereby tumors become resistant to endocrine therapy, may be prognostic but not predictive in women with hormone receptor–positive advanced breast cancer that has progressed on this therapy, two studies showed.
In a cohort of women who had experienced progression on a first-line aromatase inhibitor, those with ESR1 mutations detected in circulating cell-free DNA at the time of progression had a 70% higher risk of progression-free survival events and a 90% higher risk of death thereafter. But the presence of this mutation did not predict benefit from subsequent therapy.
Similarly, in an analysis of women who had experienced progression on prior endocrine therapy and were treated on the randomized PALOMA-3 trial with fulvestrant plus either palbociclib (a cyclin-dependent kinase inhibitor) or placebo, adding the drug reduced the risk of progression-free survival events similarly, by about 50%, regardless of the presence of ESR1 mutations before starting therapy.
Findings in context
These new findings can help guide decisions about which patients should be tested for ESR1 mutations, according to invited discussant Sarat Chandarlapaty, MD, PhD, a medical oncologist at the Memorial Sloan Kettering Cancer Center in New York.
“Putting it all together … we see ESR1 mutations arise in the metastatic setting subclonally with prolonged exposure to low-estrogen environments,” he said. “If we are going to do testing, it makes sense to do it in the setting in which there is prior exposure to an aromatase inhibitor in metastatic ER-positive breast cancer.”
The studies’ results also help clarify what the finding of an ESR1 mutation means for patient prognosis and choice of next therapy, Dr. Chandarlapaty said at the annual meeting of the American Society of Clinical Oncology.
“It’s clear from two large studies that ESR1 mutation prognosticates poorer and shorter survival, so just the finding alone may aid as sort of a clinical risk assessment for physicians,” he said. “For the question of prediction, I would say the weight of evidence—even though the clinical studies are small—all the way from biology to clinic is that ESR1-mutant patients are unlikely to benefit from a second-line aromatase inhibitor.”
However, “the question of whether testing should be made available in practice on the basis of this particular clinical decision is more complicated,” Dr. Chandarlapaty said. “For one, is second-line aromatase inhibitor alone a widely used option? Secondly, does the adoption now of palbociclib in the first-line setting change the biology and the nature of resistance at this later line – in other words, are we going to see patients going on to a second-line aromatase inhibitor after they’ve had a prior aromatase inhibitor plus palbociclib?”
Mutations after progression on first-line aromatase inhibitors
In the first study, Florian Clatot, MD, PhD, a medical oncologist at the Centre Henri Becquerel, University of Normandy, Rouen, France, and colleagues retrospectively studied 144 women who had experienced progression on a first-line aromatase inhibitor.
The investigators used droplet digital polymerase chain reaction (PCR) testing to screen for four ESR1 mutations in circulating cell-free DNA collected before, at the time of, and after progression.
Overall, 30.6% of women were found to have at least one of these mutations at the time of progression, Dr. Clatot reported. The prevalence was higher the longer women had been on the aromatase inhibitor.
After progression, most women went on to receive chemotherapy or alternative endocrine therapy with either the selective estrogen receptor modulator tamoxifen or the estrogen receptor antagonist fulvestrant (Faslodex). With a median follow-up of 40 months, multivariate analyses showed that the group with ESR1 mutations at progression had higher risks of subsequent progression-free survival events (hazard ratio, 1.7; P = .008) and death (hazard ratio, 1.9; P = .002).
However, the mutations were not predictive: Women having one fared more poorly, whether given chemotherapy or given tamoxifen or fulvestrant.
Kinetic analyses showed that 75% of the ESR1 mutations seen at progression were already detectable 3 and even 6 months before that event. “Most of the mutations detected before progression increased while aromatase inhibitor therapy was ongoing,” Dr. Clatot commented. “These results suggest that the preclinical detection of ESR1 circulating mutation may [be of] clinical interest.”
In most women who had mutations at progression, the detectable mutation load decreased over the subsequent 3 months of therapy, becoming undetectable in about half of the cases with a reduction. All of the women with an increase in mutational burden progressed on their next therapy, compared with only about 40% of those with a decrease.
“Taken together, these results suggest that the selection pressure provided by aromatase inhibitor exposure is one of the main mechanisms of ESR1 mutation increase,” concluded Dr. Clatot. “ESR1 mutations are a strong and independent marker of poor prognosis but do not have any predictive value with the treatments used in our cohort.”
Mutations after progression on prior endocrine therapy
In the second study, Nicholas C. Turner, MD, PhD, a medical oncologist at the Royal Marsden Hospital and Institute of Cancer Research, London, and colleagues analyzed data from a subset of 395 women from PALOMA-3.
The randomized phase III trial tested fulvestrant combined with palbociclib (Ibrance), an oral inhibitor of cyclin-dependent kinases 4 and 6, or placebo. All of the women enrolled had experienced progression on prior endocrine therapy in the adjuvant, advanced, or metastatic setting.
The investigators looked for 12 ESR1 mutations in circulating tumor DNA from baseline plasma samples using the BEAMing (beads, emulsion, amplification, magnetics) digital PCR technique and droplet digital PCR screening.
Overall, 27% of the women had ESR1 mutations before starting therapy on the trial, Dr. Turner reported. Mutations were seen in those who had received a prior aromatase inhibitor, but not in those who had received only prior tamoxifen. “So it’s quite clear these mutations are not a mechanism of resistance to tamoxifen, suggesting that tamoxifen must have at least some activity against these mutations,” he commented.
In addition, ESR1 mutations were more common in patients who had been sensitive versus not to prior endocrine therapy of any type (30.3% vs. 12.8%) and in patients who had been sensitive versus not specifically to prior aromatase inhibitor therapy (34.6% vs. 11.1%).
Stratified analyses showed that palbociclib was similarly superior to placebo in terms of progression-free survival whether patients were positive for an ESR1 mutation (9.4 vs. 4.1 months; hazard ratio, 0.52; P = .0052) or negative (9.5 vs. 3.8 months; hazard ratio, 0.44; P less than .0001).
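An analysis of this stratified form can be sketched in a few lines: fit the treatment hazard ratio separately within each ESR1-mutation stratum. The file and column names are illustrative assumptions, and this is not the trial’s actual statistical code.

```python
# Sketch of a stratified progression-free survival analysis: estimate the
# palbociclib-vs-placebo hazard ratio within each ESR1-mutation stratum.
# Column names are illustrative assumptions, not the PALOMA-3 dataset.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("paloma3_subset.csv")  # pfs_months, progressed, palbociclib, esr1_mut

for mutated, group in df.groupby("esr1_mut"):
    cph = CoxPHFitter()
    cph.fit(group[["palbociclib", "pfs_months", "progressed"]],
            duration_col="pfs_months", event_col="progressed")
    hr = cph.hazard_ratios_["palbociclib"]
    print(f"ESR1 mutation = {bool(mutated)}: HR for palbociclib = {hr:.2f}")
```

Similar hazard ratios in both strata, as reported here (0.52 and 0.44), are what support the conclusion that the benefit of adding palbociclib does not depend on ESR1 mutation status.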
“Detection of estrogen receptor mutations was strongly associated with acquired resistance to prior aromatase inhibitors,” Dr. Turner said.
“Palbociclib offered high efficacy regardless of the estrogen receptor mutation status,” he added. “And because in this patient population estrogen receptor mutations are detected frequently, the combination of palbociclib and fulvestrant presents an attractive treatment option for patients who have been previously treated with and progressed on aromatase inhibitors.”
Similarly, in an analysis of women who had experienced progression on prior endocrine therapy and were treated on the randomized PALOMA-3 trial with fulvestrant plus either palbociclib (a cyclin-dependent kinase inhibitor) or placebo, adding the drug reduced the risk of progression-free survival events similarly, by about 50%, regardless of the presence of ESR1 mutations before starting therapy.
Findings in context
These new findings can help guide decisions about which patients should be tested for ESR1 mutations, according to invited discussant Sarat Chandarlapaty, MD, PhD, a medical oncologist at the Memorial Sloan Kettering Cancer Center in New York.
“Putting it all together … we see ESR1 mutations arise in the metastatic setting subclonally with prolonged exposure to low-estrogen environments,” he said. “If we are going to do testing, it makes sense to do it in the setting in which there is prior exposure to an aromatase inhibitor in metastatic ER-positive breast cancer.”
The studies’ results also help clarify what the finding of an ESR1 mutation means for patient prognosis and choice of next therapy, Dr. Chandarlapaty said at the annual meeting of the American Society of Clinical Oncology.
“It’s clear from two large studies that ESR1 mutation prognosticates poorer and shorter survival, so just the finding alone may aid as sort of a clinical risk assessment for physicians,” he said. “For the question of prediction, I would say the weight of evidence—even though the clinical studies are small—all the way from biology to clinic is that ESR1-mutant patients are unlikely to benefit from a second-line aromatase inhibitor.”
However, “the question of whether testing should be made available in practice on the basis of this particular clinical decision is more complicated,” Dr. Chandarlapaty said. “For one, is second-line aromatase inhibitor alone a widely used option? Secondly, does the adoption now of palbociclib in the first-line setting change the biology and the nature of resistance at this later line – in other words, are we going to see patients going on to a second-line aromatase inhibitor after they’ve had a prior aromatase inhibitor plus palbociclib?”
Mutations after progression on first-line aromatase inhibitors
In the first study, Florian Clatot, MD, PhD, a medical oncologist at the Centre Henri Becquerel, University of Normandy, Rouen, France, and colleagues retrospectively studied 144 women who had experienced progression on a first-line aromatase inhibitor.
The investigators used droplet digital polymerase chain reaction (PCR) testing to screen for four ESR1 mutations in circulating cell-free DNA collected before, at the time of, and after progression.
Overall, 30.6% of women were found to have at least one of these mutations at the time of progression, Dr. Clatot reported. The prevalence was higher the longer women had been on the aromatase inhibitor.
After progression, most women went on to receive chemotherapy or alternative endocrine therapy with either the selective estrogen receptor modulator tamoxifen or the estrogen receptor antagonist fulvestrant (Faslodex). With a median follow-up of 40 months, multivariate analyses showed that the group with ESR1 mutations at progression had higher risks of subsequent progression-free survival events (hazard ratio, 1.7; P = .008) and death (hazard ratio, 1.9; P = .002).
However, the mutations were not predictive: Women with a mutation fared more poorly whether they received chemotherapy, tamoxifen, or fulvestrant.
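For readers connecting the hazard ratios above to the “70% higher” and “90% higher” risk phrasing used earlier in this story, a minimal Python sketch of the arithmetic follows; the loop and labels are ours, not the investigators’.

# A hazard ratio of x corresponds to roughly (x - 1) * 100% higher risk.
for label, hr in [("progression-free survival events", 1.7), ("death", 1.9)]:
    print(f"HR {hr} -> about {round((hr - 1) * 100)}% higher risk of {label}")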
Kinetic analyses showed that 75% of the ESR1 mutations seen at progression were already detectable 3 and even 6 months before that event. “Most of the mutations detected before progression increased while aromatase inhibitor therapy was ongoing,” Dr. Clatot commented. “These results suggest that the preclinical detection of ESR1 circulating mutation may [be of] clinical interest.”
Most women who had mutations at progression saw the detectable mutation level fall over the subsequent 3 months of therapy; in about half of the cases with a reduction, the mutation became undetectable. All of those with an increase in mutational burden had progression on their next therapy, compared with only about 40% of those with a decrease in burden.
“Taken together, these results suggest that the selection pressure provided by aromatase inhibitor exposure is one of the main mechanisms of ESR1 mutation increase,” concluded Dr. Clatot. “ESR1 mutations are a strong and independent marker of poor prognosis but do not have any predictive value with the treatments used in our cohort.”
Mutations after progression on prior endocrine therapy
In the second study, Nicholas C. Turner, MD, PhD, a medical oncologist at the Royal Marsden Hospital and Institute of Cancer Research, London, and colleagues analyzed data from a subset of 395 women from PALOMA-3.
The randomized phase III trial tested fulvestrant combined with palbociclib (Ibrance), an oral inhibitor of cyclin-dependent kinases 4 and 6, or placebo. All of the women enrolled had experienced progression on prior endocrine therapy in the adjuvant, advanced, or metastatic setting.
The investigators looked for 12 ESR1 mutations in circulating tumor DNA from baseline plasma samples using the BEAMing (beads, emulsion, amplification, magnetics) digital PCR technique and droplet digital PCR screening.
Overall, 27% of the women had ESR1 mutations before starting therapy on the trial, Dr. Turner reported. Mutations were seen in those who had received a prior aromatase inhibitor, but not in those who had received only prior tamoxifen. “So it’s quite clear these mutations are not a mechanism of resistance to tamoxifen, suggesting that tamoxifen must have at least some activity against these mutations,” he commented.
In addition, ESR1 mutations were more common in patients who had been sensitive to prior endocrine therapy of any type than in those who had not (30.3% vs. 12.8%), and likewise for sensitivity specifically to prior aromatase inhibitor therapy (34.6% vs. 11.1%).
Stratified analyses showed that palbociclib was similarly superior to placebo in terms of progression-free survival whether patients were positive for an ESR1 mutation (9.4 vs. 4.1 months; hazard ratio, 0.52; P = .0052) or negative (9.5 vs. 3.8 months; hazard ratio, 0.44; P < .0001).
“Detection of estrogen receptor mutations was strongly associated with acquired resistance to prior aromatase inhibitors,” Dr. Turner said.
“Palbociclib offered high efficacy regardless of the estrogen receptor mutation status,” he added. “And because in this patient population estrogen receptor mutations are detected frequently, the combination of palbociclib and fulvestrant presents an attractive treatment option for patients who have been previously treated with and progressed on aromatase inhibitors.”
AT THE 2016 ASCO ANNUAL MEETING
Key clinical point: An ESR1 mutation in circulating DNA after progression on endocrine therapy was a marker for poor prognosis but did not predict benefit from subsequent therapy.
Major finding: Women with ESR1 mutations had poorer progression-free and overall survival (hazard ratios, 1.7 and 1.9). Adding palbociclib to fulvestrant halved the risk of progression-free survival events, regardless of the presence of an ESR1 mutation.
Data source: A retrospective cohort study of 144 women with metastatic breast cancer who had experienced progression on a first-line aromatase inhibitor, and an analysis of 395 women with advanced breast cancer from a randomized trial testing addition of palbociclib to fulvestrant after progression on prior endocrine therapy (PALOMA-3 trial).
Disclosures: Dr. Clatot disclosed that he receives research funding from Novartis. Dr. Turner disclosed that he receives honoraria from and has a consulting or advisory role with AstraZeneca, Pfizer, and Roche Pharma; Pfizer sponsored PALOMA-3, and AstraZeneca provided the fulvestrant.
National Program Reduces Catheter-Associated Urinary Tract Infections
Clinical question: Can a program of education, feedback, and proper training reduce catheter-associated urinary tract infections in hospitalized patients?
Bottom line: The Comprehensive Unit-based Safety Program, or CUSP, is a national program in the United States that aims to reduce catheter-associated urinary tract infections (CAUTIs) by focusing on proper technical skills, behavioral changes, education, and feedback. Implementation of the CUSP recommendations was effective in reducing catheter use and CAUTIs in patients in nonintensive care units (non-ICUs). The program was likely successful because it included both socioadaptive and technical changes and allowed the individual hospitals to customize interventions based on their own needs.
Reference: Saint S, Greene MT, Krein SL, et al. A program to prevent catheter-associated urinary tract infection in acute care. N Engl J Med 2016;374(22):2111-2119.
Design: Case series; LOE: 2b
Setting: Inpatient (any location)
Synopsis: This study reports the results of an 18-month program to reduce CAUTIs that was implemented in 926 inpatient units in 603 acute-care U.S. hospitals (10% of the acute-care hospitals in the country). Overall, 40% of the units were ICUs, and the remainder were non-ICUs.
Key recommendations of the program included the following: (1) assessing for the presence and need for a urinary catheter daily, (2) avoiding the use of a urinary catheter while emphasizing alternative urine-collection methods, and (3) promoting proper insertion and maintenance of catheters, when necessary. Hospitals were allowed to decide how best to implement these interventions in their individual units. Furthermore, participating unit teams received education on the prevention of CAUTIs as well as feedback on catheter use and the rate of CAUTIs on their individual units.
Data were collected over a 3-month baseline phase, a 2-month implementation phase, and a 12-month sustainability phase. After adjustment for hospital characteristics, the rate of CAUTIs decreased from 2.40 infections per 1000 catheter-days at the end of the baseline phase to 2.05 infections per 1000 catheter-days at the end of the sustainability phase. The reduction was statistically significant only in non-ICUs, where CAUTIs decreased from 2.28 to 1.54 infections per 1000 catheter-days and catheter use decreased from 20.1% to 18.8%. Because this was not a randomized controlled trial, confounding variables, including secular trends, may have affected the findings.
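For readers who want to reproduce the arithmetic behind these figures, a minimal Python sketch follows; the function name and the example unit are ours (the published data are aggregate rates, not patient-level counts).

def rate_per_1000_catheter_days(infections, catheter_days):
    # Device-associated infection rate as conventionally reported
    return infections / catheter_days * 1000

# A hypothetical unit with 12 CAUTIs over 5,000 catheter-days matches
# the baseline rate reported in the study:
print(rate_per_1000_catheter_days(12, 5000))  # 2.4

# Relative reductions implied by the published aggregate rates:
print(f"overall: {(2.40 - 2.05) / 2.40:.1%}")  # 14.6%
print(f"non-ICU: {(2.28 - 1.54) / 2.28:.1%}")  # 32.5%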
Dr. Kulkarni is an assistant professor of hospital medicine at Northwestern University in Chicago.
Procalcitonin Guidance Safely Decreases Antibiotic Use in Critically Ill Patients
Clinical question: Can the use of procalcitonin levels to determine when to discontinue antibiotic therapy safely reduce the duration of antibiotic use in critically ill patients?
Bottom line: For patients in the intensive care unit (ICU) who receive antibiotics for presumed or proven bacterial infections, the use of procalcitonin levels to determine when to stop antibiotic therapy results in decreased duration and consumption of antibiotics without increasing mortality.
Reference: De Jong E, Van Oers JA, Beishuizen A, et al. Efficacy and safety of procalcitonin guidance in reducing the duration of antibiotic treatment in critically ill patients: a randomised, controlled, open-label trial. Lancet Infect Dis 2016;16(7):819-827.
Design: Randomized controlled trial (nonblinded); LOE: 1b
Setting: Inpatient (ICU only)
Synopsis: To test the efficacy and safety of procalcitonin-guided antibiotic therapy, these investigators recruited patients in the ICU who had received their first doses of antibiotics for a presumed or proven bacterial infection within 24 hours of enrollment. Patients who were severely immunosuppressed and patients requiring prolonged courses of antibiotics (such as those with endocarditis) were excluded.
Using concealed allocation, patients were assigned to procalcitonin-guided treatment (n = 761) or to usual care (n = 785). The usual care group did not have procalcitonin levels drawn. In the procalcitonin group, patients had a procalcitonin level drawn close to the start of antibiotic therapy and daily thereafter until discharge from the ICU or 3 days after stopping antibiotic use. These levels were provided to the attending physician, who could then decide whether to stop giving antibiotics.
Although the study protocol recommended that antibiotics be discontinued if the procalcitonin level had decreased by more than 80% of its peak value or reached a level of 0.5 mcg/L, the ultimate decision to do so was at the discretion of the attending physician. Overall, fewer than half the physicians actually discontinued antibiotics within 24 hours of reaching either of these goals. Despite this, the procalcitonin group had fewer days of antibiotic treatment (5 days vs 7 days; between-group absolute difference = 1.22; 95% CI, 0.65-1.78; P < .0001) and lower consumption of antibiotics (7.5 vs 9.3 defined daily doses; between-group absolute difference = 2.69; 95% CI, 1.26-4.12; P < .0001). Additionally, for 28-day mortality, the procalcitonin group was noninferior to the standard group and ultimately had fewer deaths (20% vs 25%; between-group absolute difference = 5.4%; 95% CI, 1.2-9.5; P = .012). This mortality benefit persisted at 1 year.
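As a minimal sketch of the protocol’s stopping rule, assuming only the two thresholds reported above (a more-than-80% decline from the peak value, or an absolute level of 0.5 mcg/L), the illustrative Python function below flags when discontinuation would be recommended; in the trial itself, the final decision remained with the attending physician.

def stop_antibiotics_recommended(levels_mcg_per_l):
    # levels_mcg_per_l: daily procalcitonin values in mcg/L, oldest first
    peak = max(levels_mcg_per_l)
    current = levels_mcg_per_l[-1]
    # Protocol criteria: >80% decrease from peak, or a level of 0.5 mcg/L
    return current < 0.2 * peak or current <= 0.5

# Hypothetical example: a patient improving on antibiotics
print(stop_antibiotics_recommended([8.0, 6.5, 3.1, 1.2]))  # True (85% drop from peak)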
Dr. Kulkarni is an assistant professor of hospital medicine at Northwestern University in Chicago.
AGA Clinical Practice Update: Experts carve pathway for celiac trials
Celiac disease’s only treatment is far from ideal. Not only is the gluten-free diet hard to follow, costly, and socially isolating, but even adherent patients can suffer persistent and disabling symptoms, Daniel A. Leffler, MD, MS, of Beth Israel Deaconess Medical Center in Boston, and his associates noted. This “high unmet medical need” inspired their discussion of clinical trials in celiac disease at the third Gastroenterology Regulatory Endpoints and Advancement of Therapeutics (GREAT 3) meeting. A summary appears as a Clinical Practice Update in Gastroenterology (2016 Jul 22. doi: 10.1053/j.gastro.2016.07.025).
Celiac is the “outlier among intestinal diseases,” the experts noted: Few randomized trials have assessed nondietary celiac disease therapies, and drug developers lack a precedent approval to clarify a “guiding regulatory pathway.” The first step toward bridging these gaps is to better define target populations for clinical trials, meeting attendees agreed. Such groups might include patients with diet-refractory symptoms; newly diagnosed patients who need pharmacologic support for symptom resolution and duodenal healing; patients with asymptomatic mucosal damage, if such damage is shown to cause malignancy; and patients with neurobehavioral disorders that impede gluten avoidance. Medications that enable patients to safely consume gluten could benefit even more, including those who already face “arduous” dietary controls from comorbidities such as type 1 diabetes, the experts added.
Defining “clinical benefit” in pivotal trials of celiac disease also poses a challenge. “While every patient might desire something slightly different, the overarching theme of clinical benefit from the patient’s perspective appears to be quality of life – free of symptoms and inflammation without worry about [gluten] contamination,” attendees emphasized. Indeed, one recent survey found that patients prioritized protection against cross-contamination over being able to consume gluten at will. But initial trials should focus on gastrointestinal symptoms, which are common, affect patients of all ages, and “can be measured in a reasonable time frame,” they concluded.
Unfortunately, defining and measuring atypical symptoms can be difficult, particularly in children and adolescents. In an unpublished study at Nationwide Children’s Hospital, gastrointestinal symptoms affected 77% of celiac patients, while only about 5% experienced atypical or nongastrointestinal symptoms, the experts noted. Such low percentages could make it difficult to adequately power studies of these nongastrointestinal outcomes. Furthermore, measuring sequelae such as osteoporosis or anemia “would require longer studies, as these conditions do not resolve quickly.”
Phase II and III trials cannot secure marketing approvals without clearly defining and measuring “clinical benefit,” Food and Drug Administration representatives at the meeting noted. Accordingly, diarrhea and abdominal pain will be key patient-reported outcome measures, and pivotal trials of young children with celiac disease will require observer-reported outcomes, attendees agreed. Although both celiac disease and the gluten-free diet profoundly undercut health-related quality of life, this measure is too broad and easily confounded to be a good endpoint in pivotal trials of celiac therapies. Likewise, histology is valuable for relating symptoms to active disease, but mucosal healing is too variable and unpredictable to serve as a primary endpoint, experts noted.
In contrast, serologic tests could serve as enrollment criteria, stratification measures, and endpoints if these tests received the appropriate FDA approvals, experts asserted. The endomysial antibody, tissue transglutaminase antibody, and deamidated gliadin peptide tests are most often used in practice, but have only been approved for diagnosing celiac disease – not as a replacement for biopsy or as a measure of disease progression or therapeutic response, FDA representatives noted. Although drug developers need tools to measure therapeutic efficacy in celiac disease, FDA recommended soliciting advice from its Center for Drug Evaluation and Research before testing these tools or kicking off monitoring studies.
The meeting was supported by the Celiac Disease Foundation and Beyond Celiac, and was sponsored by the FDA Center for Drug Evaluation and Research; the American Gastroenterological Association; the American College of Gastroenterology; the American Society for Pediatric Gastroenterology, Hepatology, and Nutrition; and the North American Society for the Study of Celiac Disease. Dr. Leffler disclosed ties to Alba Therapeutics, Alvine Pharmaceuticals, INOVA Diagnostics, Coronado Biosciences, Pfizer, and GI Supply. The senior author, Dr. Sheila Crowe, disclosed ties to Alvine Pharmaceuticals, Ferring, and Celimmune.
FROM GASTROENTEROLOGY
Enhanced Radiation Dermatitis Associated With Concurrent Palliative Radiation and Vemurafenib Therapy
To the Editor:
Vemurafenib is a selective BRAF inhibitor that was approved by the US Food and Drug Administration (FDA) in August 2011 for the treatment of patients with unresectable or metastatic melanoma with the BRAF V600E mutation as detected by an approved test. Both malignant and nonmalignant cutaneous findings have been well documented in association with vemurafenib, including squamous cell carcinoma, keratoacanthomas, UVA photosensitivity, keratosis pilaris–like eruptions, seborrheic dermatitis, follicular plugging, follicular hyperkeratosis, and eruptive melanocytic nevi.1 As more patients with metastatic melanoma are treated with vemurafenib, the use of concomitant palliative or adjuvant radiation therapy with vemurafenib will inevitably occur with greater frequency. Therefore, it is critical to understand the potential cutaneous side effects of this combination.
A predisposition to enhanced radiation dermatitis has been well described with concurrent use of targeted chemotherapies such as the epidermal growth factor receptor inhibitor cetuximab with radiotherapy.2 We report a case of radiation dermatitis occurring shortly after initiating radiation therapy in a patient on vemurafenib.
A 53-year-old man presented with stage IIIB melanoma of the right buttock (Breslow depth, 2.2 mm, with histologic ulceration; mitotic index, 2/mm²). He underwent wide local excision and sentinel lymph node biopsy followed by complete lymph node dissection, with 2 of 10 lymph nodes positive. The patient subsequently underwent 1 year of adjuvant high-dose interferon therapy. Four years after his initial presentation he developed metastases to the lungs, pelvis, and both femurs. He was started on oral vemurafenib 960 mg twice daily. Due to painful bony metastases in the pelvis, the patient was also started on concurrent palliative radiation therapy to both femurs, the L5 vertebra, and the sacrum 1 day after initiation of vemurafenib. Three days after initiation of radiation therapy, at a cumulative dose of 0.75 Gy, the patient developed severe, painful, well-demarcated, erythematous plaques in the anterior and posterior pelvic distribution overlying the radiation field (Figure 1) that subsequently evolved into eroded desquamative plaques with copious transudate. The patient also developed hyperkeratotic papules on the chest and thighs consistent with the keratosis pilaris–like eruptions associated with vemurafenib therapy.1 Five months later the patient developed worsening neurologic symptoms, and magnetic resonance imaging of the brain revealed multiple brain metastases. Given his disease progression, vemurafenib was discontinued. Ten days later, the patient underwent palliative whole-brain radiation therapy. He received a total dose of 3.25 Gy to the whole brain without any cutaneous sequelae.
Radiation dermatitis results from a dose-dependent loss of basal and endothelial cells following irradiation.3 If surviving basal cells are able to repopulate the basal monolayer, normal skin barrier function is preserved. Dose tolerance is exceeded when cell loss occurs without replacement, resulting in necrosis and clinical evidence of radiation dermatitis, which is characterized by painful erythema or hyperpigmentation followed by desquamation and skin necrosis. In general, the occurrence and severity of radiation dermatitis when radiation therapy is used alone, in the absence of concurrent chemotherapy, is dose dependent, with cutaneous evidence of radiation dermatitis occurring at doses as low as 2 Gy but most commonly at 5 to 10 Gy.4 A report of radiation recall dermatitis in 2 patients who received vemurafenib after completing a full course of radiotherapy5 supports the hypothesis that vemurafenib is a radiosensitizing medication. Enhanced radiation dermatitis was reported in a single case of a patient on vemurafenib who developed radiation dermatitis after completing 3.25 Gy of radiation to the lumbar spine. Although this case likely depicted enhanced radiation dermatitis secondary to concurrent vemurafenib use, it was inconclusive whether vemurafenib contributed to the cutaneous effect, as the patient developed a skin reaction 1 week after receiving a cumulative radiation dose of 3.25 Gy, a level at which radiation alone has been shown to cause skin toxicity.6 In our patient, cutaneous manifestations were noted 3 days after initiation of radiation treatment, at which point he had received a total radiation dose of 0.75 Gy, well below the threshold commonly recognized to cause radiation-induced skin toxicities. In addition, rechallenge with higher-dose radiotherapy while off vemurafenib led to no skin toxicity, despite radiation dermatitis and alopecia being common side effects of whole-brain radiation therapy.7
The exact mechanism of increased radiosensitivity caused by targeted chemotherapies such as cetuximab and vemurafenib is unclear. One possible explanation is that the drug interferes with the mitogen-activated protein kinase (MAPK) pathway, which plays a crucial role in controlling cell survival and regeneration following radiation exposure.8 Disruption of this signaling pathway through targeted therapies leads to impaired keratinocyte cell survival and recovery, and thus may enhance susceptibility to radiation-induced skin injury (Figure 2). In vivo studies have demonstrated that the epidermal growth factor receptor is activated following UV irradiation in human keratinocytes, leading to activation of the downstream MAPK signal transduction pathway required for cellular proliferation mediated via the RAF family of proteins.9,10 Further supporting the importance of this pathway in keratinocyte survival and recovery are findings that somatic deletion of BRAF in fibroblasts results in decreased growth factor–induced MAPK activation and enhanced apoptosis,8 whereas activated BRAF has been shown to exert protective effects against oxidative stress as well as tumorigenesis.11 The observation that mutant BRAF melanoma cells demonstrated increased radiosensitivity following BRAF inhibition with vemurafenib12 is consistent with our hypothesis that increased radiosensitivity occurs when signal transduction mediated by the MAPK pathway is blocked, thereby inhibiting cell survival. As a result, radiation dermatitis is likely to occur more frequently and at a lower dose when signaling pathways upstream in the MAPK pathway required for keratinocyte regeneration, such as epidermal growth factor receptor and BRAF, are inhibited by targeted therapies. This hypothesis is consistent with the observation that patients on medications that inhibit these signaling pathways, such as cetuximab and vemurafenib, develop enhanced sensitivity to both UV radiation and radiation therapy.
We report a case of enhanced radiation dermatitis occurring at a total dose of 0.75 Gy of radiotherapy, well below the threshold commonly recognized to cause radiation-induced skin toxicities. Our observation suggests that vemurafenib likely acts as a radiosensitizing agent that notably decreases the threshold for radiotherapy-related skin toxicities. Furthermore, the radiosensitizing effect of vemurafenib appears to be transient, as our patient showed no evidence of any skin reaction to subsequent radiation treatment soon after vemurafenib was discontinued. As more patients with metastatic melanoma are treated with vemurafenib, the combination of palliative or adjuvant radiation therapy with vemurafenib will likely be used more frequently. Caution should be exercised in patients on vemurafenib who receive concurrent radiotherapy, even at low radiation doses.
- Huang V, Hepper D, Anadkat M, et al. Cutaneous toxic effects associated with vemurafenib and inhibition of the BRAF pathway. Arch Dermatol. 2012;148:628-633.
- Studer G, Brown M, Dalgueiro E, et al. Grade 3/4 dermatitis in head and neck cancer patients treated with concurrent cetuximab and IMRT. Int J Radiat Oncol Biol Phys. 2011;81:110-117.
- Archambeau JO, Pezner R, Wasserman T. Pathophysiology of irradiated skin and breast. Int J Radiat Oncol Biol Phys. 1995;31:1171-1185.
- Balter S, Hopewell JW, Miller DL, et al. Fluoroscopically guided interventional procedures: a review of radiation effects on patients’ skin and hair. Radiology. 2010;254:326-341.
- Boussemart L, Boivin C, Claveau J, et al. Vemurafenib and radiosensitization. JAMA Dermatol. 2013;149:855-857.
- Churilla TM, Chowdhry VK, Pan D, et al. Radiation-induced dermatitis with vemurafenib therapy. Pract Radiat Oncol. 2013;3:e195-e198.
- Anker CJ, Grossmann KF, Atkins MB, et al. Avoiding severe toxicity from combined BRAF inhibitor and radiation treatment: consensus guidelines from the Eastern Cooperative Oncology Group (ECOG). Int J Radiat Oncol Biol Phys. 2016;95:632-646.
- Dent P, Yacoub A, Fisher PB, et al. MAPK pathways in radiation responses. Oncogene. 2003;22:5885-5896.
- Cao C, Lus S, Jiang Q, et al. EGFR activation confers protections against UV-induced apoptosis in cultured mouse skin dendritic cells. Cell Signal. 2008;20:1830-1838.
- Xu Y, Shao Y, Zhou J, et al. Ultraviolet irradiation-induced epidermal growth factor receptor (EGFR) nuclear translocation in human keratinocytes. J Cell Biochem. 2009;107:873-880.
- Valerie K, Yacoub A, Hagan M, et al. Radiation-induced cell signaling: inside-out and outside-in. Mol Cancer Ther. 2007;6:789-801.
- Sambade M, Peters E, Thomas N, et al. Melanoma cells show a heterogeneous range of sensitivity to ionizing radiation and are radiosensitized by inhibition of B-RAF with PLX-4032. Radiother Oncol. 2011;98:394-399.
To the Editor:
Vemurafenib is a selective BRAF inhibitor that was approved by the US Food and Drug Administration (FDA) in August 2011 for the treatment of patients with unresectable or metastatic melanoma with the BRAF V600E mutation as detected by an approved test. Both malignant and nonmalignant cutaneous findings have been well documented in association with vemurafenib, including squamous cell carcinoma, keratoacanthomas, UVA photosensitivity, keratosis pilaris–like eruptions, seborrheic dermatitis, follicular plugging, follicular hyperkeratosis, and eruptive melanocytic nevi.1 As more patients with metastatic melanoma are treated with vemurafenib, the use of concomitant palliative or adjuvant radiation therapy with vemurafenib will inevitably occur in greater frequency. Therefore, it is critical to understand the potential cutaneous side effects of this combination.
A predisposition to enhanced radiation dermatitis has been well described with concurrent use of targeted chemotherapies such as the epidermal growth factor receptor inhibitor cetuximab with radiotherapy.2 We report a case of radiation dermatitis occurring shortly after initiating radiation therapy in a patient on vemurafenib.
To the Editor:
Vemurafenib is a selective BRAF inhibitor that was approved by the US Food and Drug Administration (FDA) in August 2011 for the treatment of patients with unresectable or metastatic melanoma with the BRAF V600E mutation as detected by an FDA-approved test. Both malignant and nonmalignant cutaneous findings have been well documented in association with vemurafenib, including squamous cell carcinoma, keratoacanthomas, UVA photosensitivity, keratosis pilaris–like eruptions, seborrheic dermatitis, follicular plugging, follicular hyperkeratosis, and eruptive melanocytic nevi.1 As more patients with metastatic melanoma are treated with vemurafenib, the use of concomitant palliative or adjuvant radiation therapy will inevitably occur with greater frequency. Therefore, it is critical to understand the potential cutaneous side effects of this combination.
A predisposition to enhanced radiation dermatitis has been well described with concurrent use of targeted chemotherapies such as the epidermal growth factor receptor inhibitor cetuximab with radiotherapy.2 We report a case of radiation dermatitis occurring shortly after initiating radiation therapy in a patient on vemurafenib.
A 53-year-old man with initial stage IIIB melanoma on the right buttock, Breslow depth 2.2 mm with histologic ulceration and a mitotic index of 2/mm², underwent wide local excision and sentinel lymph node biopsy followed by complete lymph node dissection, which revealed 2 of 10 positive lymph nodes. The patient subsequently underwent 1 year of adjuvant high-dose interferon therapy. Four years after his initial presentation he developed metastases to the lungs, pelvis, and both femurs, and he was started on oral vemurafenib 960 mg twice daily. Due to painful bony metastases in the pelvis, the patient also was started on concurrent palliative radiation therapy to both femurs, the L5 vertebra, and the sacrum 1 day after initiation of vemurafenib. Three days after initiation of radiation therapy, at a cumulative radiation dose of 0.75 Gy, the patient developed severe, painful, well-demarcated, erythematous plaques in the anterior and posterior pelvic distribution overlying the radiation field (Figure 1) that subsequently evolved into eroded desquamative plaques with copious transudate. The patient also developed hyperkeratotic papules on the chest and thighs consistent with the keratosis pilaris–like eruptions associated with vemurafenib therapy.1 Five months later the patient developed worsening neurologic symptoms, and magnetic resonance imaging of the brain revealed multiple brain metastases. Given his disease progression, vemurafenib was discontinued. Ten days later, the patient underwent palliative whole-brain radiation therapy, receiving a total dose of 3.25 Gy to the whole brain without any cutaneous sequelae.
Radiation dermatitis results from a dose-dependent loss of basal and endothelial cells following irradiation.3 If surviving basal cells are able to repopulate the basal monolayer, normal skin barrier function is preserved. Dose tolerance is exceeded when cells are lost without replacement, resulting in necrosis and clinical evidence of radiation dermatitis, which is characterized by painful erythema or hyperpigmentation followed by desquamation and skin necrosis. In general, when radiation therapy is used alone in the absence of concurrent chemotherapy, the occurrence and severity of radiation dermatitis are dose dependent, with cutaneous findings appearing at doses as low as 2 Gy but most commonly at 5 to 10 Gy.4 A report of radiation recall dermatitis in 2 patients who received vemurafenib after completing a full course of radiotherapy5 supports the hypothesis that vemurafenib is a radiosensitizing medication. Enhanced radiation dermatitis also was reported in a single patient on vemurafenib who developed dermatitis after completing 3.25 Gy of radiation to the lumbar spine; however, it was inconclusive whether vemurafenib contributed to the cutaneous effect, as the reaction appeared 1 week after a cumulative radiation dose of 3.25 Gy, a level at which radiation alone has been shown to cause skin toxicity.6 In our patient, cutaneous manifestations were noted 3 days after initiation of radiation treatment, at which point he had received a total radiation dose of only 0.75 Gy, well below the threshold commonly recognized to cause radiation-induced skin toxicity. In addition, rechallenge with higher-dose radiotherapy after vemurafenib was discontinued produced no skin toxicity, even though radiation dermatitis and alopecia are common side effects of whole-brain radiation therapy.7
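As a rough quantitative check (our back-of-the-envelope arithmetic; the fractionation schedule was not recorded in this level of detail, so one fraction per day is an assumption), the per-fraction dose in our patient works out to

dose per fraction = 0.75 Gy ÷ 3 fractions = 0.25 Gy,

and even the cumulative 0.75 Gy is just over a third of the approximately 2-Gy lower bound cited above for radiation dermatitis from radiotherapy alone.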
The exact mechanism of increased radiosensitivity caused by targeted chemotherapies such as cetuximab and vemurafenib is unclear. One possible explanation is that these drugs interfere with the mitogen-activated protein kinase (MAPK) pathway, which plays a crucial role in controlling cell survival and regeneration following radiation exposure.8 Disruption of this signaling pathway by targeted therapies impairs keratinocyte survival and recovery and thus may enhance susceptibility to radiation-induced skin injury (Figure 2). In vivo studies have demonstrated that the epidermal growth factor receptor is activated following UV irradiation in human keratinocytes, leading to activation of the downstream MAPK signal transduction pathway required for cellular proliferation, mediated via the RAF family of proteins.9,10 Further supporting the importance of this pathway in keratinocyte survival and recovery are findings that somatic deletion of BRAF in fibroblasts results in decreased growth factor–induced MAPK activation and enhanced apoptosis,8 whereas activated BRAF has been shown to exert protective effects against oxidative stress as well as tumorigenesis.11 The observation that mutant BRAF melanoma cells demonstrated increased radiosensitivity following BRAF inhibition with vemurafenib12 is consistent with our hypothesis that radiosensitivity increases when signal transduction through the MAPK pathway is blocked, thereby inhibiting cell survival. As a result, radiation dermatitis is likely to occur more frequently and at lower doses when upstream components of the MAPK pathway required for keratinocyte regeneration, such as the epidermal growth factor receptor and BRAF, are inhibited by targeted therapies. This hypothesis is consistent with the observation that patients on medications that inhibit these signaling pathways, such as cetuximab and vemurafenib, develop enhanced sensitivity to both UV radiation and radiation therapy.
We report a case of enhanced radiation dermatitis occurring at a total dose of 0.75 Gy of radiotherapy, well below the threshold commonly recognized to cause radiation-induced skin toxicities. Our observation suggests that vemurafenib acts as a radiosensitizing agent that notably decreases the threshold for radiotherapy-related skin toxicities. Furthermore, the radiosensitizing effect of vemurafenib appears to be transient, as our patient showed no evidence of any skin reaction to subsequent radiation treatment administered soon after vemurafenib was discontinued. As more patients with metastatic melanoma are treated with vemurafenib, the combination of palliative or adjuvant radiation therapy with vemurafenib will likely be used more frequently. Caution should be exercised in patients on vemurafenib who receive concurrent radiotherapy, even at low radiation doses.
- Huang V, Hepper D, Anadkat M, et al. Cutaneous toxic effects associated with vemurafenib and inhibition of the BRAF pathway. Arch Dermatol. 2012;148:628-633.
- Studer G, Brown M, Salgueiro EB, et al. Grade 3/4 dermatitis in head and neck cancer patients treated with concurrent cetuximab and IMRT. Int J Radiat Oncol Biol Phys. 2011;81:110-117.
- Archambeau JO, Pezner R, Wasserman T. Pathophysiology of irradiated skin and breast. Int J Radiat Oncol Biol Phys. 1995;31:1171-1185.
- Balter S, Hopewell JW, Miller DL, et al. Fluoroscopically guided interventional procedures: a review of radiation effects on patients’ skin and hair. Radiology. 2010;254:326-341.
- Boussemart L, Boivin C, Claveau J, et al. Vemurafenib and radiosensitization. JAMA Dermatol. 2013;149:855-857.
- Churilla TM, Chowdhry VK, Pan D, et al. Radiation-induced dermatitis with vemurafenib therapy. Pract Radiat Oncol. 2013;3:e195-e198.
- Anker CJ, Grossmann KF, Atkins MB, et al. Avoiding severe toxicity from combined BRAF inhibitor and radiation treatment: consensus guidelines from the Eastern Cooperative Oncology Group (ECOG). Int J Radiat Oncol Biol Phys. 2016;95:632-646.
- Dent P, Yacoub A, Fisher PB, et al. MAPK pathways in radiation responses. Oncogene. 2003;22:5885-5896.
- Cao C, Lu S, Jiang Q, et al. EGFR activation confers protection against UV-induced apoptosis in cultured mouse skin dendritic cells. Cell Signal. 2008;20:1830-1838.
- Xu Y, Shao Y, Zhou J, et al. Ultraviolet irradiation-induced epidermal growth factor receptor (EGFR) nuclear translocation in human keratinocytes. J Cell Biochem. 2009;107:873-880.
- Valerie K, Yacoub A, Hagan M, et al. Radiation-induced cell signaling: inside-out and outside-in. Mol Cancer Ther. 2007;6:789-801.
- Sambade M, Peters E, Thomas N, et al. Melanoma cells show a heterogeneous range of sensitivity to ionizing radiation and are radiosensitized by inhibition of B-RAF with PLX-4032. Radiother Oncol. 2011;98:394-399.
Practice Points
- Given the increased frequency of palliative and adjuvant radiation therapy in patients with metastatic melanoma, it is critical to understand the potential cutaneous side effects of vemurafenib when used in conjunction with radiotherapy.
- Clinicians should be aware of the increased risk for severe radiation dermatitis in patients on vemurafenib who are receiving concurrent palliative radiation therapy.
Fingernail Photo-onycholysis After Aminolevulinic Acid–Photodynamic Therapy Under Blue Light for Treatment of Actinic Keratoses on the Face
To the Editor:
Topical photodynamic therapy (PDT) is one of several effective treatments of actinic keratoses (AKs). Photodynamic therapy involves selection of a lesion field, application of a photosensitizer drug, incubation for a specified period of time, and illumination of the area with a light source corresponding to the absorption spectrum of the chosen drug.1 A photosensitizer used in PDT to target AKs is aminolevulinic acid (ALA), which is converted within diseased tissue to photoactivatable porphyrins, especially protoporphyrin IX; protoporphyrin IX has its largest absorption peak (410 nm) in the blue spectrum, with smaller absorption peaks at 505, 540, 580, and 630 nm. Photodynamic therapy historically has been carried out under red light (peak emission, 630 nm), whose deeper tissue penetration makes it superior in efficacy when treating Bowen disease and basal cell carcinoma.1,2 Broadband blue light (peak emission, 417 nm) now is routinely used and has been proven effective in combination with ALA for the treatment of AKs.3 This combination was approved by the US Food and Drug Administration for AKs in 1999.4
Photo-onycholysis is a photosensitivity reaction defined as separation of the nail plate from the nail bed. There are 4 types of photo-onycholysis, characterized by appearance and the number of digits affected: type I involves several fingers with half-moon–shaped separations of the nail plate; type II affects a single finger and corresponds to a well-defined brown circular notch opening distally; type III involves several fingers and is defined by round yellow stains in the central portion of the nail that turn red after 5 to 10 days; and type IV is associated with bullae under the nails.5 Cases of photo-onycholysis have arisen after exposure to UV light following ingestion of certain prescription drugs or spontaneously,6 and a single case has been reported following PDT to the hands with red light.5 We report a case of fingernail photo-onycholysis resulting from ALA-PDT for the treatment of perioral AKs.
A 65-year-old woman was treated with PDT for AKs on the perioral region of the face. Aminolevulinic acid hydrochloride 20% was applied to the lips and allowed to incubate for 60 minutes. The face was then illuminated with 10 J/cm² of blue light (417 nm) for 16 minutes and 40 seconds. Sunscreen (sun protection factor 40) was applied to the area immediately after treatment, and the patient was thoroughly counseled to avoid sunlight for the next 48 hours and to use sun protection. Within 72 hours of treatment, the patient reported that all 10 fingernails had noticeably separated from the nail bed with minimal pain, corresponding to type I photo-onycholysis (Figure). The patient's only medications were vitamin D (1000 IU once daily) and calcium supplements (1500 mg twice daily). Although the patient exercised strict UV light avoidance for the face, her hands were not protected when she went gardening directly after the treatment. At 5 weeks, the patient returned for her second ALA-PDT treatment of the perioral AKs, and a fungal culture taken from the left third fingernail returned negative results. Poly-ureaurethane nail lacquer 16% was prescribed and used once daily to protect and strengthen the fingernails. At 13-week follow-up in clinic, the photo-onycholysis was resolving.

Photo-onycholysis is categorized as a phototoxic reaction most often associated with drug intake, particularly tetracyclines, psoralens, and fluoroquinolones, and less commonly with oral contraceptives; it also may occur spontaneously.6 It usually is recognized as a crescent-shaped distal separation of the nail surrounded by pigment. The action spectrum is believed to include UVA and UVB, though the exact mechanisms have not been confirmed.5
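For context, the irradiance delivered during illumination can be recovered from the treatment parameters reported above (a simple calculation on our part, assuming continuous uniform output): 16 minutes and 40 seconds equals 1000 seconds, so

irradiance = fluence ÷ time = 10 J/cm² ÷ 1000 s = 10 mW/cm²,

consistent with the nominal output of broadband blue light units commonly used for ALA-PDT. Notably, none of this light was directed at the hands; the nail findings followed incidental sun exposure instead.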
Our case provides evidence of the risk for photo-onycholysis following PDT. We have no reason to believe there was systemic absorption of ALA, as there were no visible vesicles on the arms or hands after the treatment, and negative fungal culture results excluded onychomycosis. We hypothesize that the patient touched her face during the 60-minute incubation period prior to blue light illumination, inadvertently collecting ALA under the fingernails. When she then exposed her hands to sunlight while gardening after treatment, the ALA trapped under the nails was likely photoactivated by the UV radiation, triggering photo-onycholysis.
This case represents a report of fingernail photo-onycholysis after ALA-PDT under blue light, as well as after PDT for AKs not located on the hands. Although the photo-onycholysis resolved within a few months of treatment, our case demonstrates the importance of counseling patients specifically about isolating the ALA treatment zone from nontreated areas of the body during incubation. Improper UV light protection following ALA-PDT is known to produce phototoxic reactions, and our case supports this observation.
- Morton CA, McKenna KE, Rhodes LE. Guidelines for topical photodynamic therapy: update. Br J Dermatol. 2008;159:1245-1266.
- Hauschild A. Photodynamic therapy for actinic keratoses: procedure matters? Br J Dermatol. 2012;166:3-5.
- Alexiades-Armenakas M. Laser-mediated photodynamic therapy. Clin Dermatol. 2006;24:16-25.
- Babilas P, Schreml S, Landthaler M, et al. Photodynamic therapy in dermatology: state-of-the-art. Photodermatol Photoimmunol Photomed. 2010;26:118-132.
- Hanneken S, Wessendorf U, Neumann NJ. Photodynamic onycholysis: first report of photo-onycholysis after photodynamic therapy. Clin Exp Dermatol. 2008;33:659-660.
- Baran R, Juhlin L. Photoonycholysis. Photodermatol Photoimmunol Photomed. 2002;18:202-207.
Practice Points
- Photodynamic therapy with aminolevulinic acid (ALA) is an effective treatment of actinic keratoses but can produce unexpected side effects in locations distant from initial therapy sites.
- It is important to counsel patients prior to initiating photodynamic therapy with ALA about isolating the ALA treatment zone from nontreated areas on the body during incubation.