Recommendations for infant sleep position are not being followed
Despite the American Academy of Pediatrics recommendation that parents place infants supine for sleeping, many mothers do not do so, according to the results of the Study of Attitudes and Factors Affecting Infant Care.
In 32 hospitals, 3,297 mothers were surveyed to determine their intentions, the infant sleeping position they actually used, and the factors that affected their behavior. Of those, 2,491 (77%) reported that they usually place their infants in the supine position, but only 49% reported that they exclusively place their infants supine. In addition, 14% reported that they place infants on their sides and 8% prone, reported Eve R. Colson, MD, of the department of pediatrics at Yale University, New Haven, Conn., and her coauthors (Pediatrics. 2017. doi: 10.1542/peds.2017-0596).
Several other discrepancies in compliance with the AAP recommendation also were noted. African American mothers were more likely to intend to use the prone position, compared with white mothers (adjusted odds ratio, 2.5; 95% confidence interval, 1.57-3.85). Those who did not complete high school were also more likely to intend to use the prone position (aOR, 2.1; 95% CI, 1.16-3.73). On the other hand, those who received recommendation-compliant advice from a doctor were less likely to place their infants in the prone (aOR, 0.6; 95% CI, 0.39-0.93) or side (aOR, 0.5; 95% CI, 0.36-0.67) positions.
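As a brief note on interpreting these figures: an odds ratio compares the odds of a behavior in one group with the odds in another,

\[ \mathrm{OR} = \frac{p_1/(1-p_1)}{p_2/(1-p_2)}, \]

where \(p_1\) and \(p_2\) are the proportions engaging in the behavior in the two groups, and a 95% confidence interval that excludes 1 indicates a statistically significant association. The aOR of 2.5 above, for example, means that after adjustment the odds of intending to use the prone position were 2.5 times as high among African American mothers as among white mothers.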
“Of particular note, those who reported that their social norms supported placing the infant in the prone position were much more likely to do so, compared with those who felt that their social norms supported using only the supine position (aOR, 11.6; 95% CI, 7.24-18.7). And, most remarkably, those who had positive attitudes about the prone sleep position ... were more likely to choose the prone position (aOR, 130; 95% CI, 71.8-236),” the researchers wrote. These findings indicate that choices in infant sleeping position are directly influenced by attitudes toward the choice, subjective social norms, and perceptions about control.
“These beliefs persist and are potentially modifiable, so they should be considered an important part of any intervention to change practice,” Dr. Colson and her colleagues wrote.
The study drew on a nationally representative sample of mothers of infants aged 2-6 months. Although the data came from self-reported surveys, which are subject to misreporting, the findings are consistent with those of other studies.
“Maternal race and education continue to be factors associated with choice of infant sleeping position as does advice from a doctor. Factors that appear to be of equal or greater importance are those related to attitudes, subjective social norms, and perceived control, all of which can potentially be altered through educational interventions,” Dr. Colson and her colleagues concluded.
The Eunice Kennedy Shriver National Institute of Child Health and Human Development and the National Institutes of Health funded the study. The authors reported no financial disclosures.
Over the 10 years spanning 1994-2004, the sudden infant death syndrome (SIDS) rate in the United States fell by 53%, correlating with an increase in exclusive supine sleep from less than 10% to 78%. Since then, however, both the supine sleep rate and the SIDS death rate have stagnated.
To make progress in these areas, current data are needed on supine sleep to enhance understanding of how families make these decisions. Colson et al. provide exactly the kind of information we need to guide providers and public health officials in their efforts to help families maintain the safest sleep environments for their infants.
As a start, mothers who want to practice safe sleep need to be empowered to insist that other caregivers in their lives support their parenting decisions, because the study shows that mothers who feel that they have more control are more likely to use the recommended position. We also must look at how we can help change personal attitudes and societal norms in favor of supine sleep because these issues were found to be some of the strongest predictors of prone sleep position.
We, as health care providers, need to provide clear and consistent messaging in both word and behavior to help mothers make safe decisions for their infants.
Michael H. Goodstein, MD, is a neonatologist at WellSpan York (Pa.) Hospital. Barbara M. Ostfeld, PhD, is the program director of the SIDS Center of New Jersey at Rutgers University, New Brunswick. Their remarks accompanied the article by Colson et al. (Pediatrics 2017 Aug 21. doi: 10.1542/peds.2017-2068). Neither author reported any financial disclosures.
FROM PEDIATRICS
Key clinical point: Many mothers do not exclusively place their infants supine for sleep, and attitudes, subjective social norms, and perceived control strongly influence their choice of position.
Major finding: Of the 3,297 mothers surveyed, 2,491 (77%) reported that they usually place their infants in supine position, but only 49% reported that they exclusively place their infants supine.
Data source: The Study of Attitudes and Factors Affecting Infant Care, involving 3,297 mothers.
Disclosures: The Eunice Kennedy Shriver National Institute of Child Health and Human Development and the National Institutes of Health funded the study. The authors reported no financial disclosures.
Ribociclib: another CDK inhibitor hits the mark in breast cancer
This spring, the US Food and Drug Administration approved a second cyclin-dependent kinase (CDK) inhibitor for the treatment of postmenopausal women with hormone receptor (HR)-positive, human epidermal growth factor receptor 2 (HER2)-negative advanced or metastatic breast cancer in combination with aromatase inhibitors (AIs).1 The drug, ribociclib, joins palbociclib as the second agent in this class, which targets key regulators of the mammalian cell cycle and can help overcome resistance to endocrine therapies such as AIs, a standard front-line treatment option in this group of patients. Palbociclib (Ibrance) was approved last year in combination with the AI letrozole, and its indication was recently expanded to include use in combination with any AI, the same indication for which ribociclib received approval.
The ribociclib approval was based on the results of a phase 3, randomized, double-blind, placebo-controlled, international clinical trial called MONALEESA-2.2 The trial, conducted in 29 countries, compared the effects of ribociclib plus letrozole with letrozole plus placebo in 668 postmenopausal women with locally confirmed, HR-positive, HER2-negative, recurrent or metastatic breast cancer.
Patients had not received previous systemic therapy for advanced disease, had measurable disease according to Response Evaluation Criteria in Solid Tumors (RECIST, version 1.1), had an Eastern Cooperative Oncology Group performance status of 0 or 1 (range, 0-5; 0, fully active and 5, dead), and had adequate bone marrow and organ function.
Patients were excluded if they had received previous CDK4/6 therapy, any previous systemic chemotherapy or endocrine therapy for advanced disease, or previous neoadjuvant or adjuvant therapy with a nonsteroidal AI (unless they had been disease free for more than 12 months), or if they had inflammatory breast cancer, central nervous system metastases, a history of cardiac disease or dysfunction, or impaired gastrointestinal function that could alter drug absorption.
Patients received either ribociclib at a dose of 600 mg daily on a 3-weeks-on, 1-week-off schedule in 28-day cycles, or placebo; both were combined with letrozole at a dose of 2.5 mg daily on a continuous schedule. Randomization was stratified according to the presence or absence of liver or lung metastases, and treatment was continued until disease progression, unacceptable toxicity, death, or discontinuation of treatment. Dose reductions of ribociclib were allowed to manage adverse events (AEs), but treatment crossover was not permitted.
Tumor assessments were performed at screening, every 8 weeks during the first 18 months, every 12 weeks thereafter until disease progression, and at the end of treatment, and were evaluated by an independent review committee. The baseline characteristics of the patient population were well balanced; the median age was 62 years, and all patients were HR positive and, with the exception of 1 patient, HER2 negative.
The trial was ended prematurely after an initial interim analysis demonstrated a significant benefit in favor of ribociclib in the primary endpoint, progression-free survival (PFS). Over a median duration of follow-up of 15.3 months, the median PFS had not yet been reached in the ribociclib arm, compared with 14.7 months in the placebo arm (hazard ratio, 0.556; P < .0001). In a subsequent analysis with 11 months of additional follow-up, the median PFS was 25.3 months in the combination arm, compared with 16 months in the placebo arm, which translated into a 44% reduction in the risk of disease progression or death. The PFS benefit with ribociclib was observed across all preplanned subgroup analyses. The objective response rate was 52.7% in the ribociclib arm, compared with 37.1% in the placebo arm, but overall survival data were immature.
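For context, the reported risk reduction follows directly from the hazard ratio: the relative reduction in the instantaneous risk of progression or death is one minus the HR,

\[ 1 - \mathrm{HR} = 1 - 0.556 \approx 0.44, \]

which corresponds to the roughly 44% reduction cited for the updated analysis.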
The frequency and severity of AEs were increased in the combination arm; most common were neutropenia, nausea, fatigue, diarrhea, leukopenia, alopecia, vomiting, constipation, headache, and back pain. The most common grade 3 or 4 AEs experienced with ribociclib were neutropenia, leukopenia, abnormal liver function tests, lymphopenia, and vomiting.
Ribociclib is accompanied by warnings and precautions about QT interval prolongation, hepatobiliary toxicity, and neutropenia. Clinicians are advised to monitor electrocardiograms and electrolytes before the start of ribociclib therapy and to begin treatment only in patients with QTcF values <450 ms and in whom electrolyte abnormalities have been corrected. ECG should be repeated at around day 14 of the first cycle, the beginning of the second cycle, and as deemed clinically necessary.
Liver function tests should be performed before starting treatment, every 2 weeks for the first 2 cycles, at the beginning of each of the subsequent 4 cycles, and as clinically indicated. For aspartate aminotransferase (AST) and/or alanine aminotransferase (ALT) levels greater than 3 and up to 5 times the upper limit of normal (ULN; grade 2), ribociclib should be interrupted until recovery to baseline or lower. For levels greater than 5 and up to 20 times the ULN (grade 3), or for recurring grade 2 increases, treatment should be interrupted until recovery to baseline or lower and then resumed at the next lower dose level. Ribociclib should be discontinued in the event of recurring grade 3 elevations or of AST/ALT elevations greater than 3 times the ULN in combination with total bilirubin greater than 2 times the ULN.
Complete blood counts should be performed before starting treatment and monitored every 2 weeks for the first 2 cycles, at the beginning of each of the 4 subsequent cycles, and as clinically needed. If the absolute neutrophil count is 500-1,000/mm3 (grade 3), treatment should be interrupted until recovery to grade 2 or lower. If grade 3 neutropenia recurs, or in the event of grade 3 febrile neutropenia or grade 4 neutropenia, treatment should resume at a lower dose level upon recovery to grade 2 or lower.
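To make the monitoring rules in the two preceding paragraphs easier to follow, here is a minimal sketch in Python that encodes them as decision functions. The function names, inputs, and returned action strings are our own illustrative paraphrase of the label language described above, with laboratory values expressed as multiples of the ULN; this is not a clinical tool.

```python
def anc_action(anc_per_mm3, recurrent=False, febrile=False):
    """Suggested ribociclib action based on absolute neutrophil count (ANC).

    Per the text above: grade 3 neutropenia is an ANC of 500-1,000/mm3 and
    grade 4 is an ANC below 500/mm3. Illustrative paraphrase only.
    """
    if anc_per_mm3 > 1000:
        return "continue current dose"
    grade = 4 if anc_per_mm3 < 500 else 3
    if grade == 4 or febrile or recurrent:
        # Grade 4, febrile neutropenia, or recurrent grade 3 neutropenia.
        return "interrupt; resume at a lower dose on recovery to grade <=2"
    return "interrupt until recovery to grade <=2"  # first grade 3 episode


def transaminase_action(ast_alt_x_uln, bilirubin_x_uln=1.0, recurrent=False):
    """Suggested ribociclib action based on AST/ALT as multiples of the ULN."""
    if ast_alt_x_uln > 3 and bilirubin_x_uln > 2:
        return "discontinue"  # combined AST/ALT and bilirubin elevation
    if ast_alt_x_uln > 5:  # grade 3 (>5 and up to 20 times ULN)
        if recurrent:
            return "discontinue"
        return "interrupt; resume at next lower dose on recovery to baseline"
    if ast_alt_x_uln > 3:  # grade 2 (>3 and up to 5 times ULN)
        if recurrent:
            return "interrupt; resume at next lower dose on recovery to baseline"
        return "interrupt until recovery to baseline or lower"
    return "continue current dose"
```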
Pregnant women and women of reproductive potential should be warned of the risk of fetal harm and of the need for effective contraception during treatment and for at least 3 weeks after the last dose. Ribociclib is marketed as Kisqali by Novartis.
1. Ribociclib (Kisqali). US Food and Drug Administration website. https://www.fda.gov/drugs/informationondrugs/approveddrugs/ucm546438.htm. Last updated March 14, 2017. Accessed April 3, 2017.
2. Kisqali (ribociclib) tablets, for oral use. Prescribing information. Novartis Pharmaceuticals Corp. https://www.pharma.us.novartis.com/sites/www.pharma.us.novartis.com/files/kisqali.pdf. March 2017. Accessed April 3, 2017.
3. Hortobagyi GN, Stemmer SM, Burris HA, et al. Ribociclib as first-line therapy for HR-positive, advanced breast cancer. N Engl J Med. 2016;375:1738-1748.
Approval makes olaratumab the first new first-line treatment option for soft tissue sarcoma in more than 40 years
When the US Food and Drug Administration approved olaratumab as a first-line treatment for patients with soft tissue sarcoma (STS) in the fall of 2016, it marked the first approval since the chemotherapy drug doxorubicin became standard of care more than 40 years ago.1 Though rare, STS, which comprises a host of different histologic subtypes, has proven difficult to treat. Like pazopanib, which was approved in 2012 for the treatment of STS in the second-line setting, olaratumab targets platelet-derived growth factor receptor alpha (PDGFRα), a tyrosine kinase receptor involved in cell signaling pathways that promote key hallmark capabilities in both cancer cells and the cells of the tumor microenvironment. Olaratumab, however, is a much more specific inhibitor of PDGFRα than pazopanib.
Accelerated approval was granted for the treatment of patients with STS that is not amenable to curative treatment with radiotherapy or surgery and with a subtype that cannot be treated effectively with an anthracycline-containing regimen. The approval was based on the phase 2 JGDG study, a randomized, active-controlled clinical trial in which 133 patients were randomized 1:1 to receive olaratumab plus doxorubicin, or doxorubicin alone.2
Eligible patients were those aged 18 years or older with a histologically confirmed diagnosis of locally advanced or metastatic STS not previously treated with an anthracycline, an Eastern Cooperative Oncology Group (ECOG) performance status of 0-2 (range, 0-5; 0, fully active and 5, dead), and available tumor tissue for determination of PDGFRα expression by immunohistochemistry. Patients were enrolled at 16 clinical sites in 16 cities and 15 states in the United States from October 2010 to January 2013.
Patients were excluded if they had histologically or cytologically confirmed Kaposi sarcoma or untreated central nervous system metastases; had received prior treatment with doxorubicin or other anthracyclines and anthracenediones, or with any drug targeting PDGF or the PDGF receptors; had received concurrent treatment with other anticancer therapy within 4 weeks before study entry; had unstable angina pectoris, angioplasty, cardiac stenting, or myocardial infarction within 6 months before study entry; had HIV infection; or were pregnant or lactating.
Olaratumab was administered at 15 mg/kg as an intravenous infusion on days 1 and 8 of each 21-day cycle, and doxorubicin at 75 mg/m2 as an intravenous infusion on day 1 of each cycle, for a maximum of 8 cycles. Patients were permitted to receive dexrazoxane in cycles 5-8, and crossover was permitted. Tumor response was assessed by Response Evaluation Criteria in Solid Tumors (RECIST, version 1.1) every 6 weeks, and survival was assessed every 2 months, until study completion. PDGFRα expression was assessed by immunohistochemistry at a central academic laboratory before randomization.
The primary endpoint of the study was progression-free survival (PFS), and the combination of olaratumab and doxorubicin extended PFS in this patient population: median PFS was 6.6 months in the combination arm, compared with 4.1 months in the doxorubicin-alone arm (hazard ratio [HR], 0.672; P = .0615), which met the trial's prespecified significance level for this phase 2 study. Median overall survival (OS), a secondary endpoint, was significantly improved with combination therapy, at 26.5 months versus 14.7 months with doxorubicin alone, and the objective response rate (ORR) was numerically higher (18.2% vs 11.9%). The benefits of combination therapy were observed across prespecified subgroups, including histologic tumor type, number of previous treatments, and PDGFRα expression level.
The most common adverse events (AEs) in patients taking olaratumab were nausea, fatigue, neutropenia, musculoskeletal pain, mucositis, alopecia, vomiting, diarrhea, decreased appetite, abdominal pain, neuropathy, and headache. Grade 3/4 AEs were also more frequent with the combination than with doxorubicin alone. The most common AE leading to discontinuation of olaratumab was infusion-related reactions, which occurred in 13% of patients.
According to the prescribing information, the recommended dose for olaratumab is 15 mg/kg as an intravenous infusion over 60 minutes on days 1 and 8 of each 21-day cycle until disease progression or unacceptable toxicity, in combination with doxorubicin for the first 8 cycles. Patients should be premedicated with dexamethasone and diphenhydramine, to help protect against infusion-related reactions.
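As a simple illustration of the arithmetic behind this weight-based regimen, the sketch below (in Python, with a hypothetical 70-kg patient and our own function name) computes the per-infusion dose and the planned infusion days implied by the schedule described above.

```python
def olaratumab_schedule(weight_kg, cycles=8, dose_mg_per_kg=15):
    """Return (study day, dose in mg) for each planned olaratumab infusion.

    Encodes the regimen described above: 15 mg/kg IV on days 1 and 8 of each
    21-day cycle. Illustrative arithmetic only, not a dosing tool.
    """
    dose_mg = weight_kg * dose_mg_per_kg
    infusions = []
    for cycle in range(cycles):
        first_day = cycle * 21 + 1                   # day 1 of this cycle
        infusions.append((first_day, dose_mg))
        infusions.append((first_day + 7, dose_mg))   # day 8 of this cycle
    return infusions

# A 70-kg patient would receive 1,050 mg per infusion, on days 1, 8, 22, 29, ...
print(olaratumab_schedule(70)[:4])  # [(1, 1050), (8, 1050), (22, 1050), (29, 1050)]
```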
Olaratumab, marketed as Lartruvo by Lilly Oncology, has warnings and precautions relating to infusion-related reactions and embryofetal toxicity. Patients should be monitored for signs and symptoms of the former during and after infusion and olaratumab should be administered in a setting with available resuscitation equipment. Olaratumab should be permanently discontinued in the event of grade 3/4 infusion-related reactions. Olaratumab can cause fetal harm and female patients should be advised of the potential risk to a fetus and the need for effective contraception during treatment and for 3 months after the last dose.
1. FDA grants accelerated approval to new treatment for advanced soft tissue sarcoma. FDA News Release. https://www.fda.gov/NewsEvents/Newsroom/PressAnnouncements/ucm525878.htm. Last updated October 19, 2016. Accessed March 6, 2017.
2. Tap WD, Jones RL, Van Tine BA, et al. Olaratumab and doxorubicin versus doxorubicin alone for treatment of soft-tissue sarcoma: an open-label phase 1b and randomised phase 2 trial. Lancet. 2016;388(10043):488-497.
3. Lartruvo (olaratumab) injection, for intravenous use. Prescribing information. Eli Lilly and Co. http://pi.lilly.com/us/lartruvo-uspi.pdf. Last updated October 2016. Accessed March 6, 2017.
Researchers compare world health authorities
A new study has revealed substantial differences between health authorities in different regions of the world.
A pair of researchers compared 12 regulatory authorities responsible for approving drugs and medical products.
The researchers collected data* on annual budgets, new drug approvals per year, numbers of reviewers, standard and median review times, fees for new drug applications (NDAs), and other measurements.
The results were published in Nature Reviews Drug Discovery.
For the 2015 fiscal year, the US Food and Drug Administration (FDA) had the highest budget—$1.19 billion—and India’s Central Drugs Standard Control Organization (CDSCO) had the lowest—$26 million.
In 2016, the FDA again had the highest budget—$1.23 billion—while Health Canada and Switzerland’s SwissMedic had the lowest—$108 million.
In 2016, the European Medicines Agency (EMA) had the highest number of reviewers—4,500—and SwissMedic had the lowest—60. (Data from 2015 were not included.)
In 2015, Japan’s Pharmaceuticals and Medical Devices Agency had the highest number of NDA submissions—127—and Health Canada had the lowest—27. Meanwhile, the Chinese FDA had the highest number of new drug approvals—72—and India’s CDSCO had the lowest—17.
The UK’s Medicines and Healthcare products Regulatory Agency (MHRA) technically had the most new drug approvals in 2015, at 146, but not all of these were unique, as the number included all decentralized applications, both with the UK as the reference member state and approvals from concerned member states.
In 2016, the EMA had the highest number of NDA submissions—68—and Health Canada had the lowest—25. Singapore’s Health Sciences Authority had the highest number of new drug approvals—72—while the US FDA and India’s CDSCO had the lowest—22.
The shortest standard review period was 210 days. This is the standard for the EMA, the UK’s MHRA, and Russia’s Roszdravnadzor. The regulatory agency with the longest standard review time—900 days—is the Chinese FDA.
The shortest median time to new drug approval in 2015 was 230 days, for the UK’s MHRA. The longest was 834 days, for the Brazilian Health Surveillance Agency.
The highest NDA review fees were those charged by the US FDA—$2.3 million. The lowest were those charged by India’s CDSCO—50,000 Indian rupees, or about US$1,000.
The researchers noted that these data suggest products are being evaluated via different processes and according to different standards, which makes it challenging for pharmaceutical companies to develop drugs for simultaneous submission to all regulatory authorities.
The researchers suggested that harmonizing approval requirements and processes could therefore significantly improve efficiency.
“Patients would profit especially since new drugs would be available faster and at lower prices,” said study author Thomas D. Szucs, MD, PhD, of the University of Basel in Switzerland.
“This suggests that companies and authorities should strengthen their international collaboration and communicate better with each other.”
*Some data were missing for most of the 12 agencies studied.
Test uses nanotechnology to diagnose Zika virus
Researchers say they have developed a point-of-care, paper-based test that quickly detects the presence of Zika virus in blood.
Currently, testing for Zika requires that a blood sample be refrigerated and shipped to a medical center or laboratory, delaying diagnosis and possible treatment.
The new test, on the other hand, does not require refrigeration and can produce results in minutes.
“If an assay requires electricity and refrigeration, it defeats the purpose of developing something to use in a resource-limited setting, especially in tropical areas of the world,” said Srikanth Singamaneni, PhD, of Washington University in St. Louis, Missouri.
“We wanted to make the test immune from variations in temperature and humidity.”
Dr Singamaneni and his colleagues described this test in Advanced Biosystems.
The researchers used the test on blood samples from 4 subjects known to be infected with Zika virus and samples from 5 subjects who did not have the virus.
The test showed positive results for the Zika-infected patients and negative results for controls. There were no false-positives.
How the test works
The test uses gold nanorods mounted on paper to detect Zika infection in the blood.
The test relies on a protein made by Zika virus that causes an immune response in infected individuals—ZIKV nonstructural protein 1 (NS1). The ZIKV-NS1 protein is attached to gold nanorods mounted on a piece of paper.
The paper is then covered with protective nanocrystals. The nanocrystals allow the diagnostic nanorods to be shipped and stored without refrigeration prior to use.
To use the test, a technician rinses the paper with slightly acidic water, removing the protective crystals and exposing the protein mounted on the nanorods.
Then, a drop of the patient’s blood is applied. If the patient has come into contact with the virus, the blood will contain immunoglobulins that react with the ZIKV-NS1 protein.
“We’re taking advantage of the fact that patients mount an immune attack against this viral protein,” said study author Jeremiah J. Morrissey, PhD, of Washington University.
“The immunoglobulins persist in the blood for a few months, and when they come into contact with the gold nanorods, the nanorods undergo a slight color change that can be detected with a hand-held spectrophotometer. With this test, results will be clear before the patient leaves the clinic, allowing immediate counseling and access to treatment.”
The color change cannot be seen with the naked eye, but the researchers are working to change that. They’re also working on developing ways to use saliva rather than blood.
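As a toy illustration of the readout logic described above, the following Python sketch classifies a sample by the shift in the nanorods’ spectral peak. Both the function and the 2-nm threshold are hypothetical, for illustration only; they are not taken from the study.

```python
def zika_readout(baseline_peak_nm, sample_peak_nm, threshold_nm=2.0):
    """Toy readout: immunoglobulins binding the ZIKV-NS1 protein shift
    the gold nanorods' plasmon peak, which a hand-held spectrophotometer
    can measure. threshold_nm is a hypothetical cutoff, not a value
    reported by the researchers.
    """
    shift_nm = sample_peak_nm - baseline_peak_nm
    return "positive" if shift_nm >= threshold_nm else "negative"

# Hypothetical readings: a 3-nm red shift reads as positive here
print(zika_readout(650.0, 653.0))  # -> positive
```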
Although the test uses gold, the nanorods are very small. The researchers estimate the cost of the gold used in one of the assays would be 10 to 15 cents.
Exelixis seeks expanded indication for cabozantinib in RCC
Exelixis has submitted a supplemental New Drug Application to the Food and Drug Administration for cabozantinib (Cabometyx) for the treatment of previously untreated advanced renal cell carcinoma (RCC).
The application, announced on Aug. 16, seeks to allow the manufacturer to modify the label. Cabozantinib was approved in April 2016 for treatment of patients with advanced RCC who had previously received antiangiogenic therapy.
The application is based on results of the phase 2 CABOSUN trial, which were published in the Journal of Clinical Oncology (2017 Feb 20;35[6]:591-7). An independent review committee confirmed the primary efficacy endpoint results in June 2017.
NASH did not increase risk of poor liver transplantation outcomes
Adults with nonalcoholic steatohepatitis (NASH) fared as well on key outcome measures as other liver transplant recipients, despite having significantly more comorbidities, according to the results of a single-center retrospective cohort study.
Major morbidity, mortality, and rates of graft survival after 90 days were similar between patients who underwent transplantation for NASH and those who underwent it for another cirrhotic liver condition, wrote Eline H. van den Berg, MD, of University Medical Center Groningen (the Netherlands) with her associates. “These results are comforting, considering the expected increase of patients with NASH cirrhosis in the near future,” the researchers concluded. “Future analysis regarding the recurrence of nonalcoholic fatty liver disease, development of long-term complications, long-term graft patency, and occurrence of comorbid diseases after LT [liver transplantation] is mandatory to better understand the natural history and risk profile of NASH patients and to prevent and treat its complications.” The findings were published online in Digestive and Liver Disease (Dig Liver Dis. 2017 Aug 11. doi: 10.1016/j.dld.2017.08.022).
Nonalcoholic fatty liver disease begins as steatosis and can progress to NASH, fibrosis, and cirrhosis. The global obesity epidemic is amplifying its incidence, and about 26% of patients who develop NASH ultimately develop cirrhosis. Cirrhosis itself increases the risk of in-hospital death or prolonged length of postoperative stay, but patients with NASH also have obesity and cardiovascular disease, which might “tremendously increase” the risk of poor postoperative outcomes, the researchers said. Because prior research had focused mainly on mortality and had reported conflicting results, they used the Clavien-Dindo classification system to retrospectively study rates of complications among 169 adults who underwent liver transplantation at their center from 2009 through 2015, including 34 (20%) patients with NASH cirrhosis.
Patients with NASH were significantly older than other transplant recipients (59 versus 55 years, P = .01) and had markedly higher rates of obesity (62% versus 8%; P less than .01), diabetes mellitus (74% versus 20%; P less than .01), metabolic syndrome (83% versus 38%; P less than .01), hypertension (61% versus 30%; P less than .01), and cardiovascular disease (29% versus 11%; P less than .01). Despite these differences, the groups had statistically similar rates of postoperative mortality (3% in both groups), 90-day graft survival posttransplantation (94% and 90%, respectively), and major postoperative complications, including biopsy-proven acute cellular rejection (3% and 7%), hepatic artery thrombosis (0% and 7%), relaparotomy (15% and 24%), primary nonfunction (0% and 1.6%), retransplantation (6% and 7%), sepsis (12% and 13%), gastrointestinal infection (24% and 36%), fever of unknown origin (18% and 14%), and renal replacement therapy (15% and 24%).
After accounting for age, sex, transplant year, and donor characteristics, NASH patients were at significantly increased risk of grade 2 urogenital infections, compared with other patients (odds ratio, 3.4; 95% confidence interval, 1.1 to 10.6; P = .03). Grade 1 complications also were more common with NASH than otherwise (77% versus 59%), and the difference remained statistically significant in the multivariable analysis (OR, 1.6; 95% CI, 1.03 to 2.63; P = .04).
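For readers who want to see the arithmetic behind such estimates, a crude (unadjusted) odds ratio and its 95% confidence interval can be computed from a 2 × 2 table. The Python sketch below is illustrative only: the counts are hypothetical, not the study’s data, and it omits the multivariable adjustment the investigators performed.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI from a 2 x 2 table:
                 outcome+   outcome-
    exposed         a          b
    unexposed       c          d
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE on the log scale
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts for illustration only -- not the study's data
print(odds_ratio_ci(8, 26, 10, 125))  # roughly (3.85, 1.39, 10.7)
```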
The study used a strict, internationally accepted definition of NASH: all patients had biopsy-confirmed disease, metabolic syndrome, or both obesity and type 2 diabetes mellitus, and none had hepatitis or alcoholic liver disease. None of the patients in the study received transplants for acute liver failure or noncirrhotic liver disease, and none were 70 years or older, the cutoff age for liver transplantation in the Netherlands.
The investigators received no funding for the study and reported having no conflicts of interest.
FROM DIGESTIVE AND LIVER DISEASE
Key clinical point: Adults with nonalcoholic steatohepatitis (NASH) fared as well on key outcome measures as other liver transplant recipients, despite having significantly more comorbidities.
Major finding: Patients with and without NASH had statistically similar rates of postoperative mortality (3% in both groups), 90-day graft survival (94% and 90%, respectively), and major postoperative complications.
Data source: A single-center retrospective cohort study of 169 adult liver transplant recipients, of whom 20% were transplanted for NASH cirrhosis.
Disclosures: The investigators received no funding for the study and reported having no conflicts of interest.
Brain scan study suggests serotonin drives early cognitive decline
Serotonin levels may play an early role in cognitive decline, according to imaging data from a group of study participants with mild cognitive impairment.
Previous studies have shown a link between a loss of serotonin neurons as part of normal aging and as part of the development of Alzheimer’s disease, wrote Gwenn S. Smith, PhD, of Johns Hopkins University in Baltimore, and her colleagues. However, “it is not known whether serotonin degeneration occurs in the preclinical stages or later in the course of AD,” the researchers said. The researchers used MRI and PET imaging to measure serotonin transporters (SERT) in the brains of 28 adults with mild cognitive impairment (MCI) and 28 healthy controls.
Overall, MCI participants showed significantly lower SERT, compared with the healthy controls; the loss of SERT in the MCI group ranged from 10% to 38%. No significant regions of higher SERT were noted in the MCI group, compared with the controls, and no significant differences were noted in gray matter volume.
The participants were recruited from the community or from the Johns Hopkins University Alzheimer’s Disease Research Center. All of the subjects underwent several evaluations, including the Clinical Dementia Rating Scale and the Mini-Mental State Examination, and mild cognitive impairment was defined as a cognitive decline mainly in the ability to remember sequences or organization, as well as having lower scores on tests such as the California Verbal Learning Test (Neurobiol Dis. 2017 Sep;105:33-41). The average age of the participants was 66 years, and about 45% were women.
The study results were limited by several factors, including lack of arterial blood for quantification purposes and the absence of partial-volume correction in assessing the brain images, reported Dr. Smith, a professor of psychiatry and behavioral sciences at the university, and her colleagues.
However, the findings suggest that “the loss of SERT in MCI may have a substantial impact on brain function and behavior given the widespread distribution of SERT in the brain and the evidence that serotonin modulates other neurotransmitters (glutamate, norepinephrine, dopamine, and acetylcholine) implicated in AD and potentially MCI,” they emphasized.
Based on the findings, next steps for research could include targeting serotonin receptors on message-receiving (postsynaptic) cells. “The substantially lower SERT in MCI observed in the present study suggests that the serotonin system may represent an important target for prevention and treatment,” the researchers noted.
Dr. Smith has received research funding from the National Institutes of Health and Functional Neuromodulation. This study was supported by the NIH.
FROM NEUROBIOLOGY OF DISEASE
Key clinical point: Levels of serotonin transporters were significantly lower in the cortical, striatal, thalamic, and limbic regions of the brain of adults with mild cognitive impairment, compared with healthy controls.
Major finding: The decrease in serotonin transporters (SERT) ranged from 10% to 38% in MCI adults, compared with healthy controls.
Data source: The data come from a comparison study of 28 adults with MCI and 28 healthy controls.
Disclosures: Dr. Smith has received research funding from the National Institutes of Health and Functional Neuromodulation. The study was supported by the NIH.
Breaking bad news
As psychiatrists, we do not often encounter situations in which we need to inform patients and their families of a life-threatening diagnosis. Nonetheless, on the rare occasions when such situations arise, new psychiatrists may find their clinical skills challenged. Breaking bad news remains an aspect of clinical training that medical schools often overlook.
Here I present a case that illustrates the challenges residents and medical students face when delivering bad news, and I review the current literature on how to present this type of information to patients.
Case
Bizarre behavior, difficult diagnosis
Mr. C, age 59, with a 1-year history of major depressive disorder, was brought to the emergency department by his wife for worsening depression and disorganized behavior over the course of 3 weeks. Mr. C had obsessive thoughts about arranging things in symmetrical patterns, difficulty sleeping, loss of appetite, and anhedonia. His wife reported that his bizarre, disorganized behavior had further intensified in the previous week; he was urinating on the rug, rubbing his genitals against the bathroom counter, staring into space without moving for prolonged periods, and arranging his food in symmetrical patterns. Mr. C had no reported substance use or suicidal or homicidal ideation.
Strategies for delivering bad news
Initially, I struggled when I realized I would have to deliver the news of this potentially life-threatening diagnosis to the patient and his wife because I had not received any training on how to do so. However, I took time to look into the literature and used the skills that we as psychiatrists have, including empathy, listening, and validation. My experience with Mr. C and his family made me realize that delivering bad news with empathy, and remaining involved in the struggle that follows, can make a significant difference in patients’ and families’ suffering.
There are various models and techniques for breaking bad news to patients. Two of the most commonly used models in the oncology setting are the SPIKES (Setting up, Perception, Invitation, Knowledge, Emotions, Strategy and Summary) model (Table 1)2 and Kaye’s model (Table 2).3
A clinician’s attitude and communication skills play a crucial role in how well patients and family members cope when they receive bad news. When delivering bad news:
- Choose a setting with adequate privacy, use simple language that distills medical facts into appreciable pieces of information, and give the recipients ample space and time to process the implications. Doing so will foster better understanding and facilitate their acceptance of the bad news.
- Although physicians can rarely appreciate the patient’s feelings at a personal level, make every effort to validate the patient’s thoughts and offer emotional support. In such situations, it is often appropriate to move closer to the recipient and offer a brief physical gesture, such as laying a hand on the shoulder, which may provide comfort.
- In the aftermath of such encounters, it is important to remain active in the patient’s emotional journey and available for further clarification, which can mitigate uncertainties and facilitate the difficult process of coming to terms with new realities.4,5
1. Munjal S, Pahlajani S, Baxi A, et al. Delayed diagnosis of glioblastoma multiforme presenting with atypical psychiatric symptoms. Prim Care Companion CNS Disord. 2016;18(6). doi: 10.4088/PCC.16l01972.
2. Baile WF, Buckman R, Lenzi R, et al. SPIKES—a six-step protocol for delivering bad news: application to the patient with cancer. Oncologist. 2000;5(4):302-311.
3. Kaye P. Breaking bad news: a 10 step approach. Northampton, MA: EPL Publications; 1995.
4. Chaturvedi SK, Chandra PS. Breaking bad news: issues important for psychiatrists. Asian J Psychiatr. 2010;3(2):87-89.
5. VandeKieft GK. Breaking bad news. Am Fam Physician. 2001;64(12):1975-1978.
Big changes coming for thyroid cancer staging
BOSTON – When the American Joint Committee on Cancer’s Eighth Edition Cancer Staging Manual becomes effective for U.S. practice on Jan. 1, 2018, substantially more patients with thyroid cancer will meet the definition for stage I disease, but their survival prognosis will remain as good as it was for the smaller slice of patients defined with stage I thyroid cancer by the seventh edition, Bryan R. Haugen, MD, predicted during a talk at the World Congress on Thyroid Cancer.
Under current stage definitions in the seventh edition, roughly 60% of thyroid cancer patients have stage I disease, but this proportion will rise to about 80% under the eighth edition, said Dr. Haugen, professor of medicine and head of the division of endocrinology, metabolism, and diabetes at the University of Colorado in Aurora. Despite this influx, “survival rates in stage I patients haven’t changed,” with a disease-specific survival (DSS) of 98%-100% for stage I patients in the eighth edition, compared with 97%-100% in the seventh edition, he noted.
Dr. Haugen attributed this apparent paradox to the revised staging system’s superior discrimination among grades of disease progression. “The eighth edition better separates patients based on their projected survival.” As more patients fit the stage I classification, with its highest level of projected survival, fewer patients will be classified with more advanced disease and its worse projected survival.
For example, in the seventh edition, patients with stage IV disease had a projected DSS rate of 50%-75%; in the eighth edition, that rate is now less than 50%. The projected DSS rate for patients with stage II disease has shifted down from 97%-100% in the seventh edition to 85%-95% in the eighth. For patients with stage III thyroid cancer, the DSS rate of 88%-95% in the seventh edition became 60%-70% in the eighth edition.
“The new system will take some getting used to,” Dr. Haugen admitted, and it involves even more “big” changes, he warned. These include the following (summarized in the sketch after this list):
• Changing the cutpoint separating younger from older patients to 55 years of age in the eighth edition, a rise from the 45-year-old cutpoint in the seventh edition.
• Allowing tumors classified as stage I to be as large as 4 cm, up from the 2 cm or less defining stage I in the seventh edition.
• Reserving stage II designation for patients with tumors larger than 4 cm. In the seventh edition tumors had to be 2-4 cm in size.
• Expanding stage II disease to include not only patients with disease confined to their thyroid, but also patients with N1 lymph node spread or gross extrathyroidal extension. In the seventh edition tumor spread like this put patients into stage III.
• Specifying in the eighth edition that stage III disease must feature gross extrathyroidal extension into the larynx, trachea, esophagus, or recurrent laryngeal nerve. To qualify for stage IV in the eighth edition, spread must extend into the prevertebral fascia or encase major vessels, for stage IVA, or involve distant metastases, for stage IVB.
• Paring down three stage IV subgroups, A, B, and C, in the seventh edition to just an A or B subgroup in the eighth edition.
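To make the interplay of these criteria concrete, here is a minimal decision-rule sketch in Python. It encodes only the eighth-edition criteria for patients aged 55 years or older as summarized above, omits the separate rules for younger patients and the many clinical nuances of real staging, and is an illustration of this summary rather than a clinical tool.

```python
def stage_dtc_age55_plus(tumor_cm, n1_disease, gross_ete,
                         ete_critical_site, prevertebral_or_vessels,
                         distant_mets):
    """Simplified AJCC eighth-edition stage for differentiated thyroid
    cancer in a patient aged >= 55, per the bullet points above.
    ete_critical_site: gross extrathyroidal extension into the larynx,
    trachea, esophagus, or recurrent laryngeal nerve.
    Illustration only -- not a clinical staging tool.
    """
    if distant_mets:
        return "IVB"
    if prevertebral_or_vessels:   # extension into prevertebral fascia
        return "IVA"              # or encasing major vessels
    if ete_critical_site:
        return "III"
    if tumor_cm > 4 or n1_disease or gross_ete:
        return "II"
    return "I"  # tumor <= 4 cm and confined to the thyroid

# A 3-cm tumor confined to the thyroid, no nodal or distant spread: stage I
print(stage_dtc_age55_plus(3.0, False, False, False, False, False))
```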
Dr. Haugen coauthored a recent editorial that laid out an assessment of the eighth edition in greater detail (Thyroid. 2017 Jun;27[6]:751-6).
[email protected]
On Twitter @mitchelzoler
EXPERT ANALYSIS FROM WCTC 2017