Sudden Unexpected Death in Epilepsy: An Update
Pooja Patel, MD
Selim Benbadis, MD
Dr. Patel is a fourth-year neurology resident at the University of South Florida, where she will begin an epilepsy fellowship in July.
Dr. Benbadis is Professor and Director of the Comprehensive Epilepsy Program at the University of South Florida and Tampa General Hospital in Tampa, Florida.
Sudden unexpected death in epilepsy (SUDEP) is the most common cause of death in patients with intractable epilepsy. SUDEP accounts for 7.5% to 17% of all deaths related to epilepsy and has an annual incidence of 3 to 9 per 1000 in the general epilepsy population. Despite this burden, patients with epilepsy, their families, and even many physicians are often unaware of SUDEP and the mortality risk it carries. SUDEP has recently received significant attention in the scientific literature because of its frequency and the lack of well-defined mechanisms. Understanding the modifiable risk factors and pathophysiology of SUDEP is critically important for delineating preventive strategies.
Several mechanisms have been proposed to play a role in the pathophysiology of SUDEP. The recent literature has included new insights derived from combining older and newer studies, with clues obtained from witnessed SUDEP cases, SUDEP cases observed in epilepsy monitoring units, physiologic data from nonfatal seizures, and animal models. Based on many cohort studies, the initial event is thought to be hypoventilation or apnea resulting from the seizure itself. The prone position is thought to contribute to prolonged oxygen desaturation by impairing arousal and the ability to sense rising carbon dioxide levels. This sequence can in turn cause secondary, fatal cardiac arrhythmias. The other proposed mechanism is primary cardiac arrhythmia resulting from autonomic dysfunction before, during, or after a seizure. Dysfunction of serotonergic neurons may also contribute by promoting peri-ictal hypoventilation and impaired arousal. Experts have also suggested that genetic mutations cause a primary dysfunction leading to fatal seizures; however, this hypothesis requires further research.
Based on recent findings about the pathophysiology of SUDEP, several preventive measures have been suggested. The critical preventive measure is still believed to be good seizure control, as uncontrolled generalized tonic-clonic seizures remain the biggest risk factor for SUDEP. Seizure control can be difficult to achieve in chronic refractory epilepsy, and early referral to an epilepsy center should be made. Several studies evaluating patients after epilepsy surgery have found that surgery reduces patients’ likelihood of SUDEP. A study published in 2000 evaluated vagus nerve stimulation (VNS) implantation and SUDEP risk and concluded that during the first 2 years after implantation, the risk of SUDEP was higher than that with isolated use of some antiepileptic drugs. After 2 years of follow-up, however, the risk of SUDEP was remarkably lower. The initial higher rate was likely due to the fact that VNS was implanted in refractory patients who had failed antiepileptic drugs and were candidates for surgery. Experts believe that VNS is likely protective against SUDEP because it reduces the number of generalized tonic-clonic seizures.
Sleep is considered a high-risk state for SUDEP because more seizures occur during sleep and because of the hormonal and autonomic changes that occur at night. Use of a bed in which the head can be adjusted higher than the feet, a supine sleep position, a special pillow to prevent suffocation, and even supervision at night have been recommended to reduce the risk of SUDEP. Supervision at night, which includes a supervising person sharing the same bedroom, special precautions such as regular checks throughout the night, or use of a monitoring device, was associated with a decreased risk of SUDEP in a recent study. Medications such as alpha-blockers and beta-blockers might be considered, as they can reduce sympathetic discharge and prevent cardiac arrhythmias. Selective serotonin reuptake inhibitors can also reduce ictal hypoxemia and thus may interrupt one potential mechanism leading to SUDEP.
SUDEP is a significant burden in the field of epilepsy because of its mortality, but it is not well known among patients, families, and providers. A recent Australian study reported that only a minority of adult patients with epilepsy had heard about SUDEP from their neurologists. Because of limited experience and knowledge of SUDEP, some neurologists may be unable to provide appropriate education to patients and their families. The first qualitative study to explore the opinions of bereaved relatives on whether to discuss SUDEP with patients was recently performed in the United States. The study showed that 91% of parents of patients with epilepsy and 89.5% of adults with epilepsy would have preferred to receive information about SUDEP. These respondents would have liked their neurologists in particular to discuss SUDEP so that they could have focused on preventive techniques. The results suggest a lack of knowledge and a need for greater awareness of SUDEP among the epilepsy population. It should be neurologists’ responsibility to inform patients and their families about SUDEP at an appropriate time, based on the diagnosis.
Resources such as the SUDEP-7 inventory can help physicians identify patients with refractory epilepsy who are at risk for SUDEP. In 2011, the original SUDEP-7 inventory was found to be associated with two biomarkers: vagus-mediated heart rate variability and postictal generalized EEG suppression. In 2015, the inventory was modified into the revised SUDEP-7 inventory, which re-evaluated the association with heart rate variability. Results indicated that older age, longer duration of epilepsy, and presence of developmental disability directly influenced vagus-mediated heart rate variability and thus increased SUDEP risk. The higher the SUDEP-7 score, the higher the risk of SUDEP. The SUDEP-7 inventory and similar tools can be valuable for risk stratification and, in turn, can inform the decision of whether to discuss SUDEP with patients and families.
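Risk inventories of this kind are typically additive: each yes/no risk item carries a weight, and the weights of the items present are summed. A minimal sketch of that scoring structure follows; the item names and weights here are illustrative placeholders, not the published SUDEP-7 items.

```python
# Illustrative sketch of an additive risk inventory in the style of SUDEP-7.
# Item names and weights are hypothetical placeholders, NOT the published items.
HYPOTHETICAL_ITEMS = {
    "gtc_seizure_past_year": 2,     # generalized tonic-clonic seizures, weighted higher
    "frequent_seizures": 1,
    "long_epilepsy_duration": 1,
    "multiple_antiepileptics": 1,
    "developmental_disability": 1,
}

def risk_score(answers: dict) -> int:
    """Sum the weights of all items answered True; a higher score means higher assumed risk."""
    return sum(w for item, w in HYPOTHETICAL_ITEMS.items() if answers.get(item))

patient = {"gtc_seizure_past_year": True, "multiple_antiepileptics": True}
print(risk_score(patient))  # prints 3
```

In practice the published instrument defines the exact items, weights, and thresholds; the point of the sketch is only the additive scoring structure that makes risk stratification straightforward.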
In conclusion, SUDEP continues to be a growing concern in the epilepsy population because of its associated mortality. Recent research has shed light on its pathophysiologic mechanisms, which in turn will help determine preventive techniques for vulnerable patients. Increased awareness of SUDEP will, it is hoped, make physicians more knowledgeable and comfortable in leading discussions of SUDEP. Further research using animal and human models is still needed to uncover the role of genetics.
Sources
Annegers JF, Coan SP, Hauser WA, Leestma J. Epilepsy, vagal nerve stimulation by the NCP system, all-cause mortality and sudden, unexpected, unexplained death. Epilepsia. 2000;41(5):549-553.
Dlouhy BJ, Gehlbach BK, Richerson GB. Sudden unexpected death in epilepsy: basic mechanisms and clinical implications for prevention. J Neurol Neurosurg Psychiatry. 2016;87(4):402-413.
Morse AM, Kothare SV. Pediatric sudden unexpected death in epilepsy. Pediatr Neurol. 2016;57:7-16.
Novak JL, Miller PR, Markovic D, Meymandi SK, DeGiorgio CM. Risk assessment for sudden death in epilepsy: the SUDEP-7 inventory. Front Neurol. 2015;6:252.
Pansani AP, Colugnati DB, Scorza CA, de Almeida AC, Cavalheiro EA, Scorza FA. Furthering our understanding of SUDEP: the role of animal models. Expert Rev Neurother. 2016;16(5):561-572.
RamachandranNair R, Jack SM, Strohm S. SUDEP: to discuss or not? recommendations from bereaved relatives. Epilepsy Behav. 2016;56:20-25.
Blood Pressure Trajectories May Affect Risk of Stroke and Mortality
Trajectories of blood pressure in mid to late life are associated with incident stroke and mortality, according to research published online ahead of print May 9 in Hypertension.
Most associations between blood pressure—a major modifiable risk factor for stroke—and incident stroke have been based on blood pressure measurements taken at a single time point.
Although long-term trajectories of blood pressure can vary considerably in the elderly, studies have not looked at the long-term blood pressure trajectories in mid to late life or at whether such trajectories relate to stroke, said M. Arfan Ikram, MD, PhD, senior study author and Associate Professor of Neuroepidemiology at Erasmus University Medical Center in Rotterdam, the Netherlands.
To identify long-term trajectories of blood pressure in a population-based study and examine the risk of stroke within those trajectories, Dr. Ikram and colleagues evaluated the course of systolic blood pressure in 6,745 participants within the Rotterdam Study.
Participants resided in Ommoord, a suburb of Rotterdam, and received baseline examinations starting in 1990. The investigators used data from five follow-up visits, which occurred every three to four years from 1990 to 2011. During each follow-up visit, blood pressure was measured twice in the right arm, in sitting position, after a resting period of five minutes. Researchers used the average of the two measurements. The investigators focused on systolic blood pressure because it is the best predictor of cardiovascular events.
Participants’ ages ranged from 55 to 106, and 60% were women. Participants had a mean follow-up of 13.5 years.
Four Trajectories
The investigators jointly modeled participants’ risk of stroke and competing causes of death using joint latent class mixed modeling. When assessing blood pressure trajectories, the researchers found that the joint latent class model with four trajectory classes had the best fit.
Class 1, the largest class, included 4,938 participants. It was characterized by a gradually increasing blood pressure, starting at an average of 120 mm Hg at age 55 and increasing to an average of 160 mm Hg at age 95. Class 2, with 822 participants, was characterized by a similar blood pressure at age 55, but a much steeper increase in blood pressure, to an average of 200 mm Hg. The two other classes were characterized by a relatively higher baseline blood pressure. In class 3 (870 participants), the average baseline blood pressure of 140 mm Hg had modest variation over time. In class 4 (115 participants), the average baseline blood pressure of 160 mm Hg decreased after age 65.
People in class 4 were more frequently men. Use of blood pressure-lowering medication was similar between classes at baseline. At the end of follow-up, classes 3 and 4 had higher proportions of blood pressure-lowering medication users. Frequency of current smokers varied between classes, with particularly higher frequencies in classes 2 and 4.
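A greatly simplified way to see how a participant's measurements might map onto these four classes is to compare them against each class's average trajectory. The sketch below linearly interpolates each class mean between ages 55 and 95 and picks the closest class by squared error; the age-95 anchor values for classes 3 and 4 are assumptions, and this nearest-mean comparison stands in for, rather than reproduces, the study's joint latent class mixed model.

```python
# Simplified nearest-trajectory assignment. Class means at ages 55 and 95 are
# taken or approximated from the article; the age-95 values for classes 3 and 4
# are assumptions, and linear interpolation stands in for the study's
# joint latent class mixed model.
CLASS_MEANS = {  # systolic BP (mm Hg) at (age 55, age 95)
    1: (120, 160),  # gradual increase
    2: (120, 200),  # steep increase
    3: (140, 140),  # higher baseline, modest variation (assumed flat)
    4: (160, 130),  # high baseline, decline after 65 (endpoint assumed)
}

def predicted_bp(cls: int, age: float) -> float:
    """Linearly interpolate the class-mean systolic BP between ages 55 and 95."""
    bp55, bp95 = CLASS_MEANS[cls]
    return bp55 + (bp95 - bp55) * (age - 55) / 40

def nearest_class(measurements: list) -> int:
    """Assign to the class minimizing squared error over (age, systolic BP) points."""
    def sse(cls):
        return sum((bp - predicted_bp(cls, age)) ** 2 for age, bp in measurements)
    return min(CLASS_MEANS, key=sse)

# A participant starting near 120 mm Hg whose pressure rises steeply:
print(nearest_class([(55, 122), (65, 145), (75, 165), (85, 185)]))  # prints 2
```

The design point is that class membership is determined by the whole trajectory, not by any single reading, which is why repeated measurements over follow-up visits matter.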
Groups’ Risk Varied
During the study period, 1,053 participants had a stroke. Researchers also studied the number of deaths that occurred from nonstroke health events. They adjusted for sex and baseline use of blood pressure-lowering medication.
Classes 2, 3, and 4 had a significantly and substantially higher risk of stroke, compared with class 1 (ie, 4.7% to 13.6% vs 0.7%). Classes 2 and 4 had the highest risk of dying of other causes. The risk of dying of other causes in class 3 was similar to that of class 1. The risk of stroke in class 3, however, continued to increase until older age and was highest overall.
In all, 2,546 people (51.5%) in class 1, 575 (70.0%) in class 2, 288 (33.1%) in class 3, and 87 (75.7%) in class 4 died of a nonstroke-related cause. Between 25% and 38% of nonstroke deaths were due to cardiovascular events.
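As a quick arithmetic cross-check, the reported percentages follow from the class sizes given earlier (4,938; 822; 870; and 115 participants) and these death counts:

```python
# Recompute the proportion of nonstroke deaths in each trajectory class
# from the class sizes and death counts reported in the article.
class_sizes = {1: 4938, 2: 822, 3: 870, 4: 115}
nonstroke_deaths = {1: 2546, 2: 575, 3: 288, 4: 87}

for cls in sorted(class_sizes):
    pct = 100 * nonstroke_deaths[cls] / class_sizes[cls]
    print(f"class {cls}: {pct:.1f}% died of nonstroke causes")
```

The recomputed values match the reported 51.5%, 70.0%, 33.1%, and 75.7% to within 0.1 percentage point.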
In multivariable-adjusted models that controlled for cholesterol, lipid-lowering medication, BMI, smoking, alcohol use, diabetes mellitus type 2, and antithrombotic medication, the results were relatively similar, the researchers said. The risk of stroke in classes 2 and 4 was attenuated by data adjustment, whereas the risk increased in class 3.
“Assessing trajectories of blood pressure provides a more nuanced understanding of the associations between blood pressure, stroke, and mortality,” the authors said.
The researchers noted that people in class 2 with steep increases in blood pressure might not receive effective treatment in time under current guidelines, and future studies could determine whether this class can be a target for prevention.
Effect of Slope
Prior studies that examined blood pressure trajectories in young to middle-aged people identified several parallel trajectories and found that long-term higher blood pressure related to more cardiovascular pathology.
“In our older population, we also observed that the class with a high mid-life blood pressure had the highest risk of stroke and death, compared to the class with the lowest blood pressure,” Dr. Ikram and colleagues said. “However, a novel finding of our study is that the slope of increase was associated with an increasing risk of stroke and competing causes of death. Namely, we identified two classes characterized by equally low baseline blood pressure and increasing trajectories, but only the class characterized by steep increases had a high risk of stroke and death. Of note, the risks in that class were even similar to the class with a high mid-life blood pressure.”
The large study population, the use of repeated measures of blood pressure over a long follow-up, and thorough collection of stroke assessments were among the study’s strengths. The study was not large enough to examine stroke subtypes, the authors said. In addition, the study’s population was geographically limited and mostly white, although the findings likely apply to people from other communities, Dr. Ikram said.
“Blood pressure should be measured regularly because it can change markedly over the course of a couple years and put you at high risk for an adverse event,” said Dr. Ikram. “Since the risks of stroke and death differ across these trajectory paths, they are potentially important for preventive strategies.”
—Jake Remaly
Suggested Reading
Portegies ML, Mirza SS, Verlinden VJ, et al. Mid- to late-life trajectories of blood pressure and the risk of stroke: the Rotterdam Study. Hypertension. 2016 May 9 [Epub ahead of print].
“In our older population, we also observed that the class with a high mid-life blood pressure had the highest risk of stroke and death, compared to the class with the lowest blood pressure,” Dr. Ikram and colleagues said. “However, a novel finding of our study is that the slope of increase was associated with an increasing risk of stroke and competing causes of death. Namely, we identified two classes characterized by equally low baseline blood pressure and increasing trajectories, but only the class characterized by steep increases had a high risk of stroke and death. Of note, the risks in that class were even similar to the class with a high mid-life blood pressure.”
The large study population, the use of repeated measures of blood pressure over a long follow-up, and thorough collection of stroke assessments were among the study’s strengths. The study was not large enough to examine stroke subtypes, the authors said. In addition, the study’s population was geographically limited and mostly white, although the findings likely apply to people from other communities, Dr. Ikram said.
“Blood pressure should be measured regularly because it can change markedly over the course of a couple years and put you at high risk for an adverse event,” said Dr. Ikram. “Since the risks of stroke and death differ across these trajectory paths, they are potentially important for preventive strategies.”
—Jake Remaly
Trajectories of blood pressure in mid to late life are associated with incident stroke and mortality, according to research published online ahead of print May 9 in Hypertension.
Most associations between blood pressure—a major modifiable risk factor for stroke—and incident stroke have been based on blood pressure measurements taken at a single time point.
Although long-term trajectories of blood pressure can vary considerably in the elderly, studies have not looked at the long-term blood pressure trajectories in mid to late life or at whether such trajectories relate to stroke, said M. Arfan Ikram, MD, PhD, senior study author and Associate Professor of Neuroepidemiology at Erasmus University Medical Center in Rotterdam, the Netherlands.
To identify long-term trajectories of blood pressure in a population-based study and examine the risk of stroke within those trajectories, Dr. Ikram and colleagues evaluated the course of systolic blood pressure in 6,745 participants within the Rotterdam Study.
Participants resided in Ommoord, a suburb of Rotterdam, and received baseline examinations starting in 1990. The investigators used data from five follow-up visits, which occurred every three to four years from 1990 to 2011. At each visit, blood pressure was measured twice in the right arm, with the participant seated, after a five-minute rest, and researchers used the average of the two measurements. The investigators focused on systolic blood pressure because it is the best predictor of cardiovascular events.
Participants’ ages ranged from 55 to 106, and 60% were women. Participants had a mean follow-up of 13.5 years.
Four Trajectories
The investigators jointly modeled participants’ risk of stroke and competing causes of death using joint latent class mixed modeling. When assessing blood pressure trajectories, the researchers found that the joint latent class model with four trajectory classes had the best fit.
Class 1, the largest class, included 4,938 participants. It was characterized by a gradually increasing blood pressure, starting at an average of 120 mm Hg at age 55 and increasing to an average of 160 mm Hg at age 95. Class 2, with 822 participants, was characterized by a similar blood pressure at age 55, but a much steeper increase in blood pressure, to an average of 200 mm Hg. The two other classes were characterized by a relatively higher baseline blood pressure. In class 3 (870 participants), the average baseline blood pressure of 140 mm Hg had modest variation over time. In class 4 (115 patients), the average baseline blood pressure of 160 mm Hg decreased after age 65.
People in class 4 were more frequently men. Use of blood pressure-lowering medication was similar between classes at baseline. At the end of follow-up, classes 3 and 4 had higher proportions of blood pressure-lowering medication users. Frequency of current smokers varied between classes, with particularly higher frequencies in classes 2 and 4.
Groups’ Risk Varied
During the study period, 1,053 participants had a stroke. Researchers also studied the number of deaths from nonstroke causes. They adjusted for sex and baseline use of blood pressure-lowering medication.
Classes 2, 3, and 4 had a significantly and substantially higher risk of stroke, compared with class 1 (ie, 4.7% to 13.6% vs 0.7%). Classes 2 and 4 had the highest risk of dying of other causes. The risk of dying of other causes in class 3 was similar to that of class 1. The risk of stroke in class 3, however, continued to increase until older age and was highest overall.
In all, 2,546 people (51.5%) in class 1, 575 (70.0%) people in class 2, 288 (33.1%) people in class 3, and 87 (75.7%) people in class 4 died due to a nonstroke-related cause. Between 25% and 38% of nonstroke deaths were due to cardiovascular events.
In multivariable-adjusted models that controlled for cholesterol, lipid-lowering medication, BMI, smoking, alcohol use, diabetes mellitus type 2, and antithrombotic medication, the results were relatively similar, the researchers said. The risk of stroke in classes 2 and 4 was attenuated by data adjustment, whereas the risk increased in class 3.
“Assessing trajectories of blood pressure provides a more nuanced understanding of the associations between blood pressure, stroke, and mortality,” the authors said.
The researchers noted that people in class 2 with steep increases in blood pressure might not receive effective treatment in time under current guidelines, and future studies could determine whether this class can be a target for prevention.
Effect of Slope
Prior studies that examined blood pressure trajectories in young to middle-aged people identified several parallel trajectories and found that long-term higher blood pressure related to more cardiovascular pathology.
“In our older population, we also observed that the class with a high mid-life blood pressure had the highest risk of stroke and death, compared to the class with the lowest blood pressure,” Dr. Ikram and colleagues said. “However, a novel finding of our study is that the slope of increase was associated with an increasing risk of stroke and competing causes of death. Namely, we identified two classes characterized by equally low baseline blood pressure and increasing trajectories, but only the class characterized by steep increases had a high risk of stroke and death. Of note, the risks in that class were even similar to the class with a high mid-life blood pressure.”
The large study population, the use of repeated measures of blood pressure over a long follow-up, and thorough collection of stroke assessments were among the study’s strengths. The study was not large enough to examine stroke subtypes, the authors said. In addition, the study’s population was geographically limited and mostly white, although the findings likely apply to people from other communities, Dr. Ikram said.
“Blood pressure should be measured regularly because it can change markedly over the course of a couple years and put you at high risk for an adverse event,” said Dr. Ikram. “Since the risks of stroke and death differ across these trajectory paths, they are potentially important for preventive strategies.”
—Jake Remaly
Suggested Reading
Portegies ML, Mirza SS, Verlinden VJ, et al. Mid- to late-life trajectories of blood pressure and the risk of stroke: the Rotterdam Study. Hypertension. 2016 May 9 [Epub ahead of print].
Age, lower baseline ALC increase dimethyl fumarate lymphopenia risk
VANCOUVER – The risk of dimethyl fumarate lymphopenia – and perhaps progressive multifocal leukoencephalopathy – is greatest in patients 60 years or older and those with baseline absolute lymphocyte counts below 2 × 10⁹/L, according to a review of 206 patients with relapsing-remitting or progressive multiple sclerosis from the University of Rochester (N.Y.).
A total of 87 patients (42%), all of whom were on dimethyl fumarate (DMF; Tecfidera) for at least 3 months, developed lymphopenia with an absolute lymphocyte count (ALC) below 0.91 × 10⁹/L. That’s not a surprise; lymphopenia is a well-known side effect of the drug, and the rates in Rochester were similar to what was reported in clinical trials. The greatest concern with DMF lymphopenia is subsequent progressive multifocal leukoencephalopathy (PML); a handful of cases have been reported in lymphopenic patients, none in the University of Rochester review.
What was surprising was that in the 34 patients aged 60 years or older, 24 (71%) developed lymphopenia, versus 62 (36%) of the 172 under 60 years old (P = .0005). Meanwhile, of 93 patients with baseline ALCs below 2 × 10⁹/L, 49 (53%) became lymphopenic, versus 34 of 104 patients (33%) who started DMF with higher lymphocyte counts (P = .0006). A total of nine patients in the study did not have a baseline ALC available.
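The group comparisons above can be sanity-checked with a chi-square test on the 2×2 counts. The sketch below uses only the Python standard library and Yates’ continuity correction; the investigators’ exact statistical method is not stated in the report, so these P values approximate, rather than reproduce, the reported figures.

```python
import math

def yates_chi2_p(a, b, c, d):
    """Two-sided P value for a 2x2 table [[a, b], [c, d]] via a
    chi-square test with Yates' continuity correction (1 df)."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        exp = row * col / n
        chi2 += (abs(obs - exp) - 0.5) ** 2 / exp
    # Survival function of the chi-square distribution with 1 df
    return math.erfc(math.sqrt(chi2 / 2))

# Age: 24/34 lymphopenic at >= 60 years vs 62/172 under 60
p_age = yates_chi2_p(24, 10, 62, 110)
# Baseline ALC: 49/93 below 2 x 10^9/L vs 34/104 at or above
p_alc = yates_chi2_p(49, 44, 34, 70)
print(p_age, p_alc)  # both well below .05, consistent with the report
```

Both comparisons remain highly significant under this simple test, which supports the reported associations even if the exact P values differ slightly from the published figures.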
“If I had a patient who was 70 years old with a low baseline lymphocyte count, [these findings] would weigh into my decisions about choosing” this medication. “Age and baseline ALC may guide future selection of patients for DMF therapy,” neurologist and investigator Dr. Jessica Robb said at the annual meeting of the American Academy of Neurology.
Also, because higher grade lymphopenia didn’t resolve in most cases until the drug was stopped, “if I had a patient who developed more severe grade 3 or 4 lymphopenia, I would probably have a lower threshold for” discontinuation. “I would probably think about changing medication more quickly rather than leaving them on [DMF] and hoping that their lymphopenia resolves,” Dr. Robb said.
The Rochester findings are in line with a 2015 report from Washington University, St. Louis, that also indicated a higher risk of moderate to severe lymphopenia in older patients and those with lower baseline ALCs, as well as recent natalizumab (Tysabri) users. Grade 2 or worse lymphopenia “is unlikely to resolve while on the drug,” the St. Louis investigators concluded (Mult Scler J Exp Transl Clin. 2015 Jan-Dec;1:2055217315596994).
Taken together, the two studies are important because there’s otherwise not much else in the medical literature identifying DMF lymphopenia risk factors. Lymphopenia and PML are also concerns with other multiple sclerosis (MS) agents.
“The increased prevalence of lymphopenia in older patients and in patients with a lower baseline ALC suggests a failure of lymphopoiesis triggered by DMF therapy. Indeed, lymphopoiesis declines with age due to thymic involution and decreased production of naive lymphocytes. ... Whether these consequences of normal aging could be amplified by DMF is an avenue for future study,” the St. Louis team said.
“The significance of increased risk for lymphopenia in patients recently exposed to natalizumab is not immediately obvious. ... Natalizumab is known to expand circulating leukocytes, including progenitor cells. If in turn, DMF causes lymphocyte apoptosis or arrest of differentiation, then patients sequentially exposed to natalizumab and DMF might have a larger number of circulating lymphocytes vulnerable to DMF effects than other patients,” they said.
Food and Drug Administration labeling for DMF recommends lymphocyte counts at baseline, 6 months, and every 6-12 months thereafter. However, European regulators recently recommended lymphocyte counts at baseline and every 3 months to catch problems early, as well as baseline MRIs as references for possible PML.
Standard, 240-mg twice-daily dosing was used at the University of Rochester, and the mean age in the study was 49 years. The majority of patients were women, and the mean duration of MS was 11 years. Almost three-quarters of the patients were new to immunosuppression, and none of the patients developed serious infections.
The University of Rochester team noted a higher rate of grade 1 lymphopenia than reported in clinical trials (18% vs. 10%). Twelve patients (6%) discontinued DMF because of lymphopenia.
Dr. Robb and the other investigators had no relevant disclosures.
AT THE AAN 2016 ANNUAL MEETING
Key clinical point: Dimethyl fumarate is probably not the best option for older patients with lower baseline lymphocyte counts.
Major finding: Among 34 patients aged 60 years or older, 24 (71%) developed lymphopenia, versus 62 (36%) of the 172 under 60 years old (P = .0005). Meanwhile, of 93 patients with baseline ALCs below 2 × 10⁹/L, 49 (53%) became lymphopenic, versus 34 of 104 patients (33%) who started DMF with higher lymphocyte counts (P = .0006).
Data source: Review of 206 patients with relapsing-remitting or progressive multiple sclerosis.
Disclosures: The investigators had no disclosures.
Keys to Success on the Focused Practice in Hospital Medicine Exam
- Enroll in the Focused Practice in Hospital Medicine MOC program by August 1 at www.abim.org.
- Schedule a seat for the exam before August 15 at www.abim.org.
- Order SHM SPARK, the missing piece of the MOC exam-prep puzzle.
SHM recently developed the only MOC exam-preparation tool by hospitalists for hospitalists, SHM SPARK. It complements tools already on the market and will help hospitalists succeed on the upcoming exam. SHM SPARK delivers 175 vignette-style multiple-choice questions that bridge the primary knowledge gaps found within existing MOC exam-preparation products and provides in-depth review on:
- Palliative care, ethics, and decision making
- Patient safety
- Perioperative care and consultative co-management
- Quality, cost, and clinical reasoning
SHM SPARK offers detailed learning objectives and discussion points and allows users to identify their individual areas of strength and weakness. Upon completing all four modules with a minimum passing score of 80%, users can claim 58 ABIM MOC Medical Knowledge points and up to 10.5 AMA PRA Category 1 credits.
Update on the Interstate Medical Licensure Compact
In 2014, the Society of Hospital Medicine endorsed the Interstate Medical Licensure Compact as a way to address divergent physician licensing requirements among states. The thrust of SHM’s reasoning was that differing licensing policies across state lines not only hinder the ability of hospitalists to quickly adjust staffing to meet the needs of hospitals and patients but also create extensive, costly, and often redundant administrative hurdles for individual hospitalists and hospital medicine groups. For hospitalists looking to relocate to another state, practice in multiple states, provide telemedicine services, or even take on some per diem work, the Interstate Medical Licensure Compact should be of great help.
To briefly summarize, states participating in the compact agree to share information with one another and work together in streamlining the licensing process. For example, the compact aims to reduce redundant licensing requirements by creating one place where physicians submit basic information such as their education credentials. The compact does not establish a national license; a license to practice medicine will still be issued by individual state medical boards. Physicians will still need to be licensed in the state where the patient is located, but the difference is that the process of obtaining a license will be streamlined significantly.
To join the Interstate Medical Licensure Compact, state legislatures must enact the compact into state law. Two years in, the compact is now being implemented in 12 states: Alabama, Idaho, Illinois, Iowa, Minnesota, Montana, Nevada, South Dakota, Utah, West Virginia, Wisconsin, and Wyoming. States where it has been introduced but not yet adopted include Alaska, Arizona, Colorado, Kansas, Maryland, Michigan, Mississippi, Nebraska, New Hampshire, Oklahoma, Pennsylvania, Rhode Island, Vermont, and Washington.
Licenses via the compact process are not currently being issued, but representatives from the 12 participating states have begun to formally meet and are working out the administrative procedures needed to begin expedited licensure processes. With a core group of states adopting and implementing the compact, it will be important for state officials to hear why adoption of the compact is important to physicians.
This presents an opportunity for hospitalists residing in holdout states to participate in some advocacy work at the state level—on their own, as a group, or even within one of SHM’s many state chapters. To find your local chapter and get involved, visit www.hospitalmedicine.org/chapters.
To assist, detailed information on the Interstate Medical Licensure Compact can be found at www.licenseportability.org, and SHM advocacy staff is available to address questions members may have about getting started. You can reach them via email at [email protected]. TH
Josh Boswell is SHM’s director of government affairs.
In 2014, the Society of Hospital Medicine endorsed the Interstate Medical Licensure Compact as a way to address divergent physician licensing requirements among states. The thrust of SHM’s reasoning was that differing licensing policies across state lines not only hinder the ability of hospitalists to quickly adjust staffing to meet the needs of hospitals and patients but also create extensive, costly, and often redundant administrative hurdles for individual hospitalists and hospital medicine groups. For hospitalists looking to relocate to another state, practice in multiple states, provide telemedicine services, or even take on some per diem work, the Interstate Medical Licensure Compact should be of great help.
To briefly summarize, states participating in the compact agree to share information with one another and work together in streamlining the licensing process. For example, the compact aims to reduce redundant licensing requirements by creating one place where physicians submit basic information such as their education credentials. The compact does not establish a national license; a license to practice medicine will still be issued by individual state medical boards. Physicians will still need to be licensed in the state where the patient is located, but the difference is that the process of obtaining a license will be streamlined significantly.
To join the Interstate Medical Licensure Compact, state legislatures must enact the compact into state law. Two years in, the compact is now being implemented in 12 states: Alabama, Idaho, Illinois, Iowa, Minnesota, Montana, Nevada, South Dakota, Utah, West Virginia, Wisconsin, and Wyoming. States where it has been introduced but not yet adopted include Alaska, Arizona, Colorado, Kansas, Maryland, Michigan, Mississippi, Nebraska, New Hampshire, Oklahoma, Pennsylvania, Rhode Island, Vermont, and Washington.
Licenses via the compact process are not currently being issued, but representatives from the 12 participating states have begun to formally meet and are working out the administrative procedures needed to begin expedited licensure processes. With a core group of states adopting and implementing the compact, it will be important for state officials to hear why adoption of the compact is important to physicians.
This presents an opportunity for hospitalists residing in holdout states to participate in some advocacy work at the state level—on their own, as a group, or even within one of SHM’s many state chapters. To find your local chapter and get involved, visit www.hospitalmedicine.org/chapters.
To assist, detailed information on the Interstate Medical Licensure Compact can be found at www.licenseportability.org, and SHM advocacy staff is available to address questions members may have about getting started. You can reach them via email at [email protected]. TH
Josh Boswell is SHM’s director of government affairs.
FDA approves CMV test for use in HSCT recipients

The US Food and Drug Administration (FDA) has approved the first cytomegalovirus (CMV) test for use in hematopoietic stem cell transplant (HSCT) recipients.
With this approval, the COBAS® AmpliPrep/COBAS® TaqMan® CMV Test is available for monitoring CMV treatment in all types of transplant patients in the US.
The test, which was developed by Roche, is an in vitro nucleic acid amplification test that quantitates CMV DNA in human plasma.
It is intended to aid the management of HSCT recipients and solid-organ transplant recipients who are undergoing anti-CMV therapy.
In this population, serial DNA measurements can be used to assess virological response to antiviral treatment. The results from the test must be interpreted within the context of all relevant clinical and laboratory findings.
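As an illustrative sketch (with hypothetical viral-load values, not data from the test's labeling), virological response across serial measurements is commonly summarized as the log10 change in CMV DNA concentration:

```python
import math

def log10_change(baseline_iu_ml, follow_up_iu_ml):
    """Log10 change in viral load between two serial measurements.
    Negative values indicate a decline (response to therapy)."""
    return math.log10(follow_up_iu_ml) - math.log10(baseline_iu_ml)

# Hypothetical serial CMV DNA measurements (IU/mL) during anti-CMV therapy.
print(round(log10_change(50_000, 500), 1))  # -2.0, i.e., a 2-log drop
```

Interpretation thresholds (how large a drop counts as response) are a clinical judgment made in context, as the article notes.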
The COBAS® AmpliPrep/COBAS® TaqMan® CMV Test is not intended for use as a screening test for the presence of CMV DNA in blood or blood products.
The test is designed for use on the automated COBAS® AmpliPrep/COBAS® TaqMan® System, an established platform for viral load monitoring of multiple infectious diseases.
The system combines the COBAS® AmpliPrep Instrument for automated sample preparation and the COBAS® TaqMan® Analyzer or the smaller COBAS® TaqMan® 48 Analyzer for automated real-time PCR amplification and detection.
The COBAS® AmpliPrep/COBAS® TaqMan® System supports parallel processing with molecular diagnostic assays targeting other diseases. Roche’s AmpErase enzyme is also included in each test and is designed to prevent cross-contamination of samples and laboratories.
Why patients don’t report possible cancer symptoms

Worrying about wasting their doctor’s time is stopping people from reporting symptoms that might be related to cancer, according to a small study published in the British Journal of General Practice.
The goal of the study was to determine why some people are more likely than others to worry about wasting a general practitioner’s (GP’s) time and delay reporting possible cancer symptoms.
“People worrying about wasting their doctor’s time is one of the challenges we need to tackle when thinking about trying to diagnose cancer earlier,” said study author Katriina Whitaker, PhD, of the University of Surrey in the UK.
“We need to get to the root of the problem and find out why people are feeling worried. Not a lot of work has been done on this so far. Our study draws attention to some reasons patients put off going to their GP to check out possible cancer symptoms.”
For this study, Dr Whitaker and her colleagues conducted interviews with subjects in London, South East England, and North West England.
The subjects were recruited from a sample of 2042 adults, age 50 and older, who completed a survey that included a list of “cancer alarm symptoms.”
Ultimately, the researchers interviewed 62 subjects who had reported symptoms at baseline, whose symptoms were still present at the 3-month follow-up, and who had agreed to be contacted.
The interviews revealed a few reasons why subjects were hesitant to report symptoms to their GP.
Some subjects felt that long waiting times for appointments indicated GPs were very busy, so they shouldn’t bother making an appointment unless symptoms seemed very serious.
Other subjects felt that seeking help for symptoms that did not seem serious—ie, were not persistent, worsening, or life-threatening—was a waste of a doctor’s time.
Still other subjects were hesitant to seek help because their doctors had been dismissive about symptoms in the past.
On the other hand, subjects who reported positive interactions with GPs or good relationships with them were less worried about time-wasting.
And other subjects weren’t worried about wasting their doctor’s time because they thought of GPs as fulfilling a service financed by taxpayers.
“We’ve all had times where we’ve wondered if we should go to see a GP, but getting unusual or persistent changes checked out is really important,” said Julie Sharp, head of health and patient information at Cancer Research UK, which funded this study.
“Worrying about wasting a GP’s time should not put people off. Doctors are there to help spot cancer symptoms early when treatment is more likely to be successful, and delaying a visit could save up bigger problems for later. So if you’ve noticed anything that isn’t normal for you, make an appointment to see your doctor.”
Anemia hinders recovery from TBIs

Recent studies have suggested that roughly half of patients hospitalized with traumatic brain injuries (TBIs) are anemic, but it hasn’t been clear how the anemia affects patients’ recovery.
Now, researchers have found evidence suggesting that low hemoglobin levels can negatively influence the outcomes of patients with TBIs.
The team detailed this evidence in a paper published in World Neurosurgery.
“More research is needed to develop treatment protocols for anemic patients with traumatic brain injuries,” said study author N. Scott Litofsky, MD, of the University of Missouri School of Medicine in Columbia.
“There has been a lack of consensus among physicians regarding the relationship of anemia and traumatic brain injuries on a patient’s health. Because of this uncertainty, treatment protocols are unclear and inconsistent. Our observational study found that a patient’s outcome is worse when he or she is anemic.”
The researchers studied 939 TBI patients with anemia who were admitted to a Level I trauma center.
The team assessed the relationships between patients’ initial hemoglobin level and lowest hemoglobin level during hospitalization at threshold values of ≤7, ≤8, ≤9, and ≤10 g/dL relative to their Glasgow Outcome Score within a year of surgery.
The data suggested that both initial hemoglobin levels and lowest hemoglobin levels were independent predictors of poor outcome (P<0.0001).
For each increase in initial hemoglobin level of 1 g/dL, the odds of a patient achieving a good outcome increased by 32%. For each increase in lowest hemoglobin level of 1 g/dL, the probability of a good outcome increased by 35.6%.
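As a back-of-the-envelope illustration (not the study’s regression model, and using a hypothetical 50% baseline probability), an odds ratio of 1.32 per 1 g/dL can be translated into a probability shift like this:

```python
def update_probability(p_baseline, odds_ratio):
    """Apply an odds ratio to a baseline probability and
    return the updated probability."""
    odds = p_baseline / (1 - p_baseline)  # convert probability to odds
    new_odds = odds * odds_ratio          # apply the per-unit odds ratio
    return new_odds / (1 + new_odds)      # convert back to a probability

# Hypothetical baseline: a 50% chance of a good outcome.
p0 = 0.50
# Reported odds ratio per 1 g/dL increase in initial hemoglobin.
OR_PER_G_DL = 1.32

# One additional g/dL of hemoglobin shifts the probability of a good outcome:
p1 = update_probability(p0, OR_PER_G_DL)
print(round(p1, 3))  # 0.569
```

Note that a 32% increase in odds is not the same as a 32% increase in probability; the gap between the two grows as the baseline probability moves away from 50%.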
Female patients had worse outcomes than male patients if their initial hemoglobin levels were between 7 g/dL and 8 g/dL (P<0.05).
And receiving a blood transfusion was associated with poorer outcomes at hemoglobin levels ≤9 g/dL and ≤10 g/dL (P<0.05) but not at the lower hemoglobin thresholds.
The researchers said these data suggest clinicians may want to consider giving blood transfusions in TBI patients with hemoglobin levels of 8 g/dL or lower.
However, Dr Litofsky noted that the purpose of this study was not to propose transfusion guidelines. It was to determine the effects of anemia on TBI outcomes.
“Now that we have shown that anemia affects a patient’s recovery, further studies are needed to determine the best way to correct it,” he said. “The ultimate goal of this research is to help patients recover more quickly from traumatic brain injuries.”
Study: rFVIII increases risk of inhibitors

The source of factor VIII (FVIII) replacement therapy affects the risk of inhibitor development in previously untreated patients with severe hemophilia A, according to the SIPPET study.
The data indicated that receiving recombinant FVIII (rFVIII) is associated with a nearly 2-fold higher risk of developing inhibitory alloantibodies than receiving plasma-derived FVIII.
Flora Peyvandi, MD, PhD, of Angelo Bianchi Bonomi Hemophilia and Thrombosis Center in Milan, Italy, and her colleagues reported this discovery in NEJM.
Dr Peyvandi previously presented results from the SIPPET study at the 2015 ASH Annual Meeting.
The study included 251 patients (all males) who were younger than age 6 at enrollment. They had severe hemophilia A, negative inhibitor measurement at enrollment, and no or minimal exposure (less than 5 exposure days) to blood products.
The patients were randomized to either a single plasma-derived FVIII product containing von Willebrand factor (n=125) or a single rFVIII product (n=126). The treatment was at the discretion of the local physician.
Confounders—such as family history, previous exposure, and surgery—were equally distributed between the treatment arms. The same was true for the treatment type—on-demand, standard prophylaxis, etc.
Patients were treated for 50 exposure days, 3 years, or until inhibitor development. They were assessed every 3 to 5 exposure days in the first 20 exposure days, then every 10 exposure days or every 3 months, and every 2 weeks during prophylaxis.
Results
The primary outcome was any FVIII inhibitor at titers ≥ 0.4 BU/mL. High-titer inhibitors (≥ 5 BU/mL) were a secondary outcome. Transient inhibitors were defined as those that spontaneously disappeared within 6 months.
Overall, 76 patients developed inhibitors, for a cumulative incidence of 35.4%. Fifty patients had high-titer inhibitors, for a cumulative incidence of 23.3%.
The cumulative incidence of all inhibitors was 44.5% (n=47) in the rFVIII arm and 26.8% (n=29) in the plasma-derived FVIII arm. The cumulative incidence of high-titer inhibitors was 28.4% (n=30) and 18.6% (n=20), respectively.
More than 73% of all inhibitors were non-transient in both arms.
By univariate Cox regression analysis, rFVIII was associated with an 87% higher incidence of inhibitors than plasma-derived FVIII (hazard ratio [HR]=1.87). And rFVIII was associated with a 69% higher incidence of high-titer inhibitors (HR=1.69).
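For context, here is a hedged sketch of how a crude risk ratio computed directly from the reported cumulative incidences compares with the Cox hazard ratio; the two need not match, because the Cox model accounts for time at risk and censoring:

```python
# Cumulative incidence of any inhibitor, as reported in the SIPPET results.
ci_rfviii = 0.445    # recombinant FVIII arm
ci_pdfviii = 0.268   # plasma-derived FVIII arm

# Crude risk ratio from the cumulative incidences alone.
crude_rr = ci_rfviii / ci_pdfviii
print(round(crude_rr, 2))  # 1.66

# The univariate Cox regression, which adjusts for follow-up time,
# yielded a somewhat higher estimate.
cox_hr = 1.87
```

The gap between the crude ratio and the Cox estimate is expected whenever follow-up differs between arms; the hazard ratio is the more appropriate summary for time-to-event data.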
A previous study published in NEJM in 2013 suggested that second-generation, full-length FVIII products are associated with an increased risk of inhibitor development when compared to third-generation FVIII products.
So Dr Peyvandi and her colleagues stopped using second-generation FVIII products during the course of the SIPPET study. And they adjusted their analysis to ensure their observations were not due to any confounding effects of the products.
After excluding second-generation, full-length rFVIII from their analysis, the researchers still observed an increased risk of inhibitor development with rFVIII. The HRs were 1.98 for all inhibitors and 2.59 for high-titer inhibitors.

The source of factor VIII (FVIII) replacement therapy affects the risk of inhibitor development in previously untreated patients with severe hemophilia A, according to the SIPPET study.
The data indicated that receiving recombinant FVIII (rFVIII) is associated with a nearly 2-fold higher risk of developing inhibitory alloantibodies than receiving plasma-derived FVIII.
Flora Peyvandi, MD, PhD, of Angelo Bianchi Bonomi Hemophilia and Thrombosis Center in Milan, Italy, and her colleagues reported this discovery in NEJM.
Dr Peyvandi previously presented results from the SIPPET at the 2015 ASH Annual Meeting.
The study included 251 patients (all males) who were younger than age 6 at enrollment. They had severe hemophilia A, negative inhibitor measurement at enrollment, and no or minimal exposure (less than 5 exposure days) to blood products.
The patients were randomized to either a single plasma-derived FVIII product containing von Willebrand factor (n=125) or a single rFVIII product (n=126). The treatment was at the discretion of the local physician.
Confounders—such as family history, previous exposure, and surgery—were equally distributed between the treatment arms. The same was true for the treatment type—on-demand, standard prophylaxis, etc.
Patients were treated for 50 exposure days, 3 years, or until inhibitor development. They were assessed every 3 to 5 exposure days in the first 20 exposure days, then every 10 exposure days or every 3 months and every 2 weeks during prophylaxis.
Results
The primary outcome was any FVIII inhibitor at titers ≥ 0.4 BU/mL. High-titer inhibitors (≥ 5 BU/mL) were a secondary outcome. Transient inhibitors were defined as those that spontaneously disappeared within 6 months.
Overall, 76 patients developed inhibitors, for a cumulative incidence of 35.4%. Fifty patients had high-titer inhibitors, for a cumulative incidence of 23.3%.
The cumulative incidence of all inhibitors was 44.5% (n=47) in the rFVIII arm and 26.8% (n=29) in the plasma-derived FVIII arm. The cumulative incidence of high-titer inhibitors was 28.4% (n=30) and 18.6% (n=20), respectively.
More than 73% of all inhibitors were non-transient in both arms.
By univariate Cox regression analysis, rFVIII was associated with an 87% higher incidence of inhibitors than plasma-derived FVIII (hazard ratio [HR]=1.87). And rFVIII was associated with a 69% higher incidence of high-titer inhibitors (HR=1.69).
A previous study published in NEJM in 2013 suggested that second-generation, full-length FVIII products are associated with an increased risk of inhibitor development when compared to third-generation FVIII products.
So Dr Peyvandi and her colleagues stopped using second-generation FVIII products during the course of the SIPPET study. And they adjusted their analysis to ensure their observations were not due to any confounding effects of the products.
After excluding second-generation, full-length rFVIII from their analysis, the researchers still observed an increased risk of inhibitor development with rFVIII. The HRs were 1.98 for all inhibitors and 2.59 for high-titer inhibitors. ![]()

The source of factor VIII (FVIII) replacement therapy affects the risk of inhibitor development in previously untreated patients with severe hemophilia A, according to the SIPPET study.
The data indicated that receiving recombinant FVIII (rFVIII) is associated with a nearly 2-fold higher risk of developing inhibitory alloantibodies than receiving plasma-derived FVIII.
Flora Peyvandi, MD, PhD, of Angelo Bianchi Bonomi Hemophilia and Thrombosis Center in Milan, Italy, and her colleagues reported this discovery in NEJM.
Dr Peyvandi previously presented results from the SIPPET study at the 2015 ASH Annual Meeting.
The study included 251 patients (all males) who were younger than age 6 at enrollment. They had severe hemophilia A, negative inhibitor measurement at enrollment, and no or minimal exposure (less than 5 exposure days) to blood products.
The patients were randomized to either a single plasma-derived FVIII product containing von Willebrand factor (n=125) or a single rFVIII product (n=126). The treatment was at the discretion of the local physician.
Confounders—such as family history, previous exposure, and surgery—were equally distributed between the treatment arms. The same was true for the treatment type—on-demand, standard prophylaxis, etc.
Patients were treated for 50 exposure days, 3 years, or until inhibitor development. They were assessed every 3 to 5 exposure days during the first 20 exposure days, then every 10 exposure days or every 3 months, and every 2 weeks during prophylaxis.
Results
The primary outcome was any FVIII inhibitor at titers ≥ 0.4 BU/mL. High-titer inhibitors (≥ 5 BU/mL) were a secondary outcome. Transient inhibitors were defined as those that spontaneously disappeared within 6 months.
Overall, 76 patients developed inhibitors, for a cumulative incidence of 35.4%. Fifty patients had high-titer inhibitors, for a cumulative incidence of 23.3%.
The cumulative incidence of all inhibitors was 44.5% (n=47) in the rFVIII arm and 26.8% (n=29) in the plasma-derived FVIII arm. The cumulative incidence of high-titer inhibitors was 28.4% (n=30) and 18.6% (n=20), respectively.
More than 73% of all inhibitors were non-transient in both arms.
By univariate Cox regression analysis, rFVIII was associated with an 87% higher incidence of inhibitors than plasma-derived FVIII (hazard ratio [HR]=1.87) and with a 69% higher incidence of high-titer inhibitors (HR=1.69).
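The conversion from a hazard ratio to the "% higher incidence" phrasing used here is straightforward; a minimal sketch (figures taken from the results quoted above):

```python
def pct_higher(hr):
    """Percent increase in incidence implied by a hazard ratio > 1."""
    return (hr - 1) * 100

print(pct_higher(1.87))  # all inhibitors: 87% higher with rFVIII
print(pct_higher(1.69))  # high-titer inhibitors: 69% higher
```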
A previous study published in NEJM in 2013 suggested that second-generation, full-length FVIII products are associated with an increased risk of inhibitor development when compared to third-generation FVIII products.
Dr Peyvandi and her colleagues therefore stopped using second-generation FVIII products during the course of the SIPPET study and adjusted their analysis to ensure their observations were not due to any confounding effects of these products.
After excluding second-generation, full-length rFVIII from their analysis, the researchers still observed an increased risk of inhibitor development with rFVIII. The HRs were 1.98 for all inhibitors and 2.59 for high-titer inhibitors.
“Go Low” or “Say No” to Aggressive Systolic BP Goals?
PRACTICE CHANGER
Consider treating nondiabetic patients ages 50 and older to a systolic blood pressure (SBP) target < 120 mm Hg (as compared to < 140 mm Hg) when the benefits—lower rates of fatal and nonfatal cardiovascular (CV) events and death from any cause—are likely to outweigh the risks from possible additional medication.1
Strength of Recommendation
B: Based on a single, good-quality randomized controlled trial (RCT).1
A 55-year-old man with hypertension and stage 3 chronic kidney disease (CKD) presents for routine care. His blood pressure is 135/85 mm Hg, and he is currently taking lisinopril 40 mg/d. Should you increase his antihypertensive regimen?
Hypertension is common and leads to significant morbidity and mortality, but pharmacologic treatment reduces the incidence of stroke by 35% to 40%, myocardial infarction (MI) by 15% to 25%, and heart failure by up to 64%.2-4 Specific blood pressure targets for defined populations continue to be studied.
The ACCORD (Action to Control Cardiovascular Risk in Diabetes) trial found that more intensive BP targets did not reduce the rate of major CV events in patients with diabetes, but the study may have been underpowered.5 The members of the Eighth Joint National Committee (JNC 8) recommended treating patients older than 60 to BP goals < 150/90 mm Hg.6 This was based on evidence from six RCTs, but there remains debate—even among the JNC 8 committee members—as to appropriate BP goals in patients of any age without CV disease who have BP measurements of 140-159/90-99 mm Hg.7-13
STUDY SUMMARY
Treating to SBP < 120 mm Hg lowers mortality
The Systolic Blood Pressure Intervention Trial (SPRINT) was a multicenter RCT designed to determine if treating to lower SBP targets in nondiabetic patients at high risk for CV events improves outcomes, compared with standard care. Patients were at least 50, had an SBP of 130 to 180 mm Hg, and were at increased CV risk; the last was defined as clinical or subclinical CV disease other than stroke; CKD with a glomerular filtration rate (GFR) of 20 to 60 mL/min/1.73 m²; 10-year risk for CV disease > 15% on Framingham risk score; or age 75 or older. Patients with diabetes, prior stroke, polycystic kidney disease, significant proteinuria or symptomatic heart failure within the past six months, or left ventricular ejection fraction < 35% were excluded.1
Patients (N = 9,361) were randomly assigned to an SBP target < 120 mm Hg in the intensive group or < 140 mm Hg in the standard treatment group, in an open-label design. Allocation was concealed. The study protocol encouraged, but did not require, the use of thiazide-type diuretics, loop diuretics (for those with advanced renal disease), ACE inhibitors or angiotensin receptor blockers, calcium channel blockers, and beta-blockers. Clinicians could add other agents as needed. All major classes of antihypertensives were used.
Medication dosing adjustments were based on the average of three BP measurements taken with an automated measurement system with the patient seated after 5 minutes of quiet rest. Target SBP in the standard therapy group was 135 to 139 mm Hg. Medication dosages were lowered if SBP was < 130 mm Hg at a single visit or < 135 mm Hg at two consecutive visits.1
The primary composite outcome included the first occurrence of MI, acute coronary syndrome, stroke, heart failure, or death from CV causes. Secondary outcomes were the individual components of the primary composite outcome; death from any cause; and the composite of the primary outcome or death from any cause.1
Study halted early. The study was stopped early due to significantly lower rates of the primary outcome in the intensive therapy group versus the standard therapy group (1.65% vs 2.19% per year, respectively; hazard ratio [HR], 0.75 with intensive treatment). The resulting median follow-up time was 3.26 years.1 This corresponds to a 25% lower relative risk for the primary outcome, with a decrease in event rates from 6.8% to 5.2% over the trial period. All-cause mortality was also lower in the intensive therapy group: 3.4% vs 4.5% (HR, 0.73).
The number needed to treat (NNT) over 3.26 years to prevent a primary outcome event, death from any cause, and death from CV causes was 61, 90, and 172, respectively. Serious adverse events occurred more frequently in the intensive therapy group than in the standard therapy group (38.3% vs 37.1%; HR, 1.04), with a number needed to harm (NNH) of 46 over the study period.1
Rates of serious adverse events that were identified as likely associated with the intervention were 4.7% vs 2.5%, respectively. Hypotension, syncope, electrolyte abnormalities, and acute kidney injury/acute renal failure reached statistical significance. The incidence of bradycardia and injurious falls, although higher in the intensive treatment group, did not reach statistical significance. In the subgroup of patients 75 or older, 48% in each study group experienced a serious adverse event.1
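As a rough check on these figures, the NNT/NNH arithmetic can be reproduced from the rounded event rates reported above. This is only a sketch: the published values (NNT 61, NNH 46) were presumably derived from unrounded data, so the rounded rates land close to, but not exactly on, them.

```python
def nnt(rate_control, rate_treatment):
    """Number needed to treat (or harm) = 1 / absolute risk difference."""
    return 1 / (rate_control - rate_treatment)

# Primary outcome over the trial period: 6.8% vs 5.2% -> ARR of 1.6%
print(nnt(0.068, 0.052))   # ~62.5 (reported NNT: 61)

# Intervention-related serious adverse events: 4.7% vs 2.5% -> risk increase of 2.2%
print(nnt(0.047, 0.025))   # ~45.5 (reported NNH: 46)

# Relative risk reduction implied by the hazard ratio of 0.75
print((1 - 0.75) * 100)    # 25% lower relative risk
```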
Throughout the study, mean SBP was 121.5 mm Hg in the intensive therapy group and 134.6 mm Hg in the standard treatment group. Patients in the intensive therapy group required, on average, one additional BP medication, compared to those in the standard treatment group (2.8 vs 1.8, respectively).1
WHAT’S NEW
Lower SBP produces mortality benefits in patients both younger and older than 75
This trial builds on a body of evidence that shows the advantages of lowering SBP to < 150 mm Hg7,11,12 by demonstrating benefits, including reduced all-cause mortality, for lower SBP targets in nondiabetic patients at high risk for CV disease. The SPRINT trial also showed that the benefits of intensive therapy remained true in a subgroup of patients 75 or older.
The incidence of the primary outcome in the cohort 75 or older receiving intensive therapy was 7.7%, compared with 10.9% for those receiving standard therapy (HR, 0.67; NNT, 31). All-cause mortality was also lower in the intensive therapy group than in the standard therapy group among patients 75 or older: 5.5% vs 8.04% (HR, 0.68; NNT, 38).1
CAVEATS
Many do not benefit from—or are harmed by—increased medication
The absolute risk reduction for the primary outcome is 1.6%, meaning 98.4% of patients receiving more intensive treatment will not benefit. In a group of 1,000 patients, an estimated 16 patients will benefit, 22 patients will be seriously harmed, and 962 patients will experience neither benefit nor harm.14 The difference between how BP was measured in this trial (an average of three readings after the patient had rested for 5 minutes) and what occurs typically in clinical practice could potentially lead to overtreatment in a “real world” setting.
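The per-1,000 framing follows directly from the absolute risk figures cited above; a quick sketch of the arithmetic:

```python
ARR = 0.016  # absolute risk reduction, primary outcome (6.8% - 5.2%)
ARI = 0.022  # absolute risk increase, intervention-related serious adverse events (4.7% - 2.5%)

n = 1000
benefit = round(n * ARR)       # patients who avoid a primary outcome event
harmed = round(n * ARI)        # patients with a treatment-related serious adverse event
neither = n - benefit - harmed

print(benefit, harmed, neither)  # 16 22 962
```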
Also, reducing antihypertensive therapies when the SBP was about 130 to 135 mm Hg in the standard therapy group likely exaggerated the difference in outcomes between the intensive and standard therapy groups; this is neither routine nor recommended in clinical practice.6 Finally, the trial specifically studied nondiabetic patients at high risk for CV disease who were 50 or older, limiting generalizability to other populations.
CHALLENGES TO IMPLEMENTATION
Who will benefit/who can achieve intensive SBP goals?
Identifying patients most likely to benefit from more intensive BP targets remains challenging. The SPRINT trial showed a mortality benefit, but at a cost of increased morbidity.1,14 Caution should be exercised particularly in the subgroup of patients 75 or older. Despite a lower NNT than the rest of the study population, this group experienced serious adverse events more frequently. Also, this particular cohort of volunteers may not be representative of those 75 or older in the general population.
Additionally, achieving intensive SBP goals can be challenging. In the SPRINT trial, only half of the intensive target group achieved an SBP < 120 mm Hg.1 And in a 2011-2012 National Health and Nutrition Examination Survey, only 52% of patients in the general population achieved a BP target < 140/90 mm Hg.15 Lower morbidity and mortality should remain the ultimate goals in the management of hypertension, requiring clinicians to carefully assess an individual patient’s likelihood of benefit versus harm.
REFERENCES
1. Wright JT Jr, Williamson JD, Whelton PK, et al. A randomized trial of intensive versus standard blood-pressure control. N Engl J Med. 2015;373:2103-2116.
2. Chobanian AV, Bakris GL, Black HR, et al. The seventh report of the Joint National Committee on Prevention, Detection, Evaluation, and Treatment of High Blood Pressure: the JNC 7 report. JAMA. 2003;289:2560-2572.
3. Neal B, MacMahon S, Chapman N. Effects of ACE inhibitors, calcium antagonists, and other blood-pressure-lowering drugs: results of prospectively designed overviews of randomised trials. Lancet. 2000;356:1955-1964.
4. Psaty BM, Smith NL, Siscovick DS, et al. Health outcomes associated with antihypertensive therapies used as first-line agents: a systematic review and meta-analysis. JAMA. 1997;277:739-745.
5. Margolis KL, O’Connor PJ, Morgan TM, et al. Outcomes of combined cardiovascular risk factor management strategies in type 2 diabetes: the ACCORD randomized trial. Diabetes Care. 2014;37:1721-1728.
6. James PA, Oparil S, Carter BL, et al. 2014 evidence-based guideline for the management of high blood pressure in adults: report from the panel members appointed to the Eighth Joint National Committee (JNC 8). JAMA. 2014;311:507-520.
7. Beckett NS, Peters R, Fletcher AE, et al. Treatment of hypertension in patients 80 years of age or older. N Engl J Med. 2008;358:1887-1898.
8. Verdecchia P, Staessen JA, Angeli F, et al. Usual versus tight control of systolic blood pressure in non-diabetic patients with hypertension (Cardio-Sis): an open-label randomised trial. Lancet. 2009;374:525-533.
9. JATOS Study Group. Principal results of the Japanese trial to assess optimal systolic blood pressure in elderly hypertensive patients (JATOS). Hypertens Res. 2008;31:2115-2127.
10. Ogihara T, Saruta T, Rakugi H, et al. Target blood pressure for treatment of isolated systolic hypertension in the elderly: valsartan in elderly isolated systolic hypertension study. Hypertension. 2010;56:196-202.
11. Staessen JA, Fagard R, Thijs L, et al; the Systolic Hypertension in Europe (Syst-Eur) Trial Investigators. Randomised double-blind comparison of placebo and active treatment for older patients with isolated systolic hypertension. Lancet. 1997;350:757-764.
12. SHEP Cooperative Research Group. Prevention of stroke by antihypertensive drug treatment in older persons with isolated systolic hypertension: final results of the Systolic Hypertension in the Elderly Program (SHEP). JAMA. 1991;265:3255-3264.
13. Cundiff DK, Gueyffier F, Wright JM. Guidelines for managing high blood pressure. JAMA. 2014;312:294.
14. Ortiz E, James PA. Let’s not SPRINT to judgment about new blood pressure goals. Ann Intern Med. 2016 Feb 23. [Epub ahead of print]
15. Nwankwo T, Yoon SS, Burt V, et al. Hypertension among adults in the United States: National Health and Nutrition Examination Survey, 2011-2012. NCHS Data Brief. 2013;1-8.
ACKNOWLEDGEMENT
The PURLs Surveillance System was supported in part by Grant Number UL1RR024999 from the National Center For Research Resources, a Clinical Translational Science Award to the University of Chicago. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Center For Research Resources or the National Institutes of Health.
Copyright © 2016. The Family Physicians Inquiries Network. All rights reserved.
Reprinted with permission from the Family Physicians Inquiries Network and The Journal of Family Practice. 2016;65(5):342-344.
PRACTICE CHANGER
Consider treating nondiabetic patients ages 50 and older to a systolic blood pressure (SBP) target < 120 mm Hg (as compared to < 140 mm Hg) when the benefits—lower rates of fatal and nonfatal cardiovascular (CV) events and death from any cause—are likely to outweigh the risks from possible additional medication.1
Strength of Recommendation
B: Based on a single, good-quality randomized controlled trial (RCT). 1
A 55-year-old man with hypertension and stage 3 chronic kidney disease (CKD) presents for routine care. His blood pressure is 135/85 mm Hg, and he is currently taking lisinopril 40 mg/d. Should you increase his antihypertensive regimen?
Hypertension is common and leads to significant morbidity and mortality, but pharmacologic treatment reduces incidence of stroke by 35% to 40%, myocardial infarction (MI) by 15% to 25%, and heart failure by up to 64%.2-4 Specific blood pressure targets for defined populations continue to be studied.
The ACCORD (Action to Control Cardiovascular Risk in Diabetes) trial found that more intensive BP targets did not reduce the rate of major CV events in patients with diabetes, but the study may have been underpowered.5 The members of the Eighth Joint National Committee (JNC 8) recommended treating patients older than 60 to BP goals < 150/90 mm Hg.6 This was based on evidence from six RCTs, but there remains debate—even among the JNC 8 committee members—as to appropriate BP goals in patients of any age without CV disease who have BP measurements of 140-159/90-99 mm Hg. 7-13
Continue for the study summary >>
STUDY SUMMARY
Treating to SBP < 120 mm Hg lowers mortality
The Systolic Blood Pressure Intervention Trial (SPRINT) was a multicenter RCT designed to determine if treating to lower SBP targets in nondiabetic patients at high risk for CV events improves outcomes, compared with standard care. Patients were at least 50, had an SBP of 130 to 180 mm Hg, and were at increased CV risk; the last was defined as clinical or subclinical CV disease other than stroke; CKD with a glomerular filtration rate (GFR) of 20 to 60 mL/min/1.73 m2; 10-year risk for CV disease > 15% on Framingham risk score; or age 75 or older. Patients with diabetes, prior stroke, polycystic kidney disease, significant proteinuria or symptomatic heart failure within the past six months, or left ventricular ejection fraction < 35% were excluded.1
Patients (N = 9,361) were randomly assigned to an SBP target < 120 mm Hg in the intensive group or < 140 mm Hg in the standard treatment group, in an open-label design. Allocation was concealed. The study protocol encouraged, but did not require, the use of thiazide-type diuretics, loop diuretics (for those with advanced renal disease), ACE inhibitors or angiotensin receptor blockers, calcium channel blockers, and ß-blockers. Clinicians could add other agents as needed. All major classes of antihypertensives were used.
Medication dosing adjustments were based on the average of three BP measurements taken with an automated measurement system with the patient seated after 5 minutes of quiet rest. Target SBP in the standard therapy group was 135 to 139 mm Hg. Medication dosages were lowered if SBP was < 130 mm Hg at a single visit or < 135 mm Hg at two consecutive visits.1
The primary composite outcome included the first occurrence of MI, acute coronary syndrome, stroke, heart failure, or death from CV causes. Secondary outcomes were the individual components of the primary composite outcome; death from any cause; and the composite of the primary outcome or death from any cause.1
Study halted early. The study was stopped early due to significantly lower rates of the primary outcome in the intensive therapy group versus the standard therapy group (1.65% vs 2.19% per year, respectively; hazard ratio [HR], 0.75 with intensive treatment). The resulting median follow-up time was 3.26 years.1 This corresponds to a 25% lower relative risk for the primary outcome, with a decrease in event rates from 6.8% to 5.2% over the trial period. All-cause mortality was also lower in the intensive therapy group: 3.4% vs 4.5% (HR, 0.73).
The number needed to treat (NNT) over 3.26 years to prevent a primary outcome event, death from any cause, and death from CV causes was 61, 90, and 172, respectively. Serious adverse events occurred more frequently in the intensive therapy group than in the standard therapy group (38.3% vs 37.1%; HR, 1.04), with a number needed to harm (NNH) of 46 over the study period.1
Rates of serious adverse events that were identified as likely associated with the intervention were 4.7% vs 2.5%, respectively. Hypotension, syncope, electrolyte abnormalities, and acute kidney injury/acute renal failure reached statistical significance. The incidence of bradycardia and injurious falls, although higher in the intensive treatment group, did not reach statistical significance. In the subgroup of patients 75 or older, 48% in each study group experienced a serious adverse event.1
Throughout the study, mean SBP was 121.5 mm Hg in the intensive therapy group and 134.6 mm Hg in the standard treatment group. Patients in the intensive therapy group required, on average, one additional BP medication, compared to those in the standard treatment group (2.8 vs 1.8, respectively).1
Continue for what's new >>
WHAT’S NEW
Lower SBP produces mortality benefits in those younger, and older, than 75
This trial builds on a body of evidence that shows the advantages of lowering SBP to < 150 mm Hg7,11,12 by demonstrating benefits, including reduced all-cause mortality, for lower SBP targets in nondiabetic patients at high risk for CV disease. The SPRINT trial also showed that the benefits of intensive therapy remained true in a subgroup of patients 75 or older.
The incidence of the primary outcome in the cohort 75 or older receiving intensive therapy was 7.7%, compared with 10.9% for those receiving standard therapy (HR, 0.67; NNT, 31). All-cause mortality was also lower in the intensive therapy group than in the standard therapy group among patients 75 or older: 5.5% vs 8.04% (HR, 0.68; NNT, 38).1
CAVEATS
Many do not benefit from—or are harmed by—increased medication
The absolute risk reduction for the primary outcome is 1.6%, meaning 98.4% of patients receiving more intensive treatment will not benefit. In a group of 1,000 patients, an estimated 16 patients will benefit, 22 patients will be seriously harmed, and 962 patients will experience neither benefit nor harm.14 The difference between how BP was measured in this trial (an average of three readings after the patient had rested for 5 minutes) and what occurs typically in clinical practice could potentially lead to overtreatment in a “real world” setting.
Also, reducing antihypertensive therapies when the SBP was about 130 to 135 mm Hg in the standard therapy group likely exaggerated the difference in outcomes between the intensive and standard therapy groups; this is neither routine nor recommended in clinical practice.6 Finally, the trial specifically studied nondiabetic patients at high risk for CV disease who were 50 or older, limiting generalizability to other populations.
CHALLENGES TO IMPLEMENTATION
Who will benefit/who can achieve intensive SBP goals?
Identifying patients most likely to benefit from more intensive BP targets remains challenging. The SPRINT trial showed a mortality benefit, but at a cost of increased morbidity.1,14 Caution should be exercised particularly in the subgroup of patients 75 or older. Despite a lower NNT than the rest of the study population, this group experienced serious adverse events more frequently. Also, this particular cohort of volunteers may not be representative of those 75 or older in the general population.
Additionally, achieving intensive SBP goals can be challenging. In the SPRINT trial, only half of the intensive target group achieved an SBP < 120 mm Hg.1 And in a 2011-2012 National Health and Nutrition Examination Survey, only 52% of patients in the general population achieved a BP target < 140/90 mm Hg.15 Lower morbidity and mortality should remain the ultimate goals in the management of hypertension, requiring clinicians to carefully assess an individual patient’s likelihood of benefit versus harm.
REFERENCES
1. Wright JT Jr, Williamson JD, Whelton PK, et al. A randomized trial of intensive versus standard blood-pressure control. N Engl J Med. 2015;373:2103-2116.
2. Chobanian AV, Bakris GL, Black HR, et al. The seventh report of the Joint National Committee on Prevention, Detection, Evaluation, and Treatment of High Blood Pressure: the JNC 7 report. JAMA. 2003;289:2560-2572.
3. Neal B, MacMahon S, Chapman N. Effects of ACE inhibitors, calcium antagonists, and other blood-pressure-lowering drugs: results of prospectively designed overviews of randomised trials.Lancet. 2000;356:1955-1964.
4. Psaty BM, Smith NL, Siscovick DS, et al. Health outcomes associated with antihypertensive therapies used as first-line agents: a systematic review and meta-analysis. JAMA. 1997;277:739-745.
5. Margolis KL, O’Connor PJ, Morgan TM, et al. Outcomes of combined cardiovascular risk factor management strategies in type 2 diabetes: the ACCORD randomized trial. Diabetes Care. 2014;37:1721-1728.
6. James PA, Oparil S, Carter BL, et al. 2014 evidence-based guideline for the management of high blood pressure in adults: report from the panel members appointed to the Eighth Joint National Committee (JNC 8).JAMA. 2014;311:507-520.
7. Beckett NS, Peters R, Fletcher AE, et al. Treatment of hypertension in patients 80 years of age or older.N Engl J Med. 2008;358:1887-1898.
8. Verdecchia P, Staessen JA, Angeli F, et al. Usual versus tight control of systolic blood pressure in non-diabetic patients with hypertension (Cardio-Sis): an open-label randomised trial. Lancet. 2009;374:525-533.
9. JATOS Study Group. Principal results of the Japanese trial to assess optimal systolic blood pressure in elderly hypertensive patients (JATOS). Hypertens Res. 2008;31:2115-2127.
10. Ogihara T, Saruta T, Rakugi H, et al. Target blood pressure for treatment of isolated systolic hypertension in the elderly: valsartan in elderly isolated systolic hypertension study. Hypertension. 2010;56:196-202.
11. Staessen JA, Fagard R, Thijs L, et al; the Systolic Hypertension in Europe (Syst-Eur) Trial Investigators. Randomised double-blind comparison of placebo and active treatment for older patients with isolated systolic hypertension.Lancet. 1997;350:757-764.
12. SHEP Cooperative Research Group. Prevention of stroke by antihypertensive drug treatment in older persons with isolated systolic hypertension: final results of the Systolic Hypertension in the Elderly Program (SHEP). JAMA. 1991;265:3255-3264.
13. Cundiff DK, Gueyffier F, Wright JM. Guidelines for managing high blood pressure. JAMA. 2014;312:294.
14. Ortiz E, James PA. Let’s not SPRINT to judgment about new blood pressure goals. Ann Intern Med. 2016 Feb 23. [Epub ahead of print]
15. Nwankwo T, Yoon SS, Burt V, et al. Hypertension among adults in the United States: National Health and Nutrition Examination Survey, 2011-2012. NCHS Data Brief. 2013;1-8.
ACKNOWLEDGEMENT
The PURLs Surveillance System was supported in part by Grant Number UL1RR024999 from the National Center For Research Resources, a Clinical Translational Science Award to the University of Chicago. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Center For Research Resources or the National Institutes of Health.
Copyright © 2016. The Family Physicians Inquiries Network. All rights reserved.
Reprinted with permission from the Family Physicians Inquiries Network and The Journal of Family Practice. 2016;65(5):342-344.
PRACTICE CHANGER
Consider treating nondiabetic patients ages 50 and older to a systolic blood pressure (SBP) target < 120 mm Hg (as compared to < 140 mm Hg) when the benefits—lower rates of fatal and nonfatal cardiovascular (CV) events and death from any cause—are likely to outweigh the risks from possible additional medication.1
Strength of Recommendation
B: Based on a single, good-quality randomized controlled trial (RCT). 1
A 55-year-old man with hypertension and stage 3 chronic kidney disease (CKD) presents for routine care. His blood pressure is 135/85 mm Hg, and he is currently taking lisinopril 40 mg/d. Should you increase his antihypertensive regimen?
Hypertension is common and leads to significant morbidity and mortality, but pharmacologic treatment reduces incidence of stroke by 35% to 40%, myocardial infarction (MI) by 15% to 25%, and heart failure by up to 64%.2-4 Specific blood pressure targets for defined populations continue to be studied.
The ACCORD (Action to Control Cardiovascular Risk in Diabetes) trial found that more intensive BP targets did not reduce the rate of major CV events in patients with diabetes, but the study may have been underpowered.5 The members of the Eighth Joint National Committee (JNC 8) recommended treating patients older than 60 to BP goals < 150/90 mm Hg.6 This was based on evidence from six RCTs, but there remains debate—even among the JNC 8 committee members—as to appropriate BP goals in patients of any age without CV disease who have BP measurements of 140-159/90-99 mm Hg. 7-13
Continue for the study summary >>
STUDY SUMMARY
Treating to SBP < 120 mm Hg lowers mortality
The Systolic Blood Pressure Intervention Trial (SPRINT) was a multicenter RCT designed to determine if treating to lower SBP targets in nondiabetic patients at high risk for CV events improves outcomes, compared with standard care. Patients were at least 50, had an SBP of 130 to 180 mm Hg, and were at increased CV risk; the last was defined as clinical or subclinical CV disease other than stroke; CKD with a glomerular filtration rate (GFR) of 20 to 60 mL/min/1.73 m2; 10-year risk for CV disease > 15% on Framingham risk score; or age 75 or older. Patients with diabetes, prior stroke, polycystic kidney disease, significant proteinuria or symptomatic heart failure within the past six months, or left ventricular ejection fraction < 35% were excluded.1
Patients (N = 9,361) were randomly assigned to an SBP target < 120 mm Hg in the intensive group or < 140 mm Hg in the standard treatment group, in an open-label design. Allocation was concealed. The study protocol encouraged, but did not require, the use of thiazide-type diuretics, loop diuretics (for those with advanced renal disease), ACE inhibitors or angiotensin receptor blockers, calcium channel blockers, and ß-blockers. Clinicians could add other agents as needed. All major classes of antihypertensives were used.
Medication dosing adjustments were based on the average of three BP measurements taken with an automated measurement system with the patient seated after 5 minutes of quiet rest. Target SBP in the standard therapy group was 135 to 139 mm Hg. Medication dosages were lowered if SBP was < 130 mm Hg at a single visit or < 135 mm Hg at two consecutive visits.1
The primary composite outcome included the first occurrence of MI, acute coronary syndrome, stroke, heart failure, or death from CV causes. Secondary outcomes were the individual components of the primary composite outcome; death from any cause; and the composite of the primary outcome or death from any cause.1
Study halted early. The study was stopped early due to significantly lower rates of the primary outcome in the intensive therapy group versus the standard therapy group (1.65% vs 2.19% per year, respectively; hazard ratio [HR], 0.75 with intensive treatment). The resulting median follow-up time was 3.26 years.1 This corresponds to a 25% lower relative risk for the primary outcome, with a decrease in event rates from 6.8% to 5.2% over the trial period. All-cause mortality was also lower in the intensive therapy group: 3.4% vs 4.5% (HR, 0.73).
The number needed to treat (NNT) over 3.26 years to prevent a primary outcome event, death from any cause, and death from CV causes was 61, 90, and 172, respectively. Serious adverse events occurred more frequently in the intensive therapy group than in the standard therapy group (38.3% vs 37.1%; HR, 1.04), with a number needed to harm (NNH) of 46 over the study period.1
Rates of serious adverse events that were identified as likely associated with the intervention were 4.7% vs 2.5%, respectively. Hypotension, syncope, electrolyte abnormalities, and acute kidney injury/acute renal failure reached statistical significance. The incidence of bradycardia and injurious falls, although higher in the intensive treatment group, did not reach statistical significance. In the subgroup of patients 75 or older, 48% in each study group experienced a serious adverse event.1
Throughout the study, mean SBP was 121.5 mm Hg in the intensive therapy group and 134.6 mm Hg in the standard treatment group. Patients in the intensive therapy group required, on average, one additional BP medication, compared to those in the standard treatment group (2.8 vs 1.8, respectively).1
Continue for what's new >>
WHAT’S NEW
Lower SBP produces mortality benefits in those younger, and older, than 75
This trial builds on a body of evidence that shows the advantages of lowering SBP to < 150 mm Hg7,11,12 by demonstrating benefits, including reduced all-cause mortality, for lower SBP targets in nondiabetic patients at high risk for CV disease. The SPRINT trial also showed that the benefits of intensive therapy remained true in a subgroup of patients 75 or older.
The incidence of the primary outcome in the cohort 75 or older receiving intensive therapy was 7.7%, compared with 10.9% for those receiving standard therapy (HR, 0.67; NNT, 31). All-cause mortality was also lower in the intensive therapy group than in the standard therapy group among patients 75 or older: 5.5% vs 8.04% (HR, 0.68; NNT, 38).1
CAVEATS
Many do not benefit from—or are harmed by—increased medication
The absolute risk reduction for the primary outcome is 1.6%, meaning 98.4% of patients receiving more intensive treatment will not benefit. In a group of 1,000 patients, an estimated 16 will benefit, 22 will be seriously harmed, and 962 will experience neither benefit nor harm.14 The difference between how BP was measured in this trial (an average of three readings after the patient had rested for 5 minutes) and what typically occurs in clinical practice could lead to overtreatment in a “real world” setting.
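The 1,000-patient breakdown follows directly from the absolute risk figures. A rough check, using the rounded trial-level event rates (small discrepancies with the published counts are therefore possible):

```python
n = 1000
benefit = round(n * (0.068 - 0.052))  # absolute risk reduction of 1.6% -> 16 patients benefit
harmed = round(n / 46)                # NNH of 46 -> ~22 patients seriously harmed
neither = n - benefit - harmed        # -> 962 patients with neither benefit nor harm
print(benefit, harmed, neither)
```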
Also, reducing antihypertensive therapies when the SBP was about 130 to 135 mm Hg in the standard therapy group likely exaggerated the difference in outcomes between the intensive and standard therapy groups; this is neither routine nor recommended in clinical practice.6 Finally, the trial specifically studied nondiabetic patients at high risk for CV disease who were 50 or older, limiting generalizability to other populations.
CHALLENGES TO IMPLEMENTATION
Who will benefit/who can achieve intensive SBP goals?
Identifying patients most likely to benefit from more intensive BP targets remains challenging. The SPRINT trial showed a mortality benefit, but at a cost of increased morbidity.1,14 Caution should be exercised particularly in the subgroup of patients 75 or older. Despite a lower NNT than the rest of the study population, this group experienced serious adverse events more frequently. Also, this particular cohort of volunteers may not be representative of those 75 or older in the general population.
Additionally, achieving intensive SBP goals can be challenging. In the SPRINT trial, only half of the intensive target group achieved an SBP < 120 mm Hg.1 And in a 2011-2012 National Health and Nutrition Examination Survey, only 52% of patients in the general population achieved a BP target < 140/90 mm Hg.15 Lower morbidity and mortality should remain the ultimate goals in the management of hypertension, requiring clinicians to carefully assess an individual patient’s likelihood of benefit versus harm.
REFERENCES
1. Wright JT Jr, Williamson JD, Whelton PK, et al. A randomized trial of intensive versus standard blood-pressure control. N Engl J Med. 2015;373:2103-2116.
2. Chobanian AV, Bakris GL, Black HR, et al. The seventh report of the Joint National Committee on Prevention, Detection, Evaluation, and Treatment of High Blood Pressure: the JNC 7 report. JAMA. 2003;289:2560-2572.
3. Neal B, MacMahon S, Chapman N. Effects of ACE inhibitors, calcium antagonists, and other blood-pressure-lowering drugs: results of prospectively designed overviews of randomised trials. Lancet. 2000;356:1955-1964.
4. Psaty BM, Smith NL, Siscovick DS, et al. Health outcomes associated with antihypertensive therapies used as first-line agents: a systematic review and meta-analysis. JAMA. 1997;277:739-745.
5. Margolis KL, O’Connor PJ, Morgan TM, et al. Outcomes of combined cardiovascular risk factor management strategies in type 2 diabetes: the ACCORD randomized trial. Diabetes Care. 2014;37:1721-1728.
6. James PA, Oparil S, Carter BL, et al. 2014 evidence-based guideline for the management of high blood pressure in adults: report from the panel members appointed to the Eighth Joint National Committee (JNC 8). JAMA. 2014;311:507-520.
7. Beckett NS, Peters R, Fletcher AE, et al. Treatment of hypertension in patients 80 years of age or older. N Engl J Med. 2008;358:1887-1898.
8. Verdecchia P, Staessen JA, Angeli F, et al. Usual versus tight control of systolic blood pressure in non-diabetic patients with hypertension (Cardio-Sis): an open-label randomised trial. Lancet. 2009;374:525-533.
9. JATOS Study Group. Principal results of the Japanese trial to assess optimal systolic blood pressure in elderly hypertensive patients (JATOS). Hypertens Res. 2008;31:2115-2127.
10. Ogihara T, Saruta T, Rakugi H, et al. Target blood pressure for treatment of isolated systolic hypertension in the elderly: valsartan in elderly isolated systolic hypertension study. Hypertension. 2010;56:196-202.
11. Staessen JA, Fagard R, Thijs L, et al; the Systolic Hypertension in Europe (Syst-Eur) Trial Investigators. Randomised double-blind comparison of placebo and active treatment for older patients with isolated systolic hypertension. Lancet. 1997;350:757-764.
12. SHEP Cooperative Research Group. Prevention of stroke by antihypertensive drug treatment in older persons with isolated systolic hypertension: final results of the Systolic Hypertension in the Elderly Program (SHEP). JAMA. 1991;265:3255-3264.
13. Cundiff DK, Gueyffier F, Wright JM. Guidelines for managing high blood pressure. JAMA. 2014;312:294.
14. Ortiz E, James PA. Let’s not SPRINT to judgment about new blood pressure goals. Ann Intern Med. 2016 Feb 23. [Epub ahead of print]
15. Nwankwo T, Yoon SS, Burt V, et al. Hypertension among adults in the United States: National Health and Nutrition Examination Survey, 2011-2012. NCHS Data Brief. 2013;1-8.
ACKNOWLEDGEMENT
The PURLs Surveillance System was supported in part by Grant Number UL1RR024999 from the National Center For Research Resources, a Clinical Translational Science Award to the University of Chicago. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Center For Research Resources or the National Institutes of Health.
Copyright © 2016. The Family Physicians Inquiries Network. All rights reserved.
Reprinted with permission from the Family Physicians Inquiries Network and The Journal of Family Practice. 2016;65(5):342-344.