Ticagrelor slashes first stroke risk after MI
ROME – Adding ticagrelor at 60 mg twice daily in patients on low-dose aspirin due to a prior MI reduced their risk of a first stroke by 25% in a secondary analysis of the landmark PEGASUS-TIMI 54 trial, Marc P. Bonaca, MD, reported at the annual congress of the European Society of Cardiology.
PEGASUS-TIMI 54 was a randomized, double-blind, placebo-controlled clinical trial conducted in more than 21,000 stable patients on low-dose aspirin with a history of an acute MI 1-3 years earlier. The significant reduction in secondary cardiovascular events seen in this study during a median 33 months of follow-up (N Engl J Med. 2015 May 7;372[19]:1791-800) led to approval of ticagrelor (Brilinta) at 60 mg twice daily for long-term secondary prevention.
But while PEGASUS-TIMI 54 was a secondary prevention study in terms of cardiovascular events, it was actually a primary prevention study in terms of stroke, since patients with a history of stroke weren’t eligible for enrollment. And in this trial, recipients of ticagrelor at 60 mg twice daily experienced a 25% reduction in the risk of stroke relative to placebo, from 1.94% at 3 years to 1.47%. This benefit was driven by fewer ischemic strokes, with no increase in hemorrhagic strokes seen with ticagrelor. And therein lies a clinical take-home point: “When evaluating the overall benefits and risks of long-term ticagrelor in patients with prior MI, stroke reduction should also be considered,” according to Dr. Bonaca of Brigham and Women’s Hospital, Boston.
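For readers who want to see how those figures fit together, here is a quick back-of-the-envelope check using only the 3-year event rates quoted above (the trial’s reported 25% reduction is based on the hazard ratio, so simple rate arithmetic lands slightly lower); the calculation is illustrative and not taken from the trial report:

```python
# Illustrative arithmetic using the 3-year stroke rates quoted above;
# the derived values are back-of-the-envelope figures, not trial output.
placebo_rate = 0.0194      # 3-year stroke risk on aspirin plus placebo
ticagrelor_rate = 0.0147   # 3-year stroke risk on aspirin plus ticagrelor 60 mg

arr = placebo_rate - ticagrelor_rate   # absolute risk reduction
rrr = arr / placebo_rate               # relative risk reduction
nnt = 1 / arr                          # number needed to treat over 3 years

print(f"ARR = {arr:.2%}, RRR = {rrr:.0%}, NNT ≈ {nnt:.0f}")
# ARR = 0.47%, RRR = 24%, NNT ≈ 213
```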
All strokes were adjudicated and subclassified by a blinded central committee. A total of 213 stroke events occurred during follow-up: 81% ischemic, 7% hemorrhagic, 4% ischemic with hemorrhagic conversion, and 8% unknown; 18% of the strokes were fatal. Another 15% resulted in moderate or severe disability at 30 days. All PEGASUS-TIMI 54 participants were on aspirin and more than 90% were on statin therapy.
The strokes that occurred in patients on ticagrelor were generally less severe than in controls. The risk of having a modified Rankin score of 3-6, which encompasses outcomes ranging from moderate disability to death, was reduced by 43% in stroke patients on ticagrelor relative to those on placebo, the cardiologist continued.
To ensure that the stroke benefit with ticagrelor seen in PEGASUS-TIMI 54 wasn’t a fluke, Dr. Bonaca and his coinvestigators performed a meta-analysis of more intensive versus less intensive antiplatelet therapy in nearly 45,000 participants with coronary disease in four placebo-controlled randomized trials: CHARISMA, DAPT, PEGASUS-TIMI 54, and TRA 2°P-TIMI 50. A total of 532 strokes occurred in this enlarged analysis. More intensive antiplatelet therapy – typically adding a second drug to low-dose aspirin – resulted in a 34% reduction in ischemic stroke, compared with low-dose aspirin and placebo.
Excluding from the meta-analysis the large subgroup of patients in TRA 2°P-TIMI 50 who were on triple-drug antiplatelet therapy, investigators were left with 32,348 participants in the four trials who were randomized to dual-antiplatelet therapy or aspirin monotherapy. In this population, there was no increase in the risk of hemorrhagic stroke associated with more intensive antiplatelet therapy, according to Dr. Bonaca.
Session co-chair Keith A.A. Fox, MD, of the University of Edinburgh, noted that various studies have shown monotherapy with aspirin or another antiplatelet agent reduces stroke risk by about 15%, and now PEGASUS-TIMI 54 shows that ticagrelor plus aspirin decreases stroke risk by 25%. He posed a direct question: “How much is too much?”
“More and more antiplatelet therapy begets more bleeding, so I think that more than two agents may be approaching too much, although it really depends on what agents you’re using and in what dosages,” Dr. Bonaca replied.
He reported serving as a consultant to AstraZeneca, Merck, and Bayer.
Simultaneous with Dr. Bonaca’s presentation at ESC 2016 in Rome, the new report from PEGASUS-TIMI 54, including the four-trial meta-analysis, was published online (Circulation. 2016 Aug 30. doi: 10.1161/CIRCULATIONAHA.116.024637).
AT THE ESC CONGRESS 2016
Key clinical point: Ticagrelor reduced the risk of a first stroke by 25% in patients with a prior MI.
Major finding: Ticagrelor, at the approved dose of 60 mg twice daily for long-term secondary cardiovascular prevention, reduced the risk of a first stroke by 25% in patients with a prior MI.
Data source: This secondary analysis of a randomized, double-blind, placebo-controlled trial included 14,112 stable patients with a prior MI 1-3 years earlier who were randomized to ticagrelor at 60 mg twice daily or placebo and followed prospectively for a median of 33 months.
Disclosures: PEGASUS-TIMI 54 was supported by AstraZeneca. The presenter of the updated analysis reported serving as a consultant to AstraZeneca, Merck, and Bayer.
Trials offer lessons despite negative primary endpoints
Conventional wisdom holds that for randomized, controlled trials, it’s all about the primary endpoint. If it’s negative or neutral, then none of the other results means much beyond “hypothesis generating.”
This strict-constructionist thinking has now been called into question. A recent article in the New England Journal of Medicine declared “an unreasonable yet widespread practice is the labeling of all randomized trials as either positive or negative on the basis of whether the P value for the primary outcome is less than .05. This view is overly simplistic.” (2016 Sept 1;375[9]:861-70).
The article, by the highly experienced and respected trialists Stuart J. Pocock, PhD, and Gregg W. Stone, MD, adds this: “If the primary outcome is negative, positive findings for secondary outcomes are usually considered to be hypothesis generating. Certainly, regulatory approval of a new drug is unlikely to follow. However, in some instances, secondary findings are compelling enough to affect guidelines and practice.”
This unconventional take from a pair of high-level trialists was especially timely given the buzz around the results from two studies reported at the European Society of Cardiology annual congress in late August, DANISH and NORSTENT.
The DANISH trial compared the impact of implantable cardioverter-defibrillators (ICDs) plus optimal care against optimal care without ICDs in 1,116 patients with nonischemic systolic heart failure. The primary outcome, all-cause death during more than 5 years of follow-up, was a relative 13% lower with ICD use, a difference that was not statistically significant; one secondary outcome, cardiovascular death, was cut by a relative 25% with ICD use, which also was not statistically significant.
But for the study’s second prespecified secondary endpoint of sudden cardiac death, treatment with ICDs cut the rate in half, compared with nonischemic heart failure patients who did not receive an ICD, a 4-percentage-point difference that was statistically significant.
And in a prespecified secondary analysis of the primary endpoint that broke down the study group by age, the two-thirds of patients younger than 68 years had a significant reduction in all-cause mortality with ICD use, a benefit not seen in patients aged 68 or older.
Discussion of the results at the meeting mainly focused on what meaning, if any, could be drawn from these strongly positive secondary outcomes in a trial neutral for its primary outcome.
“The ICDs did what they were supposed to, prevent sudden cardiac death,” said the lead investigator of the study, Lars Køber, MD. “As a principle I say don’t believe in a subgroup, but guidelines are often based on subgroup analyses.”
“The primary outcome was neutral, but the reduction in sudden cardiac death, the primary objective of an ICD, was significant, so an ICD should be taken into consideration,” commented Michel Komajda, MD, a discussant for the report.
After I wrote a news article about the DANISH report at ESC, I received an email from a reader who objected to spinning the results this way and insisted that no valid lessons can be drawn from the DANISH results because the study’s primary endpoint failed to reach statistical significance. This purist view misses the important, relevant lessons from the DANISH results. The DANISH trial was not designed to provide pivotal data for regulatory approval of ICDs in these patients. Rather, Dr. Køber and his associates designed DANISH to see whether ICD use in these patients could cut all-cause death over a fairly long follow-up. It was a very high bar, and ICDs failed, but the deck was stacked against an ICD win. Enrolled patients averaged 64 years old at entry into the study, and they all had New York Heart Association class II or III heart failure. “The overall survival curves start to diverge, but then converge after 5 years because of the comorbidities and patients dying for other reasons,” Dr. Køber noted.
“The message is, in younger patients with less morbidity and more life expectancy, sudden cardiac death is a bigger problem, and they had a substantial drop in mortality” with ICD use, commented heart failure specialist Javed Butler, MD. “It’s very consistent with the way we think about providing ICD treatment to patients.”
In other words, the DANISH results showed that all patients with nonischemic systolic heart failure can’t expect to live substantially longer during extended follow-up if they get an ICD, because the cut in sudden cardiac death the devices provide eventually gets washed out by the many other risks for death these patients face. But younger, relatively healthier patients might very well see their reduced rate of sudden cardiac death translate into an overall mortality benefit even when they are followed for at least 5 years. That’s important information to help an individual patient decide whether to have an ICD placed, and an important message from the DANISH trial despite the neutral primary endpoint.
NORSTENT involved a similar scenario in a trial that addressed a totally different issue: Should patients with either stable or unstable coronary artery disease who are undergoing coronary stenting receive a drug-eluting stent (DES) or a bare metal stent (BMS)? The trial randomized 9,013 patients to receive either of the two stent types plus optimal medical therapy. The primary endpoint was the rate of all-cause death or nonfatal MI during 5 years of follow-up, and the results showed no statistically significant difference between the patients who received a DES and those who got a BMS (N Engl J Med. 2016 Aug 30. doi: 10.1056/NEJMoa1607991).
But for the secondary endpoint of repeat revascularizations performed during follow-up, the use of a DES cut the procedure rate by 3.3 percentage points, a 17% relative risk reduction that was statistically significant. The use of a DES also cut the stent thrombosis rate by 0.4 percentage points, a one-third relative drop in these events that was also statistically significant.
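Taken together, the absolute and relative figures quoted above pin down the approximate event rates, which the article does not state; the sketch below simply backs them out (illustrative arithmetic only, not values reported by NORSTENT):

```python
# Back out the implied repeat-revascularization rates from the two
# figures quoted above; purely illustrative, not trial-reported values.
arr = 0.033   # absolute risk reduction with a DES, from the article
rrr = 0.17    # relative risk reduction with a DES, from the article

bms_rate = arr / rrr         # implied rate with bare metal stents
des_rate = bms_rate - arr    # implied rate with drug-eluting stents

print(f"implied BMS rate ≈ {bms_rate:.1%}, implied DES rate ≈ {des_rate:.1%}")
# implied BMS rate ≈ 19.4%, implied DES rate ≈ 16.1%
```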
In short, despite the neutral primary endpoint for the trial, the results showed that drug-eluting stents did what they were designed to do relative to bare metal stents: cut the rate of target lesion restenosis and the need for repeat revascularization. Several interventional cardiologists who heard the results at the meeting said that the findings would not change their practice and that they would continue to use the DES as their default device for percutaneous coronary interventions. Although “the long-term benefit of contemporary DES over BMS was less than expected,” as NORSTENT lead investigator Kaare H. Bønaa, MD, put it, the secondary benefit of significantly reduced repeat revascularization and the now very modest price difference between drug-eluting and bare metal stents mean that many interventionalists will continue to use a DES for most patients.
The message from Dr. Pocock and Dr. Stone, underscored by the DANISH and NORSTENT results, is that large and well-run randomized trials can yield important evidence to inform practice that transcends a simple black or white statistical assessment of the primary endpoint.
On Twitter @mitchelzoler
CMS offers lower-stress reporting options for MACRA in 2017
Physicians will have options for when they can start meeting the requirements for the Merit-based Incentive Payment System (MIPS) track under regulations that implement the Medicare Access and CHIP Reauthorization Act.
The options are designed to allow physicians a variety of ways to get started with the new Quality Payment Program – the term CMS has given the MACRA-legislated reforms – and provide more limited ways to participate in 2017.
Option 1: Test the quality payment program in 2017 by submitting data without facing any negative payment adjustments. This will give physicians the year to make sure their processes are in place and ready for broader participation in 2018 and beyond.
Option 2: Delay the start of the performance period and participate for just part of 2017. Depending on how long a physician delays reporting quality information back to CMS, they could still qualify for a smaller bonus payment.
Option 3: Participate for the entire calendar year as called for by the law and be eligible for the full participation bonuses.
Option 4: For those who qualify, participate in an Advanced Alternative Payment Model (APM) beginning next year.
The final regulations for implementing MACRA will be released on Nov. 1, CMS Acting Administrator Andy Slavitt wrote in a blog post published Sept. 8.
“However you choose to participate in 2017, we will have resources available to assist you and walk you through what needs to be done,” Mr. Slavitt wrote.
ACOs score slight bump in bonus payments
Accountable care organizations participating in the Medicare Shared Savings Program generated $466 million in savings in 2015, up from $411 million in 2014, the Centers for Medicare & Medicaid Services announced.
Despite the growth in savings, there was little growth in the number of ACOs that qualified for bonus payments based on the savings they were able to generate.
Of the 392 Medicare Shared Savings Program participants and 12 Pioneer ACO Model participants, 31% (125) received bonus payments in 2015, compared with 27% (97 organizations from a pool of 20 Pioneer ACOs and 333 Shared Savings Program participants) in 2014, according to a CMS report.
The agency noted that another 83 ACOs in the Shared Savings Program and two Pioneer ACOs generated savings in 2015 but did not qualify for bonus payments. Of the four Pioneer ACOs that recorded losses, only one incurred losses great enough to require payment to CMS.
On the quality side, the mean quality score among Pioneer ACOs increased to 92% in 2015, the fourth year of the program, up from 87% in 2014. Quality scores have risen each year, with a growth of 21% from the first year.
Participants in the Shared Savings Program that reported quality measures in both 2014 and 2015 improved on 84% of the quality measures that were reported in both years. In four measures – screening risk for future falls, depression screening and follow-up, blood pressure screening and follow-up, and administering pneumonia vaccine – the average quality performance improvement was more than 15% year-over-year.
The National Association of ACOs said it was “disappointed” in the small bump in financial bonuses.
“The results are not as strong as we, and many of our ACO members, had hoped for,” NAACOS President and CEO Clif Gaus, ScD, said in a statement. “But overall, we are pleased to see the results show a positive trend for the program,” he added, noting that despite being only a few years old, the participating ACOs “have accomplished a lot to reduce cost and improve quality.”
High free T4 levels linked to sudden cardiac death
Higher levels of free thyroxine are associated with an increased risk of sudden cardiac death, even in euthyroid adults, according to a report published online Sept. 6 in Circulation.
Thyroid dysfunction, even in the subclinical range, is known to correlate with increased cardiovascular disease, but a possible link between free thyroxine levels and sudden cardiac death (SCD) had not previously been explored in the general population. Any factor that could improve prediction of SCD in the general population would be helpful, because almost half of these cases are the first indication that the patient had heart disease, said Layal Chaker, MD, of the Rotterdam Thyroid Center and the departments of internal medicine and epidemiology, Erasmus University, Rotterdam, and her associates.
They assessed SCD among 10,318 participants in the Rotterdam Study, a prospective population-based cohort study examining endocrine, cardiovascular, neurologic, ophthalmologic, and psychiatric diseases in middle-aged and older adults in the Netherlands. Men and women aged 45-106 years who had thyroid testing at baseline were followed for a median of 9.2 years (range, 4-21 years) for the development of SCD. There were 261 cases of SCD, and 231 of these occurred in euthyroid participants.
Higher levels of free thyroxine (T4) were associated with an increased risk of SCD, with a hazard ratio of 1.87 for every 1 ng/dL increase in free T4. When the analysis was confined to the 231 euthyroid participants, this association was even stronger, with an HR of 2.26, the investigators said (Circulation 2016 Sept 6. doi: 10.1161/CirculationAHA.115.020789).
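Because the hazard ratio is reported per 1 ng/dL of free T4, it scales multiplicatively with the size of the increase. A small sketch, using the whole-cohort estimate quoted above purely for illustration, makes the point:

```python
# A per-unit hazard ratio from a Cox model scales multiplicatively:
# the HR for a k-unit increase is hr_per_unit ** k. Illustration only,
# using the whole-cohort estimate quoted above.
hr_per_unit = 1.87   # hazard ratio per 1 ng/dL increase in free T4

for delta in (0.5, 1.0, 2.0):   # increase in free T4, ng/dL
    print(f"+{delta} ng/dL free T4 -> HR ≈ {hr_per_unit ** delta:.2f}")
# +0.5 ng/dL -> HR ≈ 1.37; +1.0 ng/dL -> HR ≈ 1.87; +2.0 ng/dL -> HR ≈ 3.50
```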
The findings were similar in several sensitivity analyses, including one that excluded participants who had an unwitnessed SCD. In addition, adjustment of the data to account for the presence or absence of diabetes, as well as exclusion of patients who had heart failure, did not alter the risk estimates significantly. The results also were consistent across all age groups and both sexes, Dr. Chaker and her associates said.
The exact mechanism for the association between free thyroxine and SCD is not yet known but appears to be independent of traditional cardiovascular risk factors. “Bigger sample size and more detailed data are needed to determine whether these associations share the same or have distinct pathways,” they added.
The Netherlands Organisation for Health Research and Development and Erasmus Medical Center supported the study. Dr. Chaker and her associates reported having no relevant financial disclosures.
FROM CIRCULATION
Key clinical point: High levels of free thyroxine are associated with an increased risk of sudden cardiac death, even in euthyroid adults.
Major finding: Higher levels of free thyroxine (T4) were associated with an increased risk of SCD, with a hazard ratio of 1.87 for every 1 ng/dL increase in free T4.
Data source: A prospective population-based cohort study involving 10,318 older adults in the Netherlands followed for a median of 9 years.
Disclosures: The Netherlands Organisation for Health Research and Development and Erasmus Medical Center supported the study. Dr. Chaker and her associates reported having no relevant financial disclosures.
The new NOACs are generally the best bet
NOACs have largely replaced the need for vitamin K antagonists
The discovery of oral anticoagulants began in 1924, when Schofield linked the death of grazing cattle from internal hemorrhage to the consumption of spoiled sweet clover hay.1 It was not until 1941, however, that Campbell and Link, while trying to explain this observation, identified dicoumarol, the anticoagulant formed during the spoiling process.2 Ultimately, after vitamin K was noted to reverse the effect of dicoumarol, synthesis of the first class of oral anticoagulants, the vitamin K antagonists (VKAs), began. Despite the numerous challenges associated with managing patients on this class of anticoagulants, VKAs have been the mainstay of oral anticoagulation therapy for the past 70 years. Over the past 5 years, however, new oral anticoagulants (NOACs) have emerged and are changing clinical practice. Mechanistically, these medications are targeted therapies that work as either direct thrombin inhibitors (dabigatran etexilate) or direct factor Xa inhibitors (rivaroxaban, apixaban, and edoxaban). Given their favorable pharmacologic design, NOACs have the potential to replace VKAs, as they not only have an encouraging safety profile but also are therapeutically equivalent or even superior to VKAs in certain patient populations.
Pharmacologic design
The targeted drug design of NOACs provides many pharmacologic advantages. Compared with VKAs, NOACs have a notably more predictable pharmacologic profile and a relatively wide therapeutic window, which allows for fixed dosing, a rapid onset and offset, and fewer drug interactions.3 These characteristics eliminate the need for the routine dose monitoring and serial dose adjustments frequently associated with VKAs. Additionally, NOACs less commonly require bridging therapy with parenteral unfractionated heparin or low molecular weight heparins (LMWH) while awaiting therapeutic drug levels, as these levels are reached sooner and more predictably than with VKAs.4 As with any medication, however, appropriate consideration should be given to specific patient populations, such as those who are older or have significant comorbidities that may influence drug effect and clearance.
Lastly, the pharmacologic advantages of NOACs are beneficial not only from a patient perspective, but also from a health care systems standpoint, as their use may provide an opportunity to deliver more cost-effective care. Specifically, economic models using available clinical trial data for stroke prevention in nonvalvular atrial fibrillation have shown that NOACs (apixaban, dabigatran, and rivaroxaban) are cost-effective alternatives when compared with warfarin.5 Although the results from such economic analyses are limited by the modeling assumptions they rely upon, these findings suggest that, at least initially, cost should not be used as a prohibitive reason for adopting these new therapeutics.
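Cost-effectiveness claims of this kind come down to an incremental cost-effectiveness ratio (ICER): the extra cost per quality-adjusted life-year (QALY) gained. A minimal sketch of that comparison, with entirely hypothetical inputs (none of these numbers come from the cited models), looks like this:

```python
# Minimal incremental cost-effectiveness ratio (ICER) sketch.
# All inputs are hypothetical placeholders chosen for illustration;
# they are not values from the economic analyses cited above.
noac_cost, noac_qalys = 55_000.0, 9.2          # modeled lifetime cost ($) and QALYs on a NOAC
warfarin_cost, warfarin_qalys = 50_000.0, 9.0  # modeled lifetime cost ($) and QALYs on warfarin
threshold = 50_000.0                           # a commonly cited willingness-to-pay ($/QALY)

icer = (noac_cost - warfarin_cost) / (noac_qalys - warfarin_qalys)
print(f"ICER ≈ ${icer:,.0f} per QALY gained")
print("cost-effective at this threshold:", icer <= threshold)
# ICER ≈ $25,000 per QALY gained; cost-effective at this threshold: True
```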
Patient selection
The decision to institute oral anticoagulation therapy depends on each patient’s individualized balance of bleeding risk against the benefit of ischemia prevention. A major determinant of this balance is the clinical indication for which anticoagulation is begun. Numerous phase III clinical trials have compared NOACs with VKAs or placebo for the management of nonvalvular atrial fibrillation (AF) and venous thromboembolism (VTE), and as adjunctive therapy for patients with acute coronary syndrome.6 Meta-analyses of randomized trials have shown the most significant benefit to be in patients with nonvalvular atrial fibrillation, in whom NOACs significantly reduced stroke, intracranial hemorrhage, and all-cause mortality, compared with warfarin, while showing variable effects on gastrointestinal bleeding.6,7
In patients with VTE, NOACs have similar efficacy to VKAs with regard to the prevention of VTE or VTE-related death, and have a better safety profile.6 Lastly, when studied as an adjunct to dual antiplatelet therapy in patients with acute coronary syndrome, NOACs were associated with an increased bleeding risk without a significant decrease in thrombosis risk.6 Taken together, these data suggest that the primary indication for anticoagulation should weigh strongly in deciding which class of anticoagulant to use.
Overcoming challenges
Since the introduction of NOACs, there has been concern over the lack of specific antidotes, especially for patients with impaired clearance, a high likelihood of needing an urgent or emergent procedure, or life-threatening bleeding complications. Most recently, however, an interim analysis of clinical trial data has shown complete reversal of the direct thrombin inhibitor dabigatran with the humanized monoclonal antibody idarucizumab within minutes of administration in greater than 88% of patients studied.8 Similarly, agents such as PER977 are in phase II clinical trials, having been shown to form noncovalent hydrogen bonds and charge-charge interactions with oral factor Xa inhibitors as well as oral thrombin inhibitors, leading to their reversal.9 Given these promising findings, it likely will not be long before reversal agents for NOACs become clinically available. Until then, it is encouraging that the bleeding profile of these drugs has been found to be favorable, compared with VKAs, and that their short half-life allows a relatively expeditious natural reversal of the anticoagulant effect as the drug is eliminated.
Conclusions
Unlike the serendipitous path leading to the discovery of the first class of oral anticoagulants (VKAs), NOACs have been specifically designed to provide targeted anticoagulation and to address the shortcomings of VKAs. To this end, NOACs are becoming increasingly important in the management of patients with specific clinical conditions such as nonvalvular atrial fibrillation and venous thromboembolism where they have been shown to provide a larger net clinical benefit relative to the available alternatives. Furthermore, with economic analyses providing evidence that NOACs are cost-effective for the health care system and clinical trial results suggesting progress in the development of antidotes for reversal, it is likely that with growing experience, these agents will replace VKAs as the mainstay for prophylactic and therapeutic oral anticoagulation in targeted patient populations.
Madhukar S. Patel, MD, and Elliot L. Chaikof, MD, are from the department of surgery, Beth Israel Deaconess Medical Center, Boston. They reported having no conflicts of interest.
References
1. J Am Vet Med Assoc. 1924;64:553-575.
3. Hematology Am Soc Hematol Educ Program. 2013;2013:464-470.
4. Eur Heart J. 2013;34:2094-2106.
6. Nat Rev Cardiol. 2014;11:693-703.
8. N Engl J Med. 2015;373:511-520.
9. N Engl J Med. 2014;371:2141-2142.
What the doctor didn’t order: unintended consequences and pitfalls of NOACs
Recently, several new oral anticoagulants (NOACs) have gained FDA approval to replace warfarin, capturing the attention of the popular media. These include dabigatran, rivaroxaban, apixaban, and edoxaban. Dabigatran targets activated factor II (factor IIa), while rivaroxaban, apixaban, and edoxaban target activated factor X (factor Xa). Easy to take as a once- or twice-daily pill, with no cumbersome monitoring, they represent a seemingly ideal treatment for the chronically anticoagulated patient. All of these agents are currently FDA approved in the United States for treatment of acute VTE and AF.
Dabigatran and edoxaban
Similar to warfarin, dabigatran and edoxaban require an LMWH or UFH “bridge” when therapy is initiated, while rivaroxaban and apixaban are started as monotherapy without such a bridge. Dabigatran etexilate (Pradaxa®, Boehringer Ingelheim) has the longest half-life of all of the NOACs at 12-17 hours, and this half-life is prolonged with increasing age and decreasing renal function.1 It is the only new agent that can be at least partially reversed with dialysis.2 Edoxaban (Savaysa®, Daiichi Sankyo) carries a boxed warning stating that this agent is less effective in AF patients with a creatinine clearance greater than 95 mL/min, and that kidney function should be assessed prior to starting treatment: such patients have a greater risk of stroke, compared with similar patients treated with warfarin. Edoxaban is the only agent specifically tested at a lower dose in patients at significantly increased risk of bleeding complications (low body weight and/or decreased creatinine clearance).3
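Because much of the later discussion turns on these agents’ short half-lives, the following minimal sketch (in Python) shows how the 12-17 hour half-life quoted above translates into the fraction of a dabigatran dose remaining after dosing stops, assuming simple first-order elimination; it is illustrative only, not a pharmacokinetic model or dosing guidance.

# Fraction of dabigatran remaining t hours after the last dose, assuming
# first-order (exponential) elimination. Renal impairment prolongs the
# half-life, shifting these numbers upward.
def fraction_remaining(hours_since_dose, half_life_hours):
    return 0.5 ** (hours_since_dose / half_life_hours)

for half_life in (12, 17):
    for hours in (12, 24, 48):
        frac = fraction_remaining(hours, half_life)
        print(f"half-life {half_life} h, {hours} h after last dose: {frac:.0%} remains")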
Rivaroxaban and apixaban
Rivaroxaban (Xarelto®, Bayer and Janssen) and apixaban (Eliquis®, Bristol-Myers Squibb), uniquely among the NOACs, have been tested for extended therapy of acute deep vein thrombosis after 6-12 months of treatment. They were found to significantly decrease recurrent VTE without increasing major bleeding, compared with placebo.4,5 Rivaroxaban is dosed once daily and apixaban twice daily; both are given as immediate monotherapy, making them quite convenient for patients. Apixaban is the only NOAC associated with a slight decrease in gastrointestinal bleeding, compared with warfarin.6
Consequences and pitfalls with NOACs
Problems with these new drugs, which may temper enthusiasm for them totally replacing warfarin, include the inability to reliably measure their levels or reverse their anticoagulant effects, the lack of data on bridging when other procedures need to be performed, their short half-lives, and the lack of data on their anti-inflammatory effects. With regard to monitoring of anticoagulation, the International Society on Thrombosis and Haemostasis (ISTH) has published the situations in which it might be useful to obtain drug levels. These include:
• When a patient is bleeding.
• Before surgery or an invasive procedure when the patient has taken the drug in the previous 24 hours, or longer if creatinine clearance (CrCl) is less than 50 mL/min (see the sketch after this list).
• Identification of subtherapeutic or supratherapeutic levels in patients taking other drugs that are known to affect pharmacokinetics.
• Identification of subtherapeutic or supratherapeutic levels in patients at body weight extremes.
• Patients with deteriorating renal function.
• During perioperative management.
• During reversal of anticoagulation.
• When there is suspicion of overdose.
• Assessment of compliance in patients suffering thrombotic events while on treatment.7
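The pre-procedure item above reduces to a simple timing check, sketched below in Python. The 48-hour window used when creatinine clearance is below 50 mL/min is an illustrative assumption rather than a value taken from the ISTH guidance, and the sketch is not a substitute for clinical judgment.

# Should a drug level be considered before surgery or an invasive procedure?
# Rule from the list above: yes if the last dose was within 24 hours, or within
# a longer window when CrCl < 50 mL/min (48 hours assumed here for illustration).
def level_check_advisable(hours_since_last_dose, crcl_ml_min, extended_window_hours=48):
    window = 24 if crcl_ml_min >= 50 else extended_window_hours
    return hours_since_last_dose <= window

print(level_check_advisable(hours_since_last_dose=20, crcl_ml_min=80))  # True
print(level_check_advisable(hours_since_last_dose=30, crcl_ml_min=80))  # False
print(level_check_advisable(hours_since_last_dose=30, crcl_ml_min=40))  # True (reduced clearance)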
Currently, no commercially available reversal agent exists for any of the NOACs, and existing reversal agents for traditional anticoagulants are of limited, if any, use. Drugs under development include agents for the factor Xa inhibitors and for the thrombin inhibitor. Until specific reversal agents exist, supportive care is the mainstay of therapy. In cases of trauma or severe or life-threatening bleeding, administration of concentrated clotting factors (prothrombin complex concentrate) or dialysis (dabigatran only) may be utilized. However, data from large clinical trials are lacking. A recent study of 90 patients receiving an antibody directed against dabigatran found that the anticoagulant effects of dabigatran were reversed safely within minutes of administration; however, drug levels were not consistently suppressed at 24 hours in 20% of the cohort.8
There are currently no national guidelines or large-scale studies to guide bridging of NOACs for procedures.
The relatively short half-life of these agents makes it likely that traditional bridging, as practiced with warfarin, is not necessary.9 However, this represents a double-edged sword: withholding anticoagulation for two doses (for instance, if a patient becomes ill or a clinician is overly cautious around the time of a procedure) may leave the patient unprotected.
The final question with the new agents concerns their anti-inflammatory effects. We know that heparin and LMWH have significant pleiotropic effects that are not necessarily related to their anticoagulant effects. These effects are important in decreasing the inflammatory nature of the thrombus and its effect on the vein wall. We do not know whether the new oral agents have similar effects, as this has never been fully tested. Given that two of the agents are used as monotherapy without any heparin/LMWH bridge, the anti-inflammatory properties of these new agents should be defined to make sure that such a bridge is not necessary.
So, in summary, although these agents have much to offer, many questions remain to be addressed before they totally replace traditional approaches to anticoagulation in the realm of VTE. It must not be overlooked that, despite all the benefits, each carries a risk of bleeding, as they all target portions of the coagulation mechanism. We caution that, as with any “gift horse,” physicians should examine the data closely and proceed with caution.
Thomas Wakefield, MD, is the Stanley Professor of Vascular Surgery; head, section of vascular surgery; and director, Samuel and Jean Frankel Cardiovascular Center. Andrea Obi, MD, is a vascular surgery fellow, and Dawn Coleman, MD, is the program director, section of vascular surgery, all at the University of Michigan, Ann Arbor. They reported having no conflicts of interest.
References
1. N Engl J Med. 2009;361:2342-2352.
2. J Vasc Surg Venous Lymphat Disord. 2013;1:418-426.
3. N Engl J Med. 2013;369:1406-1415.
4. N Engl J Med. 2010;363:2499-2510.
5. N Engl J Med. 2013;368:699-708.
6. Arterioscler Thromb Vasc Biol. 2015;35:1056-1065.
7. J Thromb Haemost. 2013;11:756-760.
Clot retrieval devices approved for initial ischemic stroke treatment
Two Trevo clot retrieval devices can now be marketed as an initial therapy to reduce paralysis from strokes that are caused by blood clots, according to a press release from the Food and Drug Administration.
Previously, the only first-line treatment approved for acute ischemic stroke was tissue plasminogen activator (TPA) delivered intravenously. The FDA approved Trevo devices based on a clinical trial in which 29% of patients treated with the Trevo device combined with TPA and medical management of blood pressure and disability symptoms were shown to be functionally independent 3 months after their stroke, compared with only 19% of patients treated with TPA plus medical management alone.
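For context, the absolute difference behind those percentages works out to roughly one additional functionally independent patient for every 10 treated; a quick check of that arithmetic in Python:

# Functional independence at 3 months, as reported in the trial cited above.
trevo_plus_tpa = 0.29   # Trevo device + TPA + medical management
tpa_alone = 0.19        # TPA + medical management alone

absolute_difference = trevo_plus_tpa - tpa_alone   # absolute risk difference
nnt = 1 / absolute_difference                      # number needed to treat

print(f"Absolute difference: {absolute_difference:.0%}")   # 10%
print(f"Number needed to treat: {nnt:.0f}")                # ~10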
The Trevo devices are approved for use within 6 hours of symptom onset and only following treatment with TPA, which should be administered within 3 hours of stroke onset. Risks associated with use of the Trevo device include failure to retrieve the blood clot, device malfunctions including breakage and navigation difficulties, potential damage to blood vessels, and the chance of perforation or hemorrhage.
The Trevo device was first approved by the FDA in 2012 to remove blood clots in order to restore blood flow in stroke patients who could not receive TPA or for those patients who did not respond to TPA therapy. The current approval expands the devices’ indication to a broader group of patients, according to the release.
“This is the first time FDA has allowed the use of these devices alongside TPA, which has the potential to help further reduce the devastating disabilities associated with strokes compared to the use of TPA alone. Now health care providers and their patients have another tool for treating stroke and potentially preventing long-term disability,” Carlos Peña, PhD, director of the division of neurological and physical medicine devices at the FDA’s Center for Devices and Radiological Health, said in the press release.
Find the full press release on the FDA website.
Commentary: INR instability in the NOAC era
Progress in the development of new oral anticoagulants (NOACs), as well as agents for their reversal, has lowered the threshold for using these therapeutics as first-line agents for the management of nonvalvular atrial fibrillation and venous thromboembolism.1,2 Despite this increase in adoption, however, debate persists as to whether patients chronically maintained on vitamin K antagonists (VKAs), such as warfarin, should be switched to NOACs. The recently published research letter by Pokorney et al. assessed the stability of international normalized ratios (INRs) in patients on long-term warfarin therapy in order to address this question.3
Specifically, prospective registry data from 3,749 patients with at least three INR values in the first 6 months of therapy and six or more in the following year were included. Patients were deemed stable if 80% or more of their INRs were in the therapeutic range (INR 2-3).3 During the initiation period, only one in four patients taking warfarin had a stable INR.3 Furthermore, stability in the first 6 months had limited ability to predict stability in the subsequent year (concordance index, 0.61). With regard to time in therapeutic range (TTR), only 32% of patients had a TTR greater than 80% during the first 6 months, and less than half (42%) of these patients maintained this in the following year.
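To make the two metrics concrete, the sketch below (in Python) computes the letter’s “stability” criterion (at least 80% of INR values between 2 and 3) and a time-in-therapeutic-range estimate using Rosendaal-style linear interpolation between measurements. The interpolation approach is the standard way TTR is calculated, but whether Pokorney et al. computed it in exactly this way is an assumption, and the sample values are invented for illustration.

def is_stable(inrs, low=2.0, high=3.0, threshold=0.80):
    # "Stable" if at least `threshold` of measured INR values fall within [low, high].
    in_range = sum(low <= v <= high for v in inrs)
    return in_range / len(inrs) >= threshold

def ttr_interpolated(days, inrs, low=2.0, high=3.0):
    # Rosendaal-style TTR: interpolate INR linearly between measurement days and
    # count the fraction of days spent within [low, high].
    in_range_days = total_days = 0.0
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        total_days += span
        for step in range(int(span)):
            inr = i0 + (i1 - i0) * (step + 0.5) / span   # midpoint of each day
            if low <= inr <= high:
                in_range_days += 1
    return in_range_days / total_days

# Invented example: INR checks every 2 weeks over 8 weeks.
days = [0, 14, 28, 42, 56]
inrs = [1.8, 2.4, 3.4, 2.6, 2.2]
print(is_stable(inrs))                         # False: only 3 of 5 values (60%) are in range
print(round(ttr_interpolated(days, inrs), 2))  # roughly 0.68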
Findings from Pokorney et al. add to the growing body of literature demonstrating the difficulty of achieving and maintaining a therapeutic INR while on warfarin therapy.4-7 Clinically, these findings are important, as deviations from TTR have been shown to be associated with increased risk of bleeding and thrombosis as well as increased health care costs.8-10 Mechanistically, patient factors such as differences in vitamin K consumption, comorbid conditions, drug-drug interactions, and medication compliance, as well as genetic differences that impact drug metabolism undoubtedly contribute to the variation of INR noted in patients on warfarin therapy.
Attempts to improve stability have included the administration of low-dose oral vitamin K. However, recent data from a multicenter randomized controlled trial suggest that while such therapy may help decrease extreme variations in INR, it does not increase TTR.11 Furthermore, while significant work has gone into identifying specific gene variants, such as CYP2C9 and VKORC1, which encode a cytochrome P450 enzyme and vitamin K epoxide reductase, respectively, economic analyses suggest that testing for these variants would not be cost-effective.12 Additionally, clinical prediction tools that incorporate important patient factors to help guide anticoagulation explain less than 10% of TTR variability.4
Nonetheless, some caution is warranted in interpreting the results reported by Pokorney and his colleagues. The time in therapeutic range among registry patients treated with warfarin was much lower than that previously reported in the pivotal U.S. trials of NOACs (55%-68%) and significantly lower than in a recent nationwide Swedish registry involving 40,449 patients.13
In the Swedish registry, the mean individual TTR was 70%, with more than half of patients having a TTR of 70% or more, emphasizing the importance of health care system effects. Moreover, regardless of whether a patient is on warfarin or a NOAC, patients with a lower TTR have higher rates of diabetes, chronic obstructive pulmonary disease, heart failure, and renal failure, which may contribute to the need for additional therapies that may influence TTR.
For example, INR may be increased by ciprofloxacin or omeprazole when taken with warfarin, and CYP3A4 and P-glycoprotein (P-gp) inducers and inhibitors can result in an increased or decreased anticoagulation effect when used with NOACs. Recent reports have also highlighted variability in the safety of NOACs, particularly among patients with renal or liver insufficiency, African Americans, or patients with a prior history of GI bleeding.14-16 For these subgroups, determining NOAC activity to improve clinical safety of these agents is difficult.
PT or INR testing is largely insensitive or otherwise highly variable for NOACs, and the timing of the blood draw relative to the most recent dose significantly influences the measured level of anti-Xa activity. Socioeconomic factors and family support systems also influence TTR, as they are important determinants of access to needed drugs and of the ability to sustain related costs over time.
Taken together, prior INR stability on warfarin therapy does not ensure continued stability and, as a consequence, long-term warfarin therapy requires close monitoring in order to remain effective. To this end, further development of point-of-care coagulometers for self-testing and self-management, which have been found to be acceptable and preferred by patients, should be pursued.17 Similarly, research on optimizing computer-assisted dosing programs to decrease INR variability remains warranted.18 NOACs offer an advantage over warfarin in that they have a more predictable pharmacokinetic profile, which obviates the need for routine monitoring of anticoagulation parameters. However, many of the same factors that influence TTR for warfarin also affect the performance of NOACs; NOACs carry an increased bleeding risk relative to warfarin in a number of demographic groups; and the high cost of NOACs may influence patient compliance.
Accordingly, until further data are available, the decision to convert a patient on warfarin with a low TTR to a NOAC should be individualized.
Madhukar S. Patel, MD, is a general surgeon at the Department of Surgery, Massachusetts General Hospital, Boston, and Elliot L. Chaikof, MD, is Surgeon-in-Chief, Beth Israel Deaconess Medical Center, and Chairman, Roberta and Stephen R. Weiner Department of Surgery, Johnson and Johnson Professor of Surgery, Harvard Medical School. Dr. Chaikof is also an associate editor for Vascular Specialist. They have no relevant conflicts.
References
2. Nat Rev Cardiol. 2014;11:693-703.
5. J Thromb Haemost. 2010;8:2182-91.
6. Thromb Haemost. 2009;101:552-6.
7. Am J Cardiovasc Drugs. 2015;15:205-11.
8. Circ Cardiovasc Qual Outcomes. 2008;1:84-91.
10. J Med Econ. 2015;18:333-40.
11. Thromb Haemost. 2016;116:480-5.
12. Ann Intern Med. 2009;150:73-83.
13. JAMA Cardiol. 2016;1:172-80.
14. N Engl J Med. 2013;369:2093-104.
15. JAMA Intern Med. 2015;175:18-24.
16. J Am Coll Cardiol. 2014;63:891-900.
Progress in the development of new oral anticoagulants (NOACs), as well as agents for their reversal, has lowered the threshold to use these therapeutics as first line agents for the management of nonvalvular atrial fibrillation and venous thromboembolism.1,2 Despite this increase in adoption, however, debate persists as to whether patients chronically maintained on vitamin K antagonists (VKAs), such as warfarin, should be switched to NOACs. The recently published research letter by Pokorney et al. assessed the stability of international normalized ratios (INRs) in patients on long-term warfarin therapy in order to address this question.3
Specifically, prospective registry data from 3,749 patients with at least three INR values in the first 6 months of therapy as well as six or more in the following year were included. Patients were deemed stable if 80% or more of their INRs were in a therapeutic range defined as an INR between 2 and 3.3 During the initiation period, only one in four patients taking warfarin had a stable INR.3 Furthermore, stability in the first 6 months was found to have limited ability to predict stability in the subsequent year (concordance index of 0.61). With regard to time in therapeutic range (TTR), only 32% of patients had a TTR of greater than 80% during the first 6 months with less than half (42%) of these patients able to maintain this in the following year.
Progress in the development of new oral anticoagulants (NOACs), as well as agents for their reversal, has lowered the threshold for using these therapeutics as first-line agents for the management of nonvalvular atrial fibrillation and venous thromboembolism.1,2 Despite this increase in adoption, however, debate persists as to whether patients chronically maintained on vitamin K antagonists (VKAs), such as warfarin, should be switched to NOACs. The recently published research letter by Pokorney et al. assessed the stability of international normalized ratios (INRs) in patients on long-term warfarin therapy in order to address this question.3
Specifically, prospective registry data from 3,749 patients with at least three INR values in the first 6 months of therapy as well as six or more in the following year were included. Patients were deemed stable if 80% or more of their INRs fell within the therapeutic range (INR, 2-3).3 During the initiation period, only one in four patients taking warfarin had a stable INR.3 Furthermore, stability in the first 6 months was found to have limited ability to predict stability in the subsequent year (concordance index of 0.61). With regard to time in therapeutic range (TTR), only 32% of patients had a TTR of greater than 80% during the first 6 months, and less than half (42%) of these patients were able to maintain this in the following year.
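For readers who want to see how these two metrics are derived, the sketch below computes the proportion of individual INR values within the 2-3 range (the stability criterion described above) and a TTR estimate by linear interpolation between draws (the Rosendaal approach). It is an illustrative example only; the exact methodology used by Pokorney et al. is not detailed here, and the sample values are hypothetical.

```python
# Illustrative sketch only: computes the two measures described above from a
# series of dated INR draws. The exact methodology of Pokorney et al. is not
# specified here, so treat this as an assumption-laden example.

def fraction_in_range(inrs, low=2.0, high=3.0):
    """Fraction of individual INR values falling in the therapeutic range."""
    in_range = sum(1 for v in inrs if low <= v <= high)
    return in_range / len(inrs)

def rosendaal_ttr(days, inrs, low=2.0, high=3.0):
    """Time in therapeutic range by linear interpolation between draws."""
    time_in, time_total = 0.0, 0.0
    for (d0, v0), (d1, v1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        time_total += span
        if v0 == v1:
            time_in += span if low <= v0 <= high else 0.0
            continue
        # Fraction of the interval spent between `low` and `high`, assuming
        # the INR changes linearly between consecutive measurements.
        lo_v, hi_v = min(v0, v1), max(v0, v1)
        overlap = max(0.0, min(hi_v, high) - max(lo_v, low))
        time_in += span * overlap / (hi_v - lo_v)
    return time_in / time_total if time_total else 0.0

# Hypothetical 6-month initiation period: day of draw and measured INR.
days = [0, 14, 35, 63, 91, 120, 150, 180]
inrs = [1.8, 2.4, 3.4, 2.6, 2.9, 1.9, 2.5, 2.7]
print(f"INRs in range: {fraction_in_range(inrs):.0%}")   # 'stable' if >= 80%
print(f"Rosendaal TTR: {rosendaal_ttr(days, inrs):.0%}")
```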
Findings from Pokorney et al. add to the growing body of literature demonstrating the difficulty of achieving and maintaining a therapeutic INR on warfarin therapy.4-7 Clinically, these findings are important, as deviations from TTR have been shown to be associated with increased risk of bleeding and thrombosis as well as increased health care costs.8-10 Mechanistically, patient factors such as differences in vitamin K consumption, comorbid conditions, drug-drug interactions, and medication compliance, as well as genetic differences that affect drug metabolism, undoubtedly contribute to the INR variability observed in patients on warfarin therapy.
Attempts to improve stability have included the administration of low-dose oral vitamin K. However, recent data from a multicenter randomized controlled trial suggest that while such therapy may help to decrease extreme variations in INR, it does not lead to an increased TTR.11 Furthermore, while significant work has been conducted in identifying specific gene variants, such as CYP2C9 and VKORC1, which encode cytochrome P450 and vitamin K epoxide reductase enzymes, respectively, economic analyses suggest that testing for these gene variants would not be cost-effective.12 Additionally, clinical prediction tools that incorporate important patient factors to help guide anticoagulation explain less than 10% of TTR variability.4
Nonetheless, some caution is warranted in interpreting the results reported by Pokorney and his colleagues. The TTR achieved by warfarin-treated patients in this registry was much lower than that previously reported in the pivotal U.S. trials of NOACs (55%-68%) and significantly lower than that observed in a recent nationwide Swedish registry involving 40,449 patients.13
In the Swedish registry, the mean individual TTR was 70%, with more than half of patients having a TTR of 70% or more, emphasizing the importance of health care system effects. Moreover, regardless of whether a patient is on warfarin or a NOAC, patients with a lower TTR have higher rates of diabetes, chronic obstructive pulmonary disease, heart failure, and renal failure, which may contribute to the need for additional therapies that can themselves influence TTR.
For example, INR may be increased by ciprofloxacin or omeprazole when taken with warfarin, and CYP3A4 and P-glycoprotein (P-gp) inducers and inhibitors can increase or decrease the anticoagulant effect of NOACs. Recent reports have also highlighted variability in the safety of NOACs, particularly among patients with renal or liver insufficiency, African Americans, and patients with a history of GI bleeding.14-16 For these subgroups, measuring NOAC activity to improve the clinical safety of these agents is difficult.
PT and INR testing is largely insensitive to NOACs or otherwise highly variable, and the timing of the blood draw relative to the most recent dose significantly influences the measured level of anti-Xa activity. Importantly, socioeconomic factors and family support systems also influence TTR, as important determinants of access to needed drugs and the ability to sustain related costs over time.
Taken together, prior INR stability on warfarin therapy does not ensure continued stability and, as a consequence, long-term warfarin therapy requires close monitoring in order to remain effective. To this end, further development of point-of-care coagulometers for self-testing and self-management, which have been found to be acceptable and preferred by patients, should be pursued.17 Similarly, continued research on optimizing computer-assisted dosing programs to decrease INR variability remains warranted.18 NOACs offer an advantage over warfarin in that their more predictable pharmacokinetic profile obviates the need for routine monitoring of anticoagulation parameters. However, many of the same factors that influence TTR for warfarin also affect NOACs; NOACs carry a higher bleeding risk than warfarin in a number of demographic groups; and the high cost of NOACs may influence patient compliance.
Accordingly, until further data are available, the decision to convert a patient on warfarin with a low TTR to a NOAC should be individualized.
Madhukar S. Patel, MD, is a general surgeon in the Department of Surgery, Massachusetts General Hospital, Boston. Elliot L. Chaikof, MD, is Surgeon-in-Chief of Beth Israel Deaconess Medical Center, Chairman of the Roberta and Stephen R. Weiner Department of Surgery, and the Johnson and Johnson Professor of Surgery at Harvard Medical School. Dr. Chaikof is also an associate editor for Vascular Specialist. They have no relevant conflicts.
References
2. Nat Rev Cardiol. 2014;11:693-703.
5. J Thromb Haemost. 2010;8:2182-91.
6. Thromb Haemost. 2009;101:552-6.
7. Am J Cardiovasc Drugs. 2015;15:205-11.
8. Circ Cardiovasc Qual Outcomes. 2008;1:84-91.
10. J Med Econ. 2015;18:333-40.
11. Thromb Haemost. 2016;116:480-5.
12. Ann Intern Med. 2009;150:73-83.
13. JAMA Cardiol. 2016;1:172-80.
14. N Engl J Med. 2013;369:2093-104.
15. JAMA Intern Med. 2015;175:18-24.
16. J Am Coll Cardiol. 2014;63:891-900.
Antibiotic susceptibility differs in transplant recipients
Antibiotic susceptibility in bacteria cultured from transplant recipients at a single hospital differed markedly from that in hospital-wide antibiograms, according to a report published in Diagnostic Microbiology and Infectious Disease.
Understanding the differences in antibiotic susceptibility among these highly immunocompromised patients can help guide treatment when they develop infection, and reduce the delay before they begin receiving appropriate antibiotics, said Rossana Rosa, MD, of Jackson Memorial Hospital, Miami, and her associates.
The investigators examined the antibiotic susceptibility of 1,889 isolates from blood and urine specimens taken from patients who had received solid-organ transplants at a single tertiary-care teaching hospital and then developed bacterial infections during a 2-year period. These patients included both children and adults who had received kidney, pancreas, liver, heart, lung, or intestinal transplants and were treated in numerous, “geographically distributed” units throughout the hospital. Their culture results were compared with those from 10,439 other patients with bacterial infections, which comprised the hospital-wide antibiograms developed every 6 months during the study period.
The Escherichia coli, Klebsiella pneumoniae, and Pseudomonas aeruginosa isolates from the transplant recipients showed markedly less susceptibility to first-line antibiotics than would have been predicted by the hospital-wide antibiograms. In particular, in the transplant recipients E. coli infections were resistant to trimethoprim-sulfamethoxazole, levofloxacin, and ceftriaxone; K. pneumoniae infections were resistant to every antibiotic except amikacin; and P. aeruginosa infections were resistant to levofloxacin, cefepime, and amikacin (Diagn Microbiol Infect Dis. 2016 Aug 25. doi: 10.1016/j.diagmicrobio.2016.08.018).
“We advocate for the development of antibiograms specific to solid-organ transplant recipients. This may allow intrahospital comparisons and intertransplant-center monitoring of trends in antimicrobial resistance over time,” Dr. Rosa and her associates said.
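To illustrate what building such a population-specific antibiogram involves, the sketch below tabulates percent susceptibility per organism-antibiotic pair, stratified by patient group. The record format and sample data are hypothetical, and a real antibiogram would follow standard laboratory conventions (for example, counting only the first isolate per patient and requiring minimum isolate counts), which are omitted here.

```python
# Minimal sketch of a population-specific antibiogram: percent of isolates
# susceptible, tabulated per (group, organism, antibiotic). Field names and
# the example records are hypothetical, not the authors' dataset.
from collections import defaultdict

isolates = [
    {"group": "transplant", "organism": "E. coli", "antibiotic": "ceftriaxone", "susceptible": False},
    {"group": "transplant", "organism": "E. coli", "antibiotic": "ceftriaxone", "susceptible": True},
    {"group": "hospital",   "organism": "E. coli", "antibiotic": "ceftriaxone", "susceptible": True},
    # ... one record per isolate/antibiotic susceptibility result
]

def antibiogram(records):
    counts = defaultdict(lambda: [0, 0])  # key -> [susceptible, total]
    for r in records:
        key = (r["group"], r["organism"], r["antibiotic"])
        counts[key][0] += r["susceptible"]
        counts[key][1] += 1
    return {k: 100.0 * s / n for k, (s, n) in counts.items()}

for (group, org, abx), pct in sorted(antibiogram(isolates).items()):
    print(f"{group:10s} {org:12s} {abx:12s} {pct:5.1f}% susceptible")
```

Comparing the transplant-specific rows against the hospital-wide rows in such a table is, in essence, the comparison the investigators advocate monitoring over time.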
FROM DIAGNOSTIC MICROBIOLOGY AND INFECTIOUS DISEASE
Key clinical point: Antibiotic susceptibility in bacteria cultured from transplant recipients differs markedly from that in hospital-wide antibiograms.
Major finding: In the transplant recipients, E. coli infections were resistant to trimethoprim-sulfamethoxazole, levofloxacin, and ceftriaxone; K. pneumoniae infections were resistant to every antibiotic except amikacin; and P. aeruginosa infections were resistant to levofloxacin, cefepime, and amikacin.
Data source: A single-center study comparing the antibiotic susceptibility of 1,889 bacterial isolates from transplant recipients with 10,439 isolates from other patients.
Disclosures: This study was not supported by funding from any public, commercial, or not-for-profit entities. Dr. Rosa and her associates reported having no relevant financial disclosures.
USPSTF: Screen for tuberculosis in those at greatest risk
Screening for latent tuberculosis infection (LTBI) can help prevent progression to active disease, and the availability of effective tests supports screening asymptomatic adults aged 18 years and older at increased risk for infection, according to new recommendations from the U.S. Preventive Services Task Force.
The recommendations were published online Sept. 6 in JAMA.
“The USPSTF concludes with moderate certainty that the net benefit of screening for LTBI in persons at increased risk for tuberculosis is moderate,” wrote lead author Kirsten Bibbins-Domingo, MD, PhD, of the University of California, San Francisco, and her colleagues (JAMA 2016 Sep 6;316[9]:962-9).
TB infection spreads through the coughing or sneezing of someone with active disease. Individuals at high risk for TB include those who are immunocompromised, residents of long-term care or correctional facilities, and homeless individuals, as well as those born in countries known to have a high incidence of TB, including China, India, Mexico, and Vietnam.
Other populations at increased risk for TB are contacts of patients with active TB, health care workers, and workers in high-risk settings, the researchers noted.
TB remains a preventable disease in the United States, where the prevalence of latent TB infection is approximately 5%, the researchers said. The two most effective screening tests, the tuberculin skin test (TST) and interferon-gamma release assays (IGRA), demonstrated sensitivity and specificity of 79% and 97%, respectively, for the TST, and of at least 80% and 95% for IGRA.
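To put those figures in context, the short calculation below converts the reported sensitivity and specificity, together with the roughly 5% prevalence of latent infection cited above, into positive and negative predictive values. It is a worked illustration using the quoted point estimates only, not part of the USPSTF analysis.

```python
# Worked arithmetic only: predictive values implied by the test accuracy and
# prevalence figures quoted above (point estimates used for illustration).
def predictive_values(sensitivity, specificity, prevalence):
    tp = sensitivity * prevalence              # true positives
    fn = (1 - sensitivity) * prevalence        # false negatives
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    return tp / (tp + fp), tn / (tn + fn)      # (PPV, NPV)

for name, sens, spec in [("TST", 0.79, 0.97), ("IGRA", 0.80, 0.95)]:
    ppv, npv = predictive_values(sens, spec, prevalence=0.05)
    print(f"{name}: PPV {ppv:.0%}, NPV {npv:.0%}")
```

Even with high specificity, the low prevalence means a substantial share of positive screens will be false positives, which is one reason the recommendation targets adults at increased risk rather than the general population.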
The recommendations are supported by an evidence review, also published in JAMA (2016 Sep 6;316[9]:970-83). The review included 72 studies and 51,711 adults.
The studies in the evidence review did not assess the benefits vs. harms of TB screening, compared with no screening, noted Leila C. Kahwati, MD, of RTI International in Research Triangle Park, N.C., and her colleagues.
“The applicability of the evidence on accuracy and reliability of screening tests to primary care practice settings and populations is uncertain for several reasons,” the investigators said. However, the findings suggest that “treatment reduced the risk of active TB among the populations included in this review.”
The researchers had no financial conflicts to disclose.
FROM JAMA
Key clinical point: Latent tuberculosis infection is a significant problem, and both the tuberculin skin test (TST) and interferon-gamma release assays (IGRA) were moderately sensitive and highly specific in areas with a low tuberculosis burden.
Major finding: Approximately 5%-10% of individuals with latent TB progress to active disease, according to the USPSTF, and treatment reduces the risk of progression.
Data source: An evidence review including 72 studies and 51,711 individuals.
Disclosures: The researchers had no financial conflicts to disclose.