FDA clears test to detect bacteria in platelets

Platelets for transfusion

The US Food and Drug Administration (FDA) has expanded the authorized use of Verax Biomedical’s Platelet PGD Test, which detects bacteria in platelets intended for transfusion.

The FDA previously cleared the test for leukocyte-reduced apheresis platelets (in 2007) and platelets derived from whole blood (in 2009).

Now, the test has been cleared for pre-storage pooled platelets and for apheresis platelets in platelet additive solution C (PAS-C) and plasma.

This makes the Platelet PGD Test the only rapid test on the market that can check every commonly distributed platelet type in the US, according to Verax Biomedical.

About the test

The Platelet PGD Test is an immunoassay used on the day of transfusion at the point of care—a hospital or transfusion service—to detect bacterial contamination in platelets to be transfused.

The test consists of a disposable plastic cartridge and 3 pretreatment reagents. To use it, the tester pretreats a freshly collected platelet sample (500 µL) and applies it to the sample well on the test cartridge.

Lights on the cartridge change from yellow to blue-violet when the test is ready to be interpreted, which is typically about 20 minutes after the sample is applied to the cartridge. The lights confirm that the appropriate volume of a sample was added and the testing is complete.

If the test is positive, a pink line will appear in 1 of the 2 windows on the cartridge. One window represents Gram-positive bacteria and the other Gram-negative. Non-reactive samples will have no line in either window.
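
As a hypothetical illustration only (the function and its names are not part of the product), the two-window read-out described above can be modeled like this:

```python
def interpret_pgd(gram_positive_line: bool, gram_negative_line: bool) -> str:
    """Model of the two-window read-out: a pink line in either window
    signals contamination; no line in either window is non-reactive."""
    if gram_positive_line and gram_negative_line:
        return "reactive: Gram-positive and Gram-negative"
    if gram_positive_line:
        return "reactive: Gram-positive"
    if gram_negative_line:
        return "reactive: Gram-negative"
    return "non-reactive"

print(interpret_pgd(False, False))  # non-reactive
print(interpret_pgd(True, False))   # reactive: Gram-positive
```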

Now that the FDA has expanded the indications for the Platelet PGD Test, it can be used as a quality control test for pools of up to 6 units of leukocyte-reduced and non-leukocyte-reduced whole-blood-derived platelets suspended in plasma that are pooled within 4 hours of transfusion.

The test can also be used within 24 hours of transfusion as a safety measure following testing with a growth-based, quality control test cleared by the FDA. For this indication, the Platelet PGD Test can be used with:

  • Leukocyte-reduced apheresis platelets suspended in plasma
  • Leukocyte-reduced apheresis platelets suspended in PAS-C and plasma
  • Pre-storage pools of up to 6 leukocyte-reduced whole-blood-derived platelets suspended in plasma.

In studies conducted by Verax Biomedical (described in a summary document), the Platelet PGD Test successfully detected bacteria in pre-storage pools of whole-blood-derived platelets suspended in plasma and in leukocyte-reduced apheresis platelets suspended in plasma or in PAS-C and plasma.

Team says delayed cord clamping can’t hurt

Umbilical cord clamping

Photo by Meutia Chaerani

and Indradi Soemardjan

New research suggests that delayed umbilical cord clamping in full-term infants may confer some minor long-term benefits and, at the very least, does not pose any harm.

Delayed clamping did not appear to have a significant effect on most of the mental and physical measures assessed in the study.

It was associated with improved scores in fine-motor skills and social skills at age 4, but these effects only occurred in boys.

Researchers reported these results in JAMA Pediatrics alongside a related editorial.

Previous research has shown that delaying umbilical cord clamping by 2 to 3 minutes after delivery allows fetal blood remaining in the placental circulation to be transfused back to the newborn, and this is associated with improved iron status at 4 to 6 months of age.

However, there is a lack of knowledge regarding the long-term effects of delayed clamping. So policymakers have been hesitant about making clear recommendations regarding cord clamping in full-term infants.

To gain more insight, Ola Andersson, MD, PhD, of Uppsala University in Sweden, and his colleagues performed follow-up assessments of 263 children who were previously enrolled in a randomized trial of cord clamping in full-term infants born in a Swedish hospital.

The team assessed the effects of delayed cord clamping on childhood development at age 4. Delayed clamping (n=141) was defined as occurring 3 or more minutes after delivery, and early clamping (n=122) was defined as occurring within 10 seconds of delivery.
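
The group definitions above can be made concrete with a short sketch (the function name is hypothetical, not from the trial protocol):

```python
def clamping_group(seconds_after_delivery: float) -> str:
    """Trial group assignment: early if clamped within 10 seconds of
    delivery, delayed if clamped 3 minutes (180 s) or more afterward."""
    if seconds_after_delivery <= 10:
        return "early"
    if seconds_after_delivery >= 180:
        return "delayed"
    return "outside protocol windows"

print(clamping_group(5))    # early
print(clamping_group(200))  # delayed
```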

The researchers evaluated child behavior and development using parents’ responses on the Ages and Stages Questionnaire, Third Edition (ASQ), which is used to assess communication, motor skills, and other measures; and the Strengths and Difficulties Questionnaire, which is used to score children’s emotional difficulties, hyperactivity, and other difficulties.

A blinded psychologist also assessed children’s scores on the Wechsler Preschool and Primary Scale of Intelligence (WPPSI-III), which is used to assess IQ and similar measures, and the Movement Assessment Battery for Children (Movement ABC), which is used to assess manual dexterity and similar measures.

The researchers found no significant differences between the delayed and early clamping groups with regard to results on the WPPSI-III or the Movement ABC.

However, delayed clamping was associated with a significant improvement over early clamping in ASQ personal-social scores (adjusted mean difference [AMD]=2.8, P=0.006), fine-motor scores (AMD=2.1, P=0.03), and the Strengths and Difficulties Questionnaire prosocial subscale (AMD=0.5, P=0.05).

When the researchers assessed the children according to sex, they found that significant improvements associated with delayed clamping were only present in males.

Males in the delayed clamping group had significantly higher mean scores in tasks involving fine-motor function, including the WPPSI-III processing-speed quotient (AMD=4.2, P=0.02), the Movement ABC bicycle-trail task (AMD=0.8, P=0.03), and fine-motor scores on the ASQ (AMD=4.7, P=0.01). These boys also had significantly higher personal-social scores on the ASQ (AMD=4.9, P=0.004).

The researchers concluded that, although delayed and early cord clamping resulted in similar overall neurodevelopment and behavior among 4-year-old children, the differences observed in this study suggest delayed cord clamping has some positive, and no harmful, long-term effects.

Histone variant may contribute to lymphoma

DNA coiled around histones

Image by Eric Smith

Researchers say they have identified histone chaperones that play an important role in the structure of chromatin.

The team believes this finding, published in Molecular Cell, could lead to a better understanding of lymphomas and other cancers.

“Maintaining an appropriate chromatin structure is essential for normal development, and, not surprisingly, defects in chromatin components can lead to several diseases,” said study author François Robert, PhD, of Institut de Recherches Cliniques de Montréal in Québec, Canada.

In studying chromatin, Dr Robert and his colleagues have been interested in a histone variant called H2A.Z.

The researchers knew that H2A.Z is incorporated into the promoter regions of genes by SWR-C-related chromatin remodeling complexes, but they wanted to determine whether H2A.Z is actively excluded from non-promoter regions.

“With this study, we discovered that 2 other proteins, FACT and Spt6, play an important role in the location of H2A.Z,” said Célia Jeronimo, PhD, a research associate in Dr Robert’s lab.

The team found that FACT and Spt6 both help keep H2A.Z from accumulating in intragenic regions. When either histone chaperone is absent, H2A.Z is mislocalized, which alters chromatin composition and contributes to cryptic transcription.

“Inappropriate H2A.Z localization has previously been observed in cancer cells, but little was understood about the consequences of this phenomenon,” Dr Robert said.

“Although our study was performed in yeast cells, it suggests that mislocalization of H2A.Z may lead to cryptic transcription in some types of cancer such as lymphoma, and this may contribute to the disease. Our next step is therefore to investigate the possible role of H2A.Z and its associated gene expression defects in cancer cells.”

Optimal duration of DAPT still unclear

Aspirin tablets

Photo by Sage Ross

A systematic review of published evidence has failed to elucidate the optimal duration of dual antiplatelet therapy (DAPT) in patients who have a drug-eluting stent.

The data showed that patients who received DAPT for a longer period had a small reduction in myocardial infarction as well as a small increase in major bleeding and an even smaller increase in all-cause mortality, compared to patients who received DAPT for a shorter period.

Frederick A. Spencer, MD, of McMaster University in Hamilton, Ontario, Canada, and his colleagues reported these findings in Annals of Internal Medicine.

The team searched databases for trials of DAPT published from 1996 to March 2015. They identified 9 randomized, controlled trials including a total of 29,531 patients. Complete data were available for 28,808 patients who had coronary artery disease and received DAPT after drug-eluting stent placement.

In 4 of the trials, patients were randomized to DAPT when they received their stent. Patients in the shorter-duration arm received DAPT for 3 to 6 months, and patients in the longer-duration arm received DAPT for 12 to 24 months.

In a fifth study, patients were randomized to DAPT at stent placement, but thrombotic events occurring during the first 6 months (when both arms received DAPT) were excluded.

In the 4 remaining trials, patients were randomized to DAPT 6 months or more after stent placement. Patients in the shorter-duration arm received DAPT for 6 to 18 months, and patients in the longer-duration arm received DAPT for 12 to 42 months.

Analyzing data from these trials together, Dr Spencer and his colleagues found moderate-quality evidence suggesting that receiving DAPT for a longer period decreased the risk of myocardial infarction (risk ratio [RR]=0.73) but increased the risk of mortality (RR=1.19).

The team also said there was high-quality evidence suggesting that longer-duration DAPT increased the risk of major bleeding (RR=1.63).

Receiving DAPT for a longer period was associated with approximately 8 fewer myocardial infarctions per 1000 patients per year, 6 more major bleeding events per 1000 patients per year, and 2 more deaths per 1000 patients per year, when compared to shorter-duration DAPT.

Because these differences are small, Dr Spencer and his colleagues said the duration of DAPT should probably be based on patient preference, following a discussion of the potential risks and benefits.
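
As a back-of-envelope check of the absolute figures quoted above, a risk ratio converts to an absolute difference by multiplying the baseline event rate by (RR − 1). The baseline rates below are hypothetical values chosen only to illustrate the arithmetic; the review does not report them in this article:

```python
def events_difference_per_1000(baseline_rate_per_1000: float, risk_ratio: float) -> float:
    """Absolute change in events per 1000 patients per year implied by a
    risk ratio, given a baseline (shorter-duration DAPT) event rate."""
    return baseline_rate_per_1000 * (risk_ratio - 1)

# Hypothetical baseline rates (per 1000 patients per year):
mi_baseline = 30     # myocardial infarction
bleed_baseline = 10  # major bleeding
death_baseline = 11  # all-cause mortality

print(round(events_difference_per_1000(mi_baseline, 0.73), 1))     # -8.1 (fewer MIs)
print(round(events_difference_per_1000(bleed_baseline, 1.63), 1))  # 6.3 (more bleeds)
print(round(events_difference_per_1000(death_baseline, 1.19), 1))  # 2.1 (more deaths)
```

With these assumed baselines, the results line up with the roughly 8 fewer myocardial infarctions, 6 more major bleeding events, and 2 more deaths per 1000 patients per year reported above.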

Publications
Topics

Aspirin tablets

Photo by Sage Ross

A systematic review of published evidence has failed to elucidate the optimal duration of dual antiplatelet therapy (DAPT) in patients who have a drug-eluting stent.

The data showed that patients who received DAPT for a longer period had a small reduction in myocardial infarction as well as a small increase in major bleeding and an even smaller increase in all-cause mortality, compared to patients who received DAPT for a shorter period.

Frederick A. Spencer, MD, of McMaster University in Hamilton, Ontario, Canada, and his colleagues reported these findings in Annals of Internal Medicine.

The team searched databases for trials of DAPT published from 1996 to March 2015. They identified 9 randomized, controlled trials including a total of 29,531 patients. There was complete data for 28,808 patients who had coronary artery disease and received DAPT after drug-eluting stent placement.

In 4 of the trials, patients were randomized to DAPT when they received their stent. Patients in the shorter-duration arm received DAPT for 3 to 6 months, and patients in the longer-duration arm received DAPT for 12 to 24 months.

In a fifth study, patients were randomized to DAPT at stent placement, but thrombotic events occurring during the first 6 months (when both arms received DAPT) were excluded.

In the 4 remaining trials, patients were randomized to DAPT 6 months or more after stent placement. Patients in the shorter-duration arm received DAPT for 6 to 18 months, and patients in the longer-duration arm received DAPT for 12 to 42 months.

Analyzing data from these trials together, Dr Spencer and his colleagues found moderate-quality evidence suggesting that receiving DAPT for a longer period decreased the risk of myocardial infarction (risk ratio [RR]=0.73) but increased the risk of mortality (RR=1.19).

The team also said there was high-quality evidence suggesting that longer-duration DAPT increased the risk of major bleeding (RR=1.63).

Receiving DAPT for a longer period was associated with approximately 8 fewer myocardial infarctions per 1000 patients per year, 6 more major bleeding events per 1000 patients per year, and 2 more deaths per 1000 patients per year, when compared to shorter-duration DAPT.

Because these differences are small, Dr Spencer and his colleagues said the duration of DAPT therapy should probably be based on patient preference, following a discussion of the potential risks and benefits.

Aspirin tablets

Photo by Sage Ross

A systematic review of published evidence has failed to elucidate the optimal duration of dual antiplatelet therapy (DAPT) in patients who have a drug-eluting stent.

The data showed that patients who received DAPT for a longer period had a small reduction in myocardial infarction as well as a small increase in major bleeding and an even smaller increase in all-cause mortality, compared to patients who received DAPT for a shorter period.

Frederick A. Spencer, MD, of McMaster University in Hamilton, Ontario, Canada, and his colleagues reported these findings in Annals of Internal Medicine.

The team searched databases for trials of DAPT published from 1996 to March 2015. They identified 9 randomized, controlled trials including a total of 29,531 patients. There was complete data for 28,808 patients who had coronary artery disease and received DAPT after drug-eluting stent placement.

In 4 of the trials, patients were randomized to DAPT when they received their stent. Patients in the shorter-duration arm received DAPT for 3 to 6 months, and patients in the longer-duration arm received DAPT for 12 to 24 months.

In a fifth study, patients were randomized to DAPT at stent placement, but thrombotic events occurring during the first 6 months (when both arms received DAPT) were excluded.

In the 4 remaining trials, patients were randomized to DAPT 6 months or more after stent placement. Patients in the shorter-duration arm received DAPT for 6 to 18 months, and patients in the longer-duration arm received DAPT for 12 to 42 months.

Analyzing data from these trials together, Dr Spencer and his colleagues found moderate-quality evidence suggesting that receiving DAPT for a longer period decreased the risk of myocardial infarction (risk ratio [RR]=0.73) but increased the risk of mortality (RR=1.19).

The team also said there was high-quality evidence suggesting that longer-duration DAPT increased the risk of major bleeding (RR=1.63).

Receiving DAPT for a longer period was associated with approximately 8 fewer myocardial infarctions per 1000 patients per year, 6 more major bleeding events per 1000 patients per year, and 2 more deaths per 1000 patients per year, when compared to shorter-duration DAPT.

Because these differences are small, Dr Spencer and his colleagues said the duration of DAPT therapy should probably be based on patient preference, following a discussion of the potential risks and benefits.

Display Headline
Optimal duration of DAPT still unclear

Herbs reduce fatigue in cancer patients

Article Type
Changed
Display Headline
Herbs reduce fatigue in cancer patients

Herbs and spices

Photo by Alexander Baxevanis

An herbal mixture used in traditional Chinese medicine can reduce fatigue in cancer patients, results of a phase 1/2 study suggest.

The mixture, Ren Shen Yangrong Tang (RSYRT), is a soup containing 12 herbs.

In the study, cancer patients suffering from moderate to severe fatigue reported significantly less fatigue after taking RSYRT for 2 to 3 weeks.

Researchers reported these results in the Journal of Alternative and Complementary Medicine.

Yichen Xu, MD, of Beijing Cancer Hospital & Institute in China, and colleagues evaluated RSYRT in 33 patients who had completed cancer treatment. The patients had stable disease and no anemia.

Eleven patients had moderate fatigue (a score of 4-6 on a 0-10 scale), and 22 had severe fatigue (a score of 7-10). All patients had experienced fatigue for at least 4 months.

Patients took RSYRT twice a day for 6 weeks and experienced a significant decrease in fatigue severity. The mean fatigue score decreased from 7.06 at baseline to 3.30 at the 6-week mark (P<0.001).

The fatigue category also changed significantly (P=0.024). Among the 22 patients who had severe fatigue before RSYRT, half had mild fatigue after therapy, and half had moderate fatigue.

Among the 11 patients who had moderate fatigue at baseline, only 1 still had moderate fatigue after receiving RSYRT. The rest had mild fatigue.

All of the patients said they felt better after taking RSYRT for 4 weeks.

There were no “uncomfortable events” related to RSYRT, such as gastrointestinal upset, insomnia, headache, or rash. None of the patients required a dose reduction or dose interruption.

None of the patients had blood chemistry abnormalities or abnormal liver/kidney function. Two patients who had a change in ST segment before RSYRT had normal electrocardiogram results after treatment.


CHMP recommends drug for WM

Article Type
Changed
Display Headline
CHMP recommends drug for WM

Micrograph showing WM

The European Medicines Agency’s Committee for Medicinal Products for Human Use (CHMP) is recommending that ibrutinib (Imbruvica) be approved to treat Waldenström’s macroglobulinemia (WM).

The CHMP is recommending the drug for use in WM patients who have received at least 1 prior therapy as well as previously untreated WM patients who are not suitable candidates for chemo-immunotherapy.

The European Commission will review this recommendation and should make a decision later this year.

Ibrutinib is already approved to treat WM in the US. The drug is also approved in the European Union, the US, and other countries to treat chronic lymphocytic leukemia and mantle cell lymphoma.

Janssen-Cilag International NV (Janssen) holds the marketing authorization for ibrutinib in Europe, and its affiliates market the drug in Europe and the rest of the world. In the US, ibrutinib is under joint development by Pharmacyclics and Janssen Biotech, Inc.

Phase 2 study

The CHMP’s recommendation for ibrutinib was based on a multicenter, phase 2 study in which researchers tested the drug in 63 patients with previously treated WM. Initial data showed an overall response rate of 87.3% in patients who received the drug for a median of 11.7 months.

Updated results from the study were published in NEJM in April. After a median treatment duration of 19.1 months, the overall response rate was 91%.

At 24 months, the estimated rate of progression-free survival was 69%, and the estimated rate of overall survival was 95%.

The most common grade 2-4 adverse events were neutropenia (22%) and thrombocytopenia (14%). Ibrutinib-related neutropenia and thrombocytopenia were reversible but required a dose reduction in 3 patients and treatment discontinuation in 4 patients.

Grade 2 or higher bleeding events occurred in 4 patients, and there were 15 infections considered possibly related to ibrutinib.

Treatment-related atrial fibrillation (AFib) occurred in 3 patients, all of whom had a prior history of paroxysmal AFib. AFib resolved when treatment was withheld, and all 3 patients were able to continue on therapy per protocol without an additional event.


Improving targeted therapy for leukemia, other diseases

Article Type
Changed
Display Headline
Improving targeted therapy for leukemia, other diseases

James Bradner, MD

Photo by Sam Ogden

A chemical strategy may allow researchers to target “undruggable” proteins and overcome resistance to current targeted therapies, according to a report published in Science.

The strategy uses tumor cells’ own protein-elimination system to break down and dispose of the proteins that drive cancer growth.

When tested in vitro and in vivo, the approach caused leukemia cells to die more quickly than they do with conventional targeted therapies.

“One of the reasons [treatment] resistance occurs is that cancer-related proteins often have multiple functions within the cell, and conventional targeted therapies inhibit just one or a few of those functions,” said study author James Bradner, MD, of the Dana-Farber Cancer Institute in Boston, Massachusetts.

“Conventional drugs allow the targeted protein to adapt to the drug, and the cell finds alternate routes for its growth signals. We began designing approaches that cause the target protein to disintegrate, rather than merely be inhibited. It would be very powerful if we could chemically convert an inhibitor drug into a degrader drug.”

With this in mind, Dr Bradner’s team designed a chemical adapter that attaches to a targeted drug molecule. The adapter enables the drug to tow the cell’s protein-degradation machinery directly to the protein of interest. Once bound to the protein, the combination drug-and-protein-degrader essentially demolishes it.

The investigators tested the technology in leukemia cells. They built an adapter out of phthalimide, a chemical derivative of the drug thalidomide, and attached it to the BRD4 inhibitor JQ1. The phthalimide was designed to “hijack” the cereblon E3 ubiquitin ligase complex.

When the researchers treated the leukemia cells with a JQ1-phthalimide conjugate called dBET1, the BRD4 protein within the cells was degraded in less than an hour. The team said such rapid and extensive degradation suggests conjugates may be able to prevent or hinder cancer cells from developing resistance to targeted therapies.

“The potency, selectivity, and rapidity of this approach—namely, the ability to home in specifically on BRD4—are unprecedented in clinical approaches to protein degradation,” Dr Bradner said.

To determine how selective dBET1 actually is, the investigators measured the levels of all proteins in leukemia cells at 1 hour and 2 hours after treatment.

“We were stunned to find that only 3 proteins of more than 7000 in the entire cell were degraded: BRD2, 3, and 4, an exceptional degree of selectivity guided by the intended targets of JQ1,” Dr Bradner said. “It’s as though dBET1 is laser-guided to deliver protein-degrading machinery to targeted proteins.”

The researchers then tested dBET1 in mice bearing leukemia. As in the cell samples, there was a rapid degradation of BRD4 in the tumor cells and a potent anti-leukemic effect, with few noticeable side effects.

To see if compounds other than JQ1 can be used as a guidance system for a conjugate, the investigators created a set of molecules that lock the protein-degradation machinery onto a compound called SLF, which targets the protein FKBP12.

When they treated cancer cells with SLF, the team found it degraded the vast majority of FKBP12 in the cells within a few hours.

Buoyed by these results, the researchers are working to create a derivative of dBET1 that can be used as a drug in humans and to extend the conjugate strategy for the treatment of other diseases.

“The dBET1 and the dFKBP12 compounds are presently in a late stage of lead optimization for therapeutic development in both cancer and non-malignant diseases,” said Prem Das, PhD, chief research business development officer at Dana-Farber.

“Composition-of-matter and method-of-use patent applications have been filed on these and other additional targeted agents, as well as on the chemistry platform. They will be licensed for commercialization to an appropriate company according to standard Dana-Farber practice.”


Anticoagulant type doesn’t affect stent thrombosis risk

Article Type
Changed
Display Headline
Anticoagulant type doesn’t affect stent thrombosis risk

Vial of heparin

PARIS—New research suggests that patients who have undergone primary percutaneous coronary intervention (PCI) have a low risk of stent thrombosis, regardless of the anticoagulant therapy they receive.

In a large, registry-based study, stent thrombosis occurred in less than 1% of patients, regardless of whether they received bivalirudin with or without heparin, heparin alone, or a GP IIb/IIIa inhibitor (GPI) with or without heparin.

The study also showed that patients who experienced stent thrombosis between days 2 and 30, regardless of drug regimen, were more likely to die within a year than patients who developed stent thrombosis within the first 24 hours of their procedure.

Per Grimfjard, of Vasteras Hospital/Uppsala University in Sweden, presented these findings at EuroPCR 2015.

A number of recent studies have raised concerns that bivalirudin may increase the risk of stent thrombosis compared with heparin. But rates of stent thrombosis have differed substantially between studies.

So Dr Grimfjard and his colleagues decided to review stent thrombosis rates by drug choice among more than 30,000 patients who were treated with primary PCI for ST-elevation myocardial infarction (STEMI) between January 2007 and July 2014 in the Swedish Coronary Angiography and Angioplasty Register (SCAAR).

The researchers divided patients into 3 treatment groups: bivalirudin, heparin, and GPI. However, 77% of patients in the bivalirudin group also received heparin, and 3.6% received a GPI prior to or during the PCI procedure. In the GPI group, 77% of patients also received heparin.

The rates of stent thrombosis were low in all 3 groups—0.84% in the bivalirudin group, 0.94% in the heparin group, and 0.83% in the GPI group.

For all 3 drugs, mortality at 1 year was numerically higher if the stent thrombosis occurred between 2 and 30 days, as compared with day 0 to 1 post-PCI.

“[A] possible explanation is that a stent thrombosis that happens once the patient has left the hospital is likely to cause a more substantial infarction, the reason being longer delay from symptoms to revascularization,” Dr Grimfjard said.

He added that a more substantial myocardial infarction typically leads to more heart failure and arrhythmia long-term. Unfortunately, the findings regarding the timing of stent thrombosis do not offer any guidance for choosing optimal antithrombotic treatment.

He and his colleagues are currently enrolling patients in a 6000-patient, registry-based, randomized clinical trial called VALIDATE-SWEDEHEART. The team will compare heparin alone to bivalirudin and optional low-dose heparin in STEMI and non-STEMI patients undergoing PCI.

“Hopefully, this large, randomized trial will bring clarity to the choice of antithrombotic treatment strategy in these patients,” Dr Grimfjard said.

Publications
Topics

Vial of heparin

PARIS—New research suggests that patients who have undergone primary percutaneous coronary intervention (PCI) have a low risk of stent thrombosis, regardless of the anticoagulant therapy they receive.

Vial of heparin

PARIS—New research suggests that patients who have undergone primary percutaneous coronary intervention (PCI) have a low risk of stent thrombosis, regardless of the antithrombotic therapy they receive.

In a large, registry-based study, stent thrombosis occurred in less than 1% of patients, regardless of whether they received bivalirudin with or without heparin, heparin alone, or a GP IIb/IIIa inhibitor (GPI) with or without heparin.

The study also showed that patients who experienced stent thrombosis between days 2 and 30, regardless of drug regimen, were more likely to die within a year than patients who developed stent thrombosis within the first 24 hours of their procedure.

Per Grimfjard, of Vasteras Hospital/Uppsala University in Sweden, presented these findings at EuroPCR 2015.

A number of recent studies have raised concerns that bivalirudin may increase the risk of stent thrombosis compared with heparin. But rates of stent thrombosis have differed substantially between studies.

So Dr Grimfjard and his colleagues decided to review stent thrombosis rates by drug choice among more than 30,000 patients who were treated with primary PCI for ST-elevation myocardial infarction (STEMI) between January 2007 and July 2014 in the Swedish Coronary Angiography and Angioplasty Register (SCAAR).

The researchers divided patients into 3 treatment groups: bivalirudin, heparin, and GPI. However, 77% of patients in the bivalirudin group also received heparin, and 3.6% received a GPI prior to or during the PCI procedure. In the GPI group, 77% of patients also received heparin.

The rates of stent thrombosis were low in all 3 groups—0.84% in the bivalirudin group, 0.94% in the heparin group, and 0.83% in the GPI group.
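The similarity of these sub-1% rates can be put in rough context with a two-proportion z-test. This is an illustrative sketch only: the article does not report the SCAAR group sizes, so the group sizes below are assumptions, not study data.

```python
import math

# Illustrative sketch only: SCAAR group sizes are not reported in this
# article, so n_biv and n_hep below are assumptions, not study data. The
# reported rates (0.84% vs 0.94%) are compared with a pooled two-proportion
# z-test.
n_biv, n_hep = 10_000, 10_000          # hypothetical group sizes
p_biv, p_hep = 0.0084, 0.0094          # reported stent thrombosis rates

x_biv = round(p_biv * n_biv)           # implied event counts: 84 and 94
x_hep = round(p_hep * n_hep)
p_pool = (x_biv + x_hep) / (n_biv + n_hep)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_biv + 1 / n_hep))
z = (p_biv - p_hep) / se

print(round(z, 2))  # -0.75: |z| is well under 1.96, i.e. no significant difference
```

Under these assumed group sizes, a difference of 0.84% versus 0.94% is far from statistical significance, consistent with the study's conclusion that rates were comparably low across regimens.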

For all 3 drugs, mortality at 1 year was numerically higher if the stent thrombosis occurred between 2 and 30 days, as compared with day 0 to 1 post-PCI.

“[A] possible explanation is that a stent thrombosis that happens once the patient has left the hospital is likely to cause a more substantial infarction, the reason being longer delay from symptoms to revascularization,” Dr Grimfjard said.

He added that a more substantial myocardial infarction typically leads to more heart failure and arrhythmia long-term. Unfortunately, the findings regarding the timing of stent thrombosis do not offer any guidance for choosing optimal antithrombotic treatment.

He and his colleagues are currently enrolling patients in a 6000-patient, registry-based, randomized clinical trial called VALIDATE-SWEDEHEART. The team will compare heparin alone with bivalirudin plus optional low-dose heparin in STEMI and non-STEMI patients undergoing PCI.

“Hopefully, this large, randomized trial will bring clarity to the choice of antithrombotic treatment strategy in these patients,” Dr Grimfjard said.

Display Headline
Anticoagulant type doesn’t affect stent thrombosis risk

Team reports new method to identify immune cells

Article Type
Changed
Display Headline
Team reports new method to identify immune cells

Blood samples

Photo by Graham Colm

A new method for identifying immune cells could pave the way for rapid detection of hematologic malignancies from a small blood sample, according to researchers.

The team found they could use wavelength modulated Raman spectroscopy (WMRS) to identify subsets of T cells, natural killer cells, and dendritic cells.

Traditional methods of identifying these cells usually involve labeling them with fluorescently or magnetically tagged antibodies.

Using WMRS, the researchers were able to identify immune cells with no labeling at all, thus permitting rapid identification and further analysis to take place with no potential alteration to the cells.

Simon Powis, PhD, of the University of St Andrews in Fife, Scotland, and his colleagues described this work in PLOS ONE.

Raman scattering refers to light scattering from molecules in a sample in which the light’s energy is shifted up or down; the shifts can be recorded as a “molecular fingerprint” used for identification. Normally, this signal is very weak and further obscured by other background light (eg, fluorescence).

WMRS subtly modulates the wavelength of the incident laser light, which, in turn, modulates the Raman signal, allowing it to be extracted from any stationary interfering signal.
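The principle can be illustrated numerically. In this sketch (not the authors’ code, and with made-up spectral parameters), a broad fluorescence background stays fixed while the Raman peak follows the excitation modulation, so differencing two spectra taken at different points of the modulation cycle cancels the background and leaves only the Raman signature.

```python
import numpy as np

# Numerical illustration (not the authors' code; spectral parameters are
# made up). The fluorescence background is stationary, while the Raman peak
# follows the excitation-wavelength modulation, so differencing two spectra
# cancels the background and leaves a derivative-shaped Raman signature.

wavenumber = np.linspace(800, 1800, 1000)  # Raman shift axis (cm^-1)

def spectrum(peak_center):
    """One recorded spectrum: fixed fluorescence plus one Raman peak."""
    fluorescence = 500.0 * np.exp(-wavenumber / 2000.0)  # broad, stationary
    raman = 10.0 * np.exp(-((wavenumber - peak_center) ** 2) / (2 * 4.0 ** 2))
    return fluorescence + raman

# Two spectra at two points of the modulation cycle: the Raman peak shifts
# slightly, the fluorescence does not.
s_a = spectrum(1200.0)
s_b = spectrum(1202.0)

difference = s_a - s_b  # fluorescence cancels exactly; Raman signal survives

background_residual = float(np.abs(difference[:200]).max())  # ~0 far from peak
raman_signature = float(np.abs(difference).max())            # clearly nonzero
```

In practice the extraction uses the full modulation waveform rather than a single difference, but the sketch shows why a stationary background drops out while the modulated Raman peaks do not.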

Using WMRS, Dr Powis and his colleagues found they could identify CD4+ T cells, CD8+ T cells, CD56+ natural killer cells, CD303+ lymphoid/plasmacytoid dendritic cells, and CD1c+ myeloid dendritic cells.

“Under a normal light microscope, these immune cells essentially all look identical,” Dr Powis said. “With this new method, we can identify key cell types without any labeling.”

“Our next goal is to make a full catalogue of all the normal cell types of the immune system that can be detected in the bloodstream. Once we have this completed, we can then collaborate with our clinical colleagues to start identifying when these immune cells are altered, in conditions such as leukemia and lymphoma, potentially providing a rapid detection system from just a small blood sample.”


Co-infection may boost malaria mortality

Article Type
Changed
Display Headline
Co-infection may boost malaria mortality

Lab mouse

Co-infection with malaria parasites and a virus closely related to the Epstein-Barr virus (EBV) may make an otherwise nonlethal malaria infection lethal, according to preclinical research published in PLOS Pathogens.

Children in sub-Saharan Africa typically become infected with EBV in infancy, around the same time they become susceptible to malaria parasite infection, as protective antibodies from their mothers fade away.

“Where we think kids get into trouble is when both infections are happening at the same time, because case reports show EBV can produce a weeks-long suppression of the immune system,” said Tracey Lamb, PhD, of Emory University School of Medicine in Atlanta, Georgia.

Dr Lamb and her colleagues studied mice infected by the malaria parasite Plasmodium yoelii, which is usually non-lethal because the mice develop antibodies that control the parasites.

The researchers found that co-infection with murine gammaherpesvirus 68 (MHV68), a close relative of EBV that infects mice, made P yoelii lethal.

However, mice that had entered the chronic phase of MHV68 infection (several weeks to months after primary infection) were not affected.

The experiments indicated that MHV68 infection hinders the immune system in developing antibodies against P yoelii.

“These results are part of a pattern of evidence suggesting that clinicians treating severe malaria should check for acute EBV co-infection, and that ongoing malaria studies should include EBV as a potential risk factor for more severe forms of the disease,” said Caline Matar, a graduate student at Emory University School of Medicine.

“This phenomenon may not be unique to EBV,” added Sam Speck, PhD, also of Emory University School of Medicine.

“[I]nfections with other pathogens may also exacerbate malarial disease, since many pathogens have the capacity to suppress various components of the host immune response.”
