DNA finding has implications for MPNs


DNA helices

Credit: NIGMS

A new study suggests the timing of DNA replication—including where the origin points are and in what order DNA segments are copied—varies from person to person.

The research also revealed the first genetic variants that orchestrate replication timing.

And researchers found evidence suggesting that differences in replication timing may explain why some people are more prone than others to developing myeloproliferative neoplasms (MPNs).

“Everyone’s cells have a plan for copying the genome,” said study author Steven McCarroll, PhD, of Harvard Medical School in Boston. “The idea that we don’t all have the same plan is surprising and interesting.”

Dr McCarroll and his colleagues described this research in Cell.

Replication timing and MPNs

The researchers noted that DNA replication is one of the most fundamental cellular processes, and any variation among people is likely to affect genetic inheritance, including individual disease risk as well as human evolution.

Replication timing is known to affect mutation rates. DNA segments that are copied too late or too early tend to have more errors.

The new study therefore indicates that people with different timing programs have different patterns of mutation risk across their genomes. Differences in replication timing could, for example, explain predisposition to MPNs.

Researchers previously showed that acquired mutations in JAK2 lead to MPNs. They also noticed that people with JAK2 mutations tend to have a distinctive set of inherited genetic variants nearby, but they weren’t sure how the inherited variants and the new mutations were connected.

Dr McCarroll’s team found that the inherited variants are associated with an “unusually early” replication origin point and proposed that JAK2 is more likely to develop mutations in people with that very early origin point.

“Replication timing may be a way that inherited variation contributes to the risk of later mutations and diseases that we usually think of as arising by chance,” Dr McCarroll said.

A new method of study

Dr McCarroll and his colleagues were able to make these discoveries, in large part, because they invented a new way to obtain DNA replication timing data. They turned to the 1000 Genomes Project, which maintains an online database of sequencing data collected from hundreds of people around the world.

Because much of the DNA in the 1000 Genomes Project had been extracted from actively dividing cells, the team hypothesized that information about replication timing lurked within, and they were right.

They counted the number of copies of individual genes in each genome. At the moment each sample was taken, segments near early-firing replication origins had been copied in more cells than segments near late-firing origins, so the researchers could construct a personalized replication timing map for each person.
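The copy-number logic described above can be illustrated with a minimal sketch on synthetic data. Everything here is invented for illustration (window positions, origin locations, coverage parameters); the only assumption carried over from the article is that sequencing depth in dividing cells is proportional to copy number, so early-replicating regions show higher coverage.

```python
import numpy as np

# Hypothetical sketch: infer relative replication timing from read depth.
# In a population of dividing cells, windows near early-firing origins have
# been replicated in more cells at sampling time, so they show more reads.

rng = np.random.default_rng(0)

# Simulate read counts in 100 windows along a chromosome arm, with a
# simulated early origin at window 20 and a later origin at window 70.
n_windows = 100
positions = np.arange(n_windows)
true_timing = np.minimum(np.abs(positions - 20), np.abs(positions - 70) + 15)
expected_copies = 2.0 - 0.02 * true_timing  # early regions: nearly duplicated
reads = rng.poisson(expected_copies * 2000)  # depth proportional to copies

# Normalize and smooth: higher normalized depth => earlier replication.
profile = reads / reads.mean()
kernel = np.ones(5) / 5
smoothed = np.convolve(profile, kernel, mode="same")

# The window with the highest smoothed depth marks the earliest origin.
earliest = int(np.argmax(smoothed[2:-2])) + 2
print(earliest)  # typically lands at or near the simulated early origin (20)
```

Applied per sample, this kind of profile yields the "personalized replication timing map" the article describes; the actual study's normalization and statistics were more involved.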

“People had seen these patterns before but just dismissed them as artifacts of sequencing technology,” Dr McCarroll said. After conducting numerous tests to rule out that possibility, “we found that they reflect real biology.”

The researchers then compared each person’s copy number information with his or her genetic sequence data to see if they could match specific genetic variants to replication timing differences. From 161 samples, the team identified 16 variants. The variants were short, and most were common.

“I think this is the first time we can pinpoint genetic influences on replication timing in any organism,” said study author Amnon Koren, PhD, also of Harvard Medical School.

The variants were located near replication origin points, leading the researchers to wonder whether they affect replication timing by shifting where a person’s origin points are. The team also suspects the variants act by altering chromatin structure, exposing local sequences to the replication machinery.

The group intends to find out. They also want to search for additional variants that control replication timing.

“These 16 variants are almost certainly just the tip of the iceberg,” Dr Koren said.

He and his colleagues believe that, as more variants come to light in future studies, researchers should be better able to manipulate replication timing in the lab and learn more about how it works and its biological significance.

“All you need to do to study replication timing is grow cells and sequence their DNA, which everyone is doing these days,” Dr Koren said. “[This new method] is much easier, faster, and cheaper, and I think it will transform the field because we can now do experiments in large scale.”

“We found that there is biological information in genome sequence data,” Dr McCarroll added. “But this was still an accidental biological experiment. Now imagine the results when we and others actually design experiments to study this phenomenon.”


Strategy could reduce TRALI after platelet transfusion


PHILADELPHIA—Researchers believe a simple screening strategy could reduce the risk of transfusion-related acute lung injury (TRALI) in patients receiving apheresis platelets (APs) by about 60%.

Studying TRALI cases reported to the American Red Cross, the investigators found evidence to support the idea that testing female AP donors who report prior pregnancy and deferring those with human leukocyte antigen (HLA) antibodies could greatly decrease the risk of TRALI.

Anne Eder, MD, of the American Red Cross in Rockville, Maryland, presented this research at the AABB Annual Meeting 2014 (abstract S82-040B).

Dr Eder and her colleagues assessed cases of TRALI and possible TRALI reported to the American Red Cross’s national hemovigilance program. The researchers compared the incidence of TRALI according to the type of blood component transfused as well as the sex of the donor.

TRALI cases due to APs and red blood cells (RBCs) from 2006 to 2013 and male-donor-predominant plasma from 2008 to 2013 were calculated as rates per 10⁶ distributed units.

The blood center distributed 6.6 million AP units (>70% from male donors, excluding platelet additive solution), 9.6 million plasma units (>95% from male donors), and 48.6 million RBC units (54% from male donors).

In all, there were 224 cases of TRALI, 175 among patients who received a single type of blood component within 6 hours. There were 36 TRALI cases among plasma recipients, 92 among RBC recipients, and 41 among AP recipients.

The TRALI risk was about 3-fold greater for AP recipients than for RBC recipients or recipients of male-predominant plasma. The odds ratios (ORs) were 3.2, 1.0, and 0.8, respectively. The OR for all plasma recipients (including group AB female plasma) was 2.0.

The fatality rate was also higher for AP recipients than RBC recipients, at 0.6 per 10⁶ and 0.2 per 10⁶ units, respectively (P=0.04).

When the researchers analyzed TRALI cases according to donor, they found a nearly 6-fold predilection for female donors among AP recipients (OR=5.6) and a nearly 5-fold predilection for female donors in RBC recipients (OR=4.5).

The investigators also considered the 41 AP TRALI cases individually to assess how effective a screening program might have been for reducing the risk of TRALI.

In 12 cases, patients had received AP from a male donor. Of the 29 female donors, 26 reported a prior pregnancy, and 2 had test results suggesting a prior pregnancy.

Of those 28 donors, 3 were negative for HLA antibodies, leaving 25 donors positive for HLA antibodies, accounting for 25 of the 41 cases, or 61%.
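The arithmetic behind the projected 60% risk reduction can be checked directly from the case counts reported above (a simple recomputation, not part of the study's analysis):

```python
# Recompute the screening yield from the reported AP TRALI case counts.
total_cases = 41           # AP TRALI cases reviewed individually
male_donor = 12            # cases involving a male AP donor
female_donor = total_cases - male_donor          # 29 female donors
pregnancy_linked = 26 + 2  # reported prior pregnancy + test-implied pregnancy
hla_negative = 3
hla_positive = pregnancy_linked - hla_negative   # 25 donors

# Fraction of all AP TRALI cases the proposed screen would have caught:
fraction_preventable = hla_positive / total_cases
print(round(100 * fraction_preventable))  # 61
```

This 61% is the basis for the "about 60%" risk-reduction estimate quoted below.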

Seventeen of the female donors had HLA class I and II antibodies, including 3 whose donation resulted in a fatality. One had HLA class I only, 2 had HLA class II only, 5 had HLA I or II and a specific human neutrophil antigen (HNA) antibody, and 1 had a specific HNA antibody only.

The researchers evaluated 7 cases in which donors had HLA class I or II antibodies and found that all 7 had signal-to-cutoff ratios greater than 100, far above any cutoff that has been discussed for screening donors.

“So we predict that a strategy to test female apheresis donors who report prior pregnancy and to defer those with HLA antibodies may reduce the risk of TRALI by about 60% and prevent cases from human neutrophil antibodies as well,” Dr Eder concluded.


Sickle cell trait linked to increased risk of CKD


One doctor examines a patient while another looks on

Credit: NCI

Sickle cell trait may increase the risk of chronic kidney disease (CKD) and poor kidney function, according to a study published in JAMA.

Researchers evaluated nearly 16,000 African Americans and found that subjects with sickle cell trait had a greater risk of CKD and incident CKD than subjects who did not have the trait.

Trait carriers were also more likely to have albuminuria and a decrease in estimated glomerular filtration rate (eGFR), both characteristics of poor kidney function.

This study was released to coincide with its presentation at the American Society of Nephrology’s Kidney Week Annual Meeting.

Rakhi P. Naik, MD, of Johns Hopkins University in Baltimore, and her colleagues conducted this research to investigate the relationship between sickle cell trait and kidney impairment.

The team looked at data from 5 large, population-based studies. They evaluated 15,975 self-identified African Americans—1248 of whom had sickle cell trait and 14,727 who did not.

The researchers assessed the incidence of CKD, which was defined as an eGFR of <60 mL/min/1.73 m² at baseline or follow-up, and incident CKD. They also assessed the rate of albuminuria, which was defined as a spot urine albumin:creatinine ratio of >30 mg/g or an albumin excretion rate of >30 mg/24 hours, and decline in eGFR, which was defined as a decrease of >3 mL/min/1.73 m² per year.

CKD and incident CKD were more common among sickle cell trait carriers than noncarriers. CKD was present in 19.2% (239/1247) of carriers and 13.5% (1994/14,722) of noncarriers. And incident CKD was present in 20.7% (140/675) of carriers and 13.7% (1158/8481) of noncarriers.

Sickle cell trait was associated with a faster decline in eGFR, as 22.6% (150/665) of carriers and 19.0% (1569/8249) of noncarriers met the definition of eGFR decline.

And the trait was associated with a higher incidence of albuminuria, as 31.8% (154/485) of carriers had albuminuria, compared to 19.6% (1168/5947) of noncarriers.

Overall, subjects with sickle cell trait had greater odds of CKD (odds ratio [OR], 1.57), incident CKD (OR, 1.79), decline in eGFR (OR, 1.32), and albuminuria (OR, 1.86).
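The published ORs were presumably adjusted for covariates, but a crude (unadjusted) odds ratio computed from the raw CKD counts reported above lands close to the reported value, which is a useful sanity check:

```python
# Crude (unadjusted) odds ratio for CKD from the raw counts reported above.
# This is only a sanity check, not a reproduction of the study's adjusted
# regression analysis.
carriers_ckd, carriers_total = 239, 1247
noncarriers_ckd, noncarriers_total = 1994, 14722

odds_carriers = carriers_ckd / (carriers_total - carriers_ckd)
odds_noncarriers = noncarriers_ckd / (noncarriers_total - noncarriers_ckd)
crude_or = odds_carriers / odds_noncarriers
print(round(crude_or, 2))  # 1.51, close to the reported OR of 1.57
```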

The researchers said the associations found in this study may offer an additional genetic explanation for the increased risk of CKD observed among African Americans compared with other racial groups.

They added that the study also highlights the need for further research into the renal complications of sickle cell trait. Because screening for the trait is widely performed, accurate characterization of disease associations with sickle cell trait is needed to inform policy and treatment recommendations.

Publications
Topics

One doctor examines a patient

while another looks on

Credit: NCI

Sickle cell trait may increase the risk of chronic kidney disease (CKD) and poor kidney function, according to a study published in JAMA.

Researchers evaluated nearly 16,000 African Americans and found that subjects with sickle cell trait had a greater risk of CKD and incident CKD than subjects who did not have the trait.

Trait carriers were also more likely to have albuminuria and a decrease in estimated glomerular filtration rate (eGFR), both characteristics of poor kidney function.

This study was released to coincide with its presentation at the American Society of Nephrology’s Kidney Week Annual Meeting.

Rakhi P. Naik, MD, of Johns Hopkins University in Baltimore, and her colleagues conducted this research to investigate the relationship between sickle cell trait and kidney impairment.

The team looked at data from 5 large, population-based studies. They evaluated 15,975 self-identified African Americans—1248 of whom had sickle cell trait and 14,727 who did not.

The researchers assessed the incidence of CKD, which was defined as an eGFR of <60 mL/min/1.73m2 at baseline or follow-up, and incident CKD. They also assessed the rate of albuminuria, which was defined as a spot urine albumin:creatinine ratio of >30mg/g or albumin excretion rate >30mg/24 hours, and decline in eGFR, which was defined as a decrease of >3 mL/min/1.73m2 per year.

CKD and incident CKD were more common among sickle cell trait carriers than noncarriers. CKD was present in 19.2% (239/1247) of carriers and 13.5% (1994/14,722) of noncarriers. And incident CKD was present in 20.7% (140/675) of carriers and 13.7% (1158/8481) of noncarriers.

Sickle cell trait was associated with a faster decline in eGFR, as 22.6% (150/665) of carriers and 19.0% (1569/8249) of noncarriers met the definition of eGFR decline.

And the trait was associated with a higher incidence of albuminuria, as 31.8% (154/485) of carriers had albuminuria, compared to 19.6% (1168/5947) of noncarriers.

So subjects with sickle cell trait had a greater risk of CKD (odds ratio [OR], 1.57), incident CKD (OR, 1.79), decline in eGFR (OR, 1.32), and albuminuria (OR, 1.86).

The researchers said the associations found in this study may offer an additional genetic explanation for the increased risk of CKD observed among African Americans compared with other racial groups.

They added that the study also highlights the need for further research into the renal complications of sickle cell trait. Because screening for the trait is widely performed, accurate characterization of disease associations with sickle cell trait is needed to inform policy and treatment recommendations.

One doctor examines a patient

while another looks on

Credit: NCI

Sickle cell trait may increase the risk of chronic kidney disease (CKD) and poor kidney function, according to a study published in JAMA.

Researchers evaluated nearly 16,000 African Americans and found that subjects with sickle cell trait had a greater risk of CKD and incident CKD than subjects who did not have the trait.

Trait carriers were also more likely to have albuminuria and a decrease in estimated glomerular filtration rate (eGFR), both characteristics of poor kidney function.

This study was released to coincide with its presentation at the American Society of Nephrology’s Kidney Week Annual Meeting.

Rakhi P. Naik, MD, of Johns Hopkins University in Baltimore, and her colleagues conducted this research to investigate the relationship between sickle cell trait and kidney impairment.

The team looked at data from 5 large, population-based studies. They evaluated 15,975 self-identified African Americans—1248 of whom had sickle cell trait and 14,727 who did not.

The researchers assessed CKD, defined as an eGFR of <60 mL/min/1.73 m2 at baseline or follow-up, as well as incident CKD. They also assessed albuminuria, defined as a spot urine albumin:creatinine ratio of >30 mg/g or an albumin excretion rate of >30 mg/24 hours, and decline in eGFR, defined as a decrease of >3 mL/min/1.73 m2 per year.
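The outcome definitions above amount to simple threshold checks. The sketch below restates the article's stated cutoffs only; the function names are illustrative and are not taken from the study.

```python
# Illustrative restatement of the study's outcome definitions.
# Thresholds are those quoted in the article; names are invented here.

def has_ckd(egfr):
    """CKD: eGFR < 60 mL/min/1.73 m2 at baseline or follow-up."""
    return egfr < 60

def has_albuminuria(acr_mg_per_g=None, aer_mg_per_24h=None):
    """Albuminuria: spot urine albumin:creatinine ratio > 30 mg/g,
    or albumin excretion rate > 30 mg/24 hours."""
    if acr_mg_per_g is not None and acr_mg_per_g > 30:
        return True
    return aer_mg_per_24h is not None and aer_mg_per_24h > 30

def egfr_decline(egfr_start, egfr_end, years):
    """Decline in eGFR: a decrease of > 3 mL/min/1.73 m2 per year."""
    return (egfr_start - egfr_end) / years > 3

print(has_ckd(55), has_albuminuria(acr_mg_per_g=45), egfr_decline(90, 70, 5))
# True True True
```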

CKD and incident CKD were more common among sickle cell trait carriers than noncarriers. CKD was present in 19.2% (239/1247) of carriers and 13.5% (1994/14,722) of noncarriers. And incident CKD was present in 20.7% (140/675) of carriers and 13.7% (1158/8481) of noncarriers.

Sickle cell trait was associated with a faster decline in eGFR, as 22.6% (150/665) of carriers and 19.0% (1569/8249) of noncarriers met the definition of eGFR decline.

And the trait was associated with a higher incidence of albuminuria, as 31.8% (154/485) of carriers had albuminuria, compared to 19.6% (1168/5947) of noncarriers.

So subjects with sickle cell trait had a greater risk of CKD (odds ratio [OR], 1.57), incident CKD (OR, 1.79), decline in eGFR (OR, 1.32), and albuminuria (OR, 1.86).
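The reported odds ratios come from the study's statistical models and may be adjusted for covariates, so they cannot be reproduced exactly from the raw counts; still, crude odds ratios computed from the counts quoted above land close to them. A quick check:

```python
# Crude (unadjusted) odds ratios recomputed from the raw counts in this
# article. The published ORs (1.57, 1.79, 1.32, 1.86) may reflect covariate
# adjustment, so the crude values below only approximate them.

def odds_ratio(cases_exp, n_exp, cases_unexp, n_unexp):
    odds_exp = cases_exp / (n_exp - cases_exp)
    odds_unexp = cases_unexp / (n_unexp - cases_unexp)
    return odds_exp / odds_unexp

outcomes = {
    "CKD":          (239, 1247, 1994, 14722),
    "incident CKD": (140, 675, 1158, 8481),
    "eGFR decline": (150, 665, 1569, 8249),
    "albuminuria":  (154, 485, 1168, 5947),
}
for name, counts in outcomes.items():
    print(f"{name}: crude OR = {odds_ratio(*counts):.2f}")
# CKD: crude OR = 1.51
# incident CKD: crude OR = 1.65
# eGFR decline: crude OR = 1.24
# albuminuria: crude OR = 1.90
```

The crude values track the published ones closely, suggesting that whatever adjustment was applied shifts the estimates only modestly.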

The researchers said the associations found in this study may offer an additional genetic explanation for the increased risk of CKD observed among African Americans compared with other racial groups.

They added that the study also highlights the need for further research into the renal complications of sickle cell trait. Because screening for the trait is widely performed, accurate characterization of disease associations with sickle cell trait is needed to inform policy and treatment recommendations.

Display Headline
Sickle cell trait linked to increased risk of CKD

Optimizing the Primary Care Management of Chronic Pain Through Telecare

Article Type
Changed
Display Headline
Optimizing the Primary Care Management of Chronic Pain Through Telecare

Study Overview

Objective. To evaluate the effectiveness of a collaborative telecare intervention on chronic pain management.

Design. Randomized clinical trial.

Settings and participants. Participants were recruited over a 2-year period from 5 primary care clinics within a single Veterans Affairs medical center. Patients aged 18 to 65 years were eligible if they had chronic (≥ 3 months) musculoskeletal pain of at least moderate intensity (Brief Pain Inventory [BPI] score ≥ 5). Patients were excluded if they had a pending disability claim; a diagnosis of bipolar disorder, schizophrenia, moderately severe cognitive impairment, active suicidal ideation, current illicit drug use, or a terminal illness; or if they received primary care outside of the VA. Participants were randomized to either the telephone-delivered collaborative care management intervention group or usual care. Usual care was defined as continuing to receive care from their primary care provider for management of chronic musculoskeletal pain.

Intervention. The telecare intervention comprised automated symptom monitoring (ASM) and optimized analgesic management through an algorithm-guided stepped care approach delivered by a nurse case manager. ASM was delivered either by an interactive voice-recorded telephone call (51%) or by internet (49%), set according to patient preference. Intervention calls occurred at 1 and 3 months. Additional contact with participants from the intervention group was generated in response to ASM trend reports.

Main outcome measures. The primary outcome was the BPI total score. The BPI scale ranges from 0 to 10, with higher scores indicating worsening pain. A 1-point change is considered clinically important. Secondary pain outcomes included BPI interference and severity, global pain improvement, treatment satisfaction, and use of opioids and other analgesics. Patients were interviewed at 1, 3, 6, and 12 months.

Main results. A total of 250 participants were enrolled, 124 assigned to the intervention group and 126 assigned to usual care. The mean (SD) baseline BPI scores were 5.31 (1.81) for the intervention group and 5.12 (1.80) for usual care. Compared with usual care, the intervention group had a 1.02-point lower BPI score at 12 months (95% confidence interval [CI], −1.58 to −0.47) (P < 0.001). Patients in the intervention group were nearly twice as likely to report at least a 30% improvement in their pain score by 12 months (51.7% vs. 27.1%; relative risk [RR], 1.9 [95% CI, 1.4 to 2.7]), with a number needed to treat of 4.1 (95% CI, 3.0 to 6.4) for a 30% improvement.
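The relative risk and number needed to treat quoted above follow directly from the two response rates. A back-of-envelope check, using only the percentages reported here rather than the study's own analysis:

```python
# Recomputing the headline effect sizes from the reported response rates:
# 51.7% of intervention patients vs 27.1% of usual-care patients achieved
# at least a 30% improvement in pain score by 12 months.

p_intervention = 0.517
p_usual = 0.271

rr = p_intervention / p_usual   # relative risk of responding
arr = p_intervention - p_usual  # absolute difference in response rates
nnt = 1 / arr                   # number needed to treat for one responder

print(f"RR  = {rr:.1f}")    # RR  = 1.9
print(f"NNT = {nnt:.1f}")   # NNT = 4.1
```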

Patients in the intervention group were more likely to rate the medication prescribed for their pain as good to excellent (73.9% vs 50.9%; RR, 1.5 [95% CI, 1.2 to 1.8]). Patients in the usual care group were more likely to experience worsening of pain by 6 months compared with the intervention group. A greater number of analgesics were prescribed to patients in the intervention group; however, opioid use did not differ between groups at baseline or at any point during the trial. Among the secondary outcomes, the intervention group reported significantly greater improvement in depression than the usual care group (P < 0.001). They also reported fewer days of disability, although this difference was not statistically significant (P = 0.34).

Conclusion. Telecare collaborative management was more effective in improving chronic pain outcomes than usual care. This was accomplished through the optimization of non-opioid analgesic therapy facilitated by a stepped care algorithm and automated symptom monitoring.

Commentary

Chronic pain affects up to 116 million American adults and is recognized as an emerging public health problem that costs the United States a half trillion dollars annually, with disability and hospitalization as the largest burdens [1]. The physical and psychological complexities of chronic pain require comprehensive, individualized care from interdisciplinary teams that facilitate prevention, treatment, and routine assessment in chronic pain sufferers [2]. However, enhancing pain management in primary care requires overcoming the high costs and considerable time needed to continually support patients in pain. Telecare represents a promising means by which doctors and nurses can provide primary care services to patients in need of comprehensive pain management. However, the effectiveness of telecare-delivered interventions for patients suffering from chronic pain is largely unknown.

This study had several strengths, including a distinct and well-defined intervention, population, comparator, and outcome. The inclusion criteria were broad enough to account for various age groups, and therefore various pain experiences, yet excluded patients with characteristics likely to confound pain outcomes, such as severe mental health disorders. Participants were randomized in blinded fashion to 1 of 2 clearly defined groups. The stepped algorithm used in the study, SCOPE [3], is a validated and reliable method for assessing chronic pain outcomes. The statistical analyses were appropriate and included analyses of variance to detect between-group differences for continuous variables. The rate of follow-up was excellent, with 95% of participants providing measurable outcome assessments at 12 months. The scientific background and rationale for this study were explicit and relevant to current advances in medicine.

The study is not without limitations, however. It is unclear whether the 2 trial groups were treated equally. Data received through ASM from the intervention group prompted physicians to adjust a patient’s medication regimen, essentially providing caregivers with updates on a patient’s status. This occurred in addition to the 4 interviews (at 1, 3, 6, and 12 months) that both groups received per protocol. The study did not elucidate exactly what care was provided to the usual care group and, therefore, does not allow for disaggregation of the relative effects of optimizing analgesics and continuous provider monitoring. It is difficult to distinguish whether the additional contact or the intervention itself was more effective in managing pain than usual care. Another limitation, noted by the authors, is the study’s use of a single VA medical center. Demographics reveal a skewed population, 83% male and 77% white, limiting the trial’s generalizability. Most clinical outcomes were considered, though cost-effectiveness of the intervention was not analyzed. As the VA is a cost-sensitive environment, it is important that interventions assessed are not more costly than usual care. Further cost analysis, beyond the health resource utilization reported in the study, would provide a more nuanced assessment of telecare’s feasibility as a replacement for usual primary care. Because the study shows statistically significant improvements in chronic pain among those who received the intervention via telecare, such a cost analysis is indeed warranted.

Applications for Clinical Practice

This study illuminates the need for a more intensive pain management program that allows for continuous monitoring. Though the intervention was successfully delivered via telecare, further research is needed to assess whether other programs would be as effective when delivered through telecare and, more importantly, to investigate what characteristics of interventions make telecare successful. Telecare has the potential to improve outcomes, reduce costs, and reduce strains on understaffed facilities, though it is still unknown which conditions would gain from this innovation. This study suggests that chronic pain, a predominantly self-managed condition, would benefit from a more accessible management program [4]. This, however, may not be the case for other health issues that require continual testing and equipment use, such as infectious diseases. Further studies should focus on populations that call for a patient-centered intervention delivered using a potentially low-cost tool, like the telephone or internet. Finally, a significant cost driver with chronic pain is disability, and though the change in disability days was not statistically significant in this trial, patients in the intervention group self-reported a decrease in disability days, whereas patients in the usual care group self-reported an increase. A clinical improvement in pain management has the potential to shave millions of dollars from the U.S. economy; this hypothesis deserves further investigation.

—Sara Tierce-Hazard, BA, and Tina Sadarangani, MSN, ANP-BC, GNP-BC

Issue
Journal of Clinical Outcomes Management - NOVEMBER 2014, VOL. 21, NO. 11

LISTEN NOW: Emergency Medicine and Hospitalist Collaboration

Article Type
Changed
Display Headline
LISTEN NOW: Emergency Medicine and Hospitalist Collaboration

This month's podcast features Dr. Ken Epstein, chief medical officer for Hospitalist Consultants, a division of ECI Healthcare Partners; and Dr. Ken Heinrich, regional director with ECI Healthcare Partners and chief medical officer for emergency services for the ECI Advisory Group, discussing their ongoing work helping emergency physicians and hospitalists form collaborative teams.

The focus for emergency physicians, says Dr. Heinrich, is triage and disposition. Differing incentives for hospitalists and emergency physicians can cause stress between the groups, and dialogue is needed to defuse the tension, he notes. Dr. Epstein says he thinks collaboration can be an effective tactic against becoming a “30-day readmission rule” statistic. Shared metrics, developed in partnership, can also improve patient care, he adds.

For more features, visit The Hospitalist's podcast archive.

Issue
The Hospitalist - 2014(11)

Enhanced thyroid cancer guidelines expected in 2015

Article Type
Changed
Display Headline
Enhanced thyroid cancer guidelines expected in 2015

CORONADO, CALIF. – Expect significant enhancements to the updated thyroid cancer management guidelines from the American Thyroid Association, due to be released in early 2015.

The guidelines were last updated in 2009, and the goal of the new version is to “be evidence based and helpful,” guidelines task force chair Dr. Bryan R. Haugen said at the annual meeting of the American Thyroid Association. For example, the new guidelines will contain 101 recommendations, up from 80 in the 2009 version; 175 subrecommendations, up from 103; and 998 references, up from 437. “Still, 59 of the existing 80 recommendations are not substantially changed, showing a general stability in our field over the past 5 to 6 years,” he said.

Dr. Bryan R. Haugen

One enhancement is a definition of risk of structural disease recurrence in patients without structurally identifiable disease after initial therapy for thyroid cancer. Low risk is defined as intrathyroidal differentiated thyroid cancer involving up to five metastases less than 0.2 cm in size. Intermediate risk is defined as the presence of aggressive histology, minor extrathyroidal extension, vascular invasion, or more than five involved lymph nodes with metastases 0.2-0.3 cm in size. High risk is defined as the presence of gross extrathyroidal extension, incomplete tumor resection, distant metastases, or lymph node metastases greater than 3 cm in size.
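Read as a decision rule, the three tiers nest by precedence: any high-risk feature places a patient in the high category regardless of other findings, and intermediate-risk features are checked only after that. The sketch below illustrates that precedence; the function name and feature labels are hypothetical paraphrases of the definitions above, offered as an illustration and not a clinical tool.

```python
# Hypothetical sketch of the tiered definitions described above.
# High-risk features take precedence over intermediate-risk ones.
HIGH_RISK = {
    "gross extrathyroidal extension",
    "incomplete tumor resection",
    "distant metastases",
    "lymph node metastasis > 3 cm",
}

INTERMEDIATE_RISK = {
    "aggressive histology",
    "minor extrathyroidal extension",
    "vascular invasion",
    "> 5 involved nodes with 0.2-0.3 cm metastases",
}

def recurrence_risk(features):
    """Return 'high', 'intermediate', or 'low' for a set of feature labels."""
    if features & HIGH_RISK:          # any high-risk feature dominates
        return "high"
    if features & INTERMEDIATE_RISK:  # otherwise check intermediate features
        return "intermediate"
    return "low"  # e.g., intrathyroidal disease with up to 5 tiny metastases
```

Under this reading, a patient with vascular invasion alone would be intermediate risk, while adding distant metastases would move the classification to high risk.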

The guidelines also include a table that defines a patient’s response to therapy as a dynamic risk assessment. “This best applies to the low- to intermediate-risk patients, although it definitely applies to high risk as well,” said Dr. Haugen, who heads the division of endocrinology, metabolism, and diabetes at the University of Colorado Health Sciences Center, Denver. “It’s [a] strong recommendation based on low-quality evidence to use this risk-based response to therapy. A lot of this data is generated from patients who’ve had a thyroidectomy and have received radioiodine. So we’re on a bit more shaky ground right now in a patient who’s had a thyroidectomy but no radioiodine, or a patient who’s had a lobectomy.”

Other changes include the concept that it’s not necessary to biopsy every nodule more than 1 cm in size. “We’re going to be guided by the sonographic pattern in who we biopsy and how we monitor them,” Dr. Haugen explained. “A new recommendation adds follow-up guidance for nodules that do not meet FNA [fine-needle aspiration] criteria. We’re also recommending use of the Bethesda Cytology Classification System for cytology.”

Changes in the initial management of thyroid cancer include a recommendation for cross-sectional imaging with contrast for higher-risk disease and the consideration of lobectomy for some patients with tumors 1-4 cm in size. “This is a controversial recommendation,” Dr. Haugen said. “We got some feedback from members asking if you do it, what’s the TSH target? Should we give them synthetic levothyroxine? We are revising the guidelines based on this feedback to help guide clinicians.”

The new guidelines also call for more detailed/standardized pathology reports, with inclusion of lymph node size, extranodal invasion, and the number of invaded vessels. “I’ve talked to a number of pathologists and clinicians who are very happy about this guidance,” he said. “We also need to look at tumor stage, recurrence risk, and response to therapy in our patients, and the use of selective radioiodine. There is some more information on considering lower administered activities, especially in the lower-risk patients.”

For the first time, the guidelines include a section on radioiodine treatment for refractory differentiated thyroid cancer, including tips on directed therapy, clinical trials, systemic therapy, and bone-specific therapy.

Dr. Haugen disclosed that he has received grants and research support from Veracyte and Genzyme.

[email protected]

On Twitter @dougbrunk

Legacy Keywords
Thyroid cancer, guidelines, Dr. Bryan Haugen
Article Source

EXPERT ANALYSIS FROM THE ATA ANNUAL MEETING


Infection may cause implant-associated ALCL

Article Type
Changed
Display Headline
Infection may cause implant-associated ALCL

Breast implant

Credit: FDA

Bacterial infection on the surface of textured breast implants may increase the risk of developing breast implant-associated anaplastic large-cell lymphoma (BIA-ALCL), according to research published in Plastic & Reconstructive Surgery.

Previous studies have shown that biofilm infection around breast implants is a major cause of capsular contracture, a hardening of the tissue around the implant that can cause pain and physical deformity.

Now, researchers have found that chronic infection around implants can also lead to an activation of the immune system and the patient’s lymphocytes. And long-term stimulation of lymphocytes by this infection may prompt the transformation of these cells into BIA-ALCL.

The infection was shown to be highest around textured breast implants, and this may provide an explanation as to why BIA-ALCL seems to be more common in patients with textured implants.

“Our previous research has shown that, 24 hours after bacteria come into contact with breast implants, textured implants had 72 times the number of bacteria attached to their surface as compared with the smooth implants,” said Anand Deva, MBBS, of Macquarie University in Sydney, Australia.

“This latest study has shown that the textured implants with the highest numbers of bacteria also had the highest number of activated lymphocytes around them. This finding is important and has now become even more relevant since the reporting of BIA-ALCL, as it provides us with a possible biological explanation of how this rare cancer could arise.”

To uncover these findings, Dr Deva and his colleagues first examined implants in pigs. The team inserted 12 textured and 12 smooth implants into submammary pockets in 3 adult pigs.

After a mean of 8.75 months, all of the samples were positive for bacterial biofilm. And there was a significant correlation between bacterial numbers and the grade of capsular contracture (P=0.04).

Lymphocyte numbers were significantly higher on textured implants (P<0.001), with T cells accounting for the majority of the lymphocytic infiltrate.

The researchers then examined implants in humans, collecting 57 capsules from patients with Baker grade 4 capsules over a 4-year period. The team analyzed biofilm and the surrounding lymphocytes.

As in the pigs, all of the capsules were positive for biofilm, and T cells were the predominant lymphocyte (P<0.001).

The researchers also discovered a significant linear correlation between the number of T and B cells and the number of detected bacteria (P<0.001). And there was a significantly higher number of bacteria for polyurethane implants (P<0.005).

These results suggest a possible link between bacterial biofilm and T-cell hyperplasia, a finding that may have implications for BIA-ALCL, the researchers said.

Dr Deva and his colleagues have published a 14-step guide to reduce the risk of breast implant infection, based on evidence of best practice to educate surgeons on how to reduce the risk of bacterial contamination.

A number of clinical studies have applied these principles and successfully reduced the rate of capsular contracture by a factor of 10 in their patients.

“This is a great validation of our research and a demonstration that good science in the laboratory can be translated into real benefits to patients at the bedside,” Dr Deva said.

“Now, with our greater understanding of the importance of preventing infection, we, as surgeons, can reduce the risk of capsular contracture and thereby reduce the risk of lymphocyte activation and possible transformation into BIA-ALCL.”



Product approved for hemophilia A in Canada, Australia

Article Type
Changed
Display Headline
Product approved for hemophilia A in Canada, Australia

Antihemophilic factor

Health Canada and Australia’s Therapeutic Goods Administration (TGA) have both approved a recombinant FVIII product known as simoctocog alfa (Nuwiq).

Health Canada has approved the product to treat and prevent bleeding in hemophilia A patients of all ages.

And the TGA has approved simoctocog alfa for the treatment and prevention of bleeding in previously treated pediatric (≥ 2 years) and adult patients with hemophilia A.

Simoctocog alfa is a recombinant FVIII product produced in a human cell line cultured without additives of human or animal origin or any exposure to human blood or plasma, making it inherently free from blood-borne pathogens.

Simoctocog alfa is also devoid of antigenic non-human protein epitopes, similar to FVIII produced in healthy humans. It has a high affinity for the von Willebrand coagulation factor.

“The way Nuwiq is produced is exciting, as it allows the molecule to closely resemble the naturally occurring FVIII,” said Anthony Chan, MBBS, Director of the Hemophilia Program at McMaster Children’s Hospital in Hamilton, Ontario.

“Health Canada’s approval of Nuwiq provides patients with hemophilia A a new recombinant product option that will allow further customization of hemophilia treatment on an individual basis.”

Researchers have evaluated the immunogenicity of simoctocog alfa in 135 previously treated patients with hemophilia A (74 adults and 61 children). And none of the patients developed inhibitors.

In the ongoing, phase 3 NuProtect study, researchers are investigating 100 previously untreated patients, a group typically characterized by a higher risk of inhibitor development. The researchers will assess whether the molecular properties of simoctocog alfa will result in lower inhibitor development.

Two additional phase 3 studies in previously treated patients are ongoing. The NuPreviq study and the Canadian Gena-21b study were designed to assess the efficacy and safety of individually tailored prophylaxis.

The goal of these studies is to provide optimal treatment for each patient based on his or her own pharmacokinetic properties, with a potential reduction in the frequency of FVIII infusions.

Simoctocog alfa was approved in the European Union earlier this year and is under review by regulatory authorities in the US. For more details on simoctocog alfa, see the prescribing information.



How ‘urban’ mosquitoes transmit malaria

Article Type
Changed
Display Headline
How ‘urban’ mosquitoes transmit malaria

Anopheles stephensi

Credit: CDC

Researchers believe that by analyzing the genome of a mosquito species, they have discovered how that mosquito evolves to withstand a variety of environmental conditions.

The results, published in Genome Biology, provide a better understanding of Anopheles stephensi, a common carrier of malaria in urban environments.

“Anopheles stephensi is emerging as a model mosquito species for genetic and molecular studies,” said Zhijian Jake Tu, PhD, of Virginia Tech in Blacksburg.

He and his colleagues believe their genome map of An stephensi will be an important tool for scientists to identify potential targets for mosquito control. In addition, studies of immunity genes can provide insight into mosquito biology and mosquito-parasite interactions.

“Genome mapping of Anopheles stephensi revealed genetic differences between it and a species especially dangerous for transmitting malaria in Africa, Anopheles gambiae,” said Igor Sharakhov, PhD, also of Virginia Tech.

“This tells us that the sex chromosome is especially prone to mutations that flip chromosomal segments, which, in turn, may promote new, evolved species.”

The researchers assembled more than 92% of the An stephensi genome, and physical mapping assigned 62% of the genome onto chromosomes.

When the team compared An stephensi and An gambiae, they discovered the rate of gene order reshuffling on the Anopheles X chromosome was 3 times higher than that on the autosomes.
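The comparison above amounts to a ratio of rearrangement rates. A minimal sketch of that calculation follows; the breakpoint counts and chromosome lengths are invented for illustration and are not figures from the study.

```python
# Hypothetical illustration of comparing gene-order reshuffling
# rates between the X chromosome and the autosomes.
# All numbers are made up for the example.
def rearrangement_rate(breakpoints: int, length_mb: float) -> float:
    """Rearrangement breakpoints fixed per megabase of sequence."""
    return breakpoints / length_mb

x_rate = rearrangement_rate(breakpoints=60, length_mb=20)       # 3.0 per Mb
auto_rate = rearrangement_rate(breakpoints=200, length_mb=200)  # 1.0 per Mb

print(x_rate / auto_rate)  # 3.0, i.e., a 3-fold higher rate on X
```

A per-megabase rate is used so that chromosomes of different sizes can be compared fairly.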

An stephensi had more repeat-rich heterochromatin in pericentric regions than An gambiae but fewer repetitive sequences in chromosome arms.

The researchers also identified Y-chromosome contigs and BACs, which represent the most abundant set of Y sequences in any Anopheles species.

Lastly, the team noted that RNA-sequencing and studies of immunity genes provided new insights into mosquito biology and mosquito-parasite interactions.

For instance, FKBP12, a protein that interacts with TOR and TGF-β signaling pathways, showed abundant mRNA expression in a wide range of tissues. This information could help improve our understanding of TOR and TGF-β signaling in mosquitoes.

TACO linked to amount and type of blood product transfused

Article Type
Changed
Display Headline
TACO linked to amount and type of blood product transfused

PHILADELPHIA—Results of a population-based study suggest that elderly adults in the US have seen an increase in the rate of transfusion-associated circulatory overload (TACO) in the last few years.

The risk of TACO increased with advancing age and with increases in the number of units transfused.

TACO rates also appeared to be related to the type of blood components transfused. Patients were more likely to develop TACO if they received red blood cells (RBCs) with plasma and/or platelets.

Mikhail Menis, PharmD, of the Center for Biologics Evaluation and Research at the US Food and Drug Administration in Rockville, Maryland, and his colleagues presented these findings in a poster (SP203) at the AABB Annual Meeting 2014.

The researchers conducted this retrospective, claims-based study to assess TACO occurrence and potential risk factors for the condition among elderly Medicare beneficiaries (aged 65 and older) who were transfused as inpatients from 2011 through 2013.

Among the 6,382,814 inpatient transfusion stays, 4405 included a record of TACO, yielding an overall TACO rate of 69.0 per 100,000 stays.
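The overall rate is a simple proportion; a minimal sketch of the arithmetic, using the counts reported in the poster:

```python
# Overall TACO rate per 100,000 inpatient transfusion stays,
# from the counts reported in the study.
taco_stays = 4405
total_stays = 6_382_814

rate_per_100k = taco_stays / total_stays * 100_000
print(f"{rate_per_100k:.1f}")  # 69.0
```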

TACO rates (per 100,000) increased significantly over time, from 63.0 in 2011 to 68.0 in 2012 and 77.1 in 2013 (P<0.001).

TACO rates also increased significantly with age—44.5 for patients age 65 to 69, 58.8 for patients age 70 to 74, 66.4 for patients age 75 to 79, 78.7 for patients age 80 to 84, and 91.6 for patients age 85 and older (P<0.001).

Women had a significantly higher rate of TACO than men—76.9 and 58.9, respectively (P<0.001), and whites had a significantly higher rate of TACO than non-whites—73.0 and 49.8, respectively (P<0.001).

In addition, the rate of TACO increased significantly with the number of units transfused. Rates were 30.9 for 1 unit, 63.3 for 2 to 4 units, 103.0 for 5 to 9 units, and 139.8 for more than 9 units (P<0.001).

And TACO rates differed according to the type of blood components transfused. The rate of TACO was 29.2 for patients who received only platelets, 60.8 for those who received only plasma, and 73.0 for those who received only RBCs.

The rates were 37.8 for patients who received platelets and plasma; 143.5 for those who received RBCs, plasma, and platelets; 167.9 for those who received RBCs and platelets; and 191.4 for those who received RBCs and plasma.

The researchers noted that this study had its limitations, including potential under-recording or misrecording of transfusion procedures and units, as well as a lack of clinical details to validate TACO diagnoses.

In addition, the rate comparisons were not adjusted for potential confounders, but the researchers are planning to perform adjusted analyses to confirm potential risk factors for TACO in the elderly.
