Antimalarial drug unavailable, CDC says

Pill production (Photo courtesy of the FDA)

The antimalarial drug chloroquine is not currently available from US suppliers, according to the Centers for Disease Control and Prevention (CDC).

The agency said it will provide updates as more information becomes available from the Food and Drug Administration.

Chloroquine is used for malaria treatment and prophylaxis, but hydroxychloroquine sulfate can be prescribed in its place when indicated.

Healthcare providers who need assistance diagnosing or managing suspected or confirmed cases of malaria can call the CDC Malaria Hotline at 1-855-856-4713 (Monday through Friday, 9 am to 5 pm, Eastern time).

For emergency consultation after hours, providers can call 1-770-488-7100 and ask to speak with a CDC Malaria Branch clinician.

ABA: Childhood burn survivors risk more physical, mental disorders

CHICAGO – Adult survivors of childhood burns have significantly higher rates of Axis I mental and physical disorders years after their injury, a population-based study shows.

“We think it is really important to screen for, identify, and treat these illnesses not only in that acute period and shortly after the burn injury, but well into adulthood,” study author James Stone said at the annual meeting of the American Burn Association.

Mr. James Stone (Patrice Wendling/Frontline Medical News)

He reported on 745 adult burn survivors identified using administrative data from a regional pediatric burn center registry in Manitoba, Canada, who were matched 1:5 with 3,725 controls from the general Manitoba population based on age, sex, and geographic location. The burn survivors had an average age of 5.9 years at the time of burn injury, burns involved an average 12% of total body surface area, and 65% of burn survivors were male. The average follow-up was nearly 15 years (range 2.8-24.7 years).

In unadjusted univariate analysis, adult survivors had significantly higher rates than matched controls for any lifetime physical disorder (rate ratio, 1.17), arthritis (RR, 1.23), cancer (RR, 1.94), diabetes (RR, 1.69), fractures (RR, 1.45), and total respiratory morbidity (RR, 1.15).

After adjustment for gender, geography, and income, any physical disorder (RR, 1.15; P < .01), arthritis (RR, 1.24; P < .01), fractures (RR, 1.37; P < .001), and total respiratory morbidity (RR, 1.13; P < .05) remained significant, reported Mr. Stone, of the University of Manitoba, Winnipeg.

Further, 81% of burn survivors had a lifetime physical illness compared with 69% of controls.

“The fact that 81% of our burn cohort was diagnosed with a physical illness is definitely concerning,” he said. “We hypothesize that the prolonged hyperinflammatory and hypermetabolic state that has been previously reported makes these individuals more susceptible to these illnesses down the road.”

The burn cohort also had significantly higher unadjusted rate ratios for any Axis I mental disorder (RR, 1.62), major depressive disorder (RR, 1.64), anxiety (RR, 1.57), substance abuse (RR, 2.86), and suicide attempts (RR, 5.00).

All disorders remained statistically significant after adjustment with rate ratios of 1.54 (P < .001), 1.54 (P < .001), 1.50 (P < .001), 2.35 (P < .001), and 4.33 (P < .01), respectively.

The high rates of substance abuse and suicide attempts are consistent with previous clinical interview studies, but are still cause for great concern, Mr. Stone said.

The risk for any mental or physical disorder was not significantly impacted by burn location or by burns that affected more than 30% of total body surface area. Age older than 5 years at the time of the burn significantly increased the risk of any mental disorder (relative risk, 1.92; P < .001).

Limitations of the study include the potential for bias because the data relied on individuals presenting to physicians, discrepancies between ICD codes for physician billings and hospital claims, and the possibility that some survivors moved out of the province, Mr. Stone said. The study, however, had a sample size three times greater than that of the next largest study of its kind and, importantly, matched burn patients to the general population.

Vitals

Key clinical point: Adult survivors of childhood burn injuries have increased rates of Axis I mental and physical disorders.

Major finding: 81% of burn survivors had a physical disorder vs. 69% of matched controls.

Data source: Population-based study in 745 adult survivors of childhood burns.

Disclosures: The study was funded by grants from the University of Manitoba and the Manitoba Firefighters Burn Fund. The authors declared no conflicts of interest.

HCV spike in four Appalachian states tied to drug abuse

Acute hepatitis C virus infections more than tripled among young people in Kentucky, Tennessee, Virginia, and West Virginia between 2006 and 2012, investigators reported online May 8 in Morbidity and Mortality Weekly Report.

“The increase in acute HCV infections in central Appalachia is highly correlated with the region’s epidemic of prescription opioid abuse and facilitated by an upsurge in the number of persons who inject drugs,” said Dr. Jon Zibbell of the Centers for Disease Control and Prevention and his associates.

Nationally, acute HCV infections have risen most steeply in states east of the Mississippi. To further explore the trend, the researchers examined HCV case data from the National Notifiable Disease Surveillance System, and data on 217,789 admissions to substance abuse treatment centers related to opioid or injection drug abuse (MMWR 2015;64:453-8).

Confirmed HCV cases among individuals aged 30 years and younger rose by 364% in the four Appalachian states during 2006-2012, the investigators found. “The increasing incidence among nonurban residents was at least double that of urban residents each year,” they said. Among patients with known risk factors for HCV infection, 73% reported injection drug use.

During the same time, treatment admissions for opioid dependency among individuals aged 12-29 years rose by 21% in the four states, and self-reported injection drug use rose by more than 12%, the researchers said. “Evidence-based strategies as well as integrated-service provision are urgently needed in drug treatment programs to ensure patients are tested for HCV, and persons found to be HCV infected are linked to care and receive appropriate treatment,” they concluded. “These efforts will require further collaboration among federal partners and state and local health departments to better address the syndemic of opioid abuse and HCV infection.”

The investigators declared no funding sources or financial conflicts of interest.

Vitals

Key clinical point: Hepatitis C virus infections more than tripled among young people in Kentucky, Tennessee, Virginia, and West Virginia, and were strongly tied to rises in opioid and injection drug abuse.

Major finding: From 2006 to 2012, the number of acute HCV infections increased by 364% among individuals aged 30 years or less.

Data source: Analysis of HCV case data from the National Notifiable Disease Surveillance System and of substance abuse admissions data from the Treatment Episode Data Set.

Disclosures: The investigators reported no funding sources or financial conflicts of interest.

Be true to yourself

How often have nonphysicians told you that they could never work the hours you do?

Most people think physicians are a unique breed, and in some respects, we are. But in important ways we are just like everyone else. When we work long hours under stressful conditions and go without adequate sleep or nourishment, we cannot function at peak performance. Just like everyone else, we can become irritable, grumpy, and cynical when our basic needs are not met. We are human too, and we are at higher risk than most people for burnout, depression, and even suicide.

Dr. A. Maria Hester

An article in the Journal of Hospital Medicine in 2014 noted that slightly over 50% of hospitalists were affected by burnout. We scored high on the emotional exhaustion subscale, and 40.3% of us had symptoms of depression, with a surprising 9.2% rate of recent suicidality. Hospital medicine definitely has its advantages over many other fields of medicine, but as this study demonstrates, there is still much to be desired in our “work-life balance.”

Each practice has its own perks and negatives, and what will enhance the lives of hospitalists in one group may make intolerable the lives of members of another group. For instance, it is no surprise that 12-hour shifts with 7-on, 7-off block scheduling can be exhausting. If you have a family, this schedule leaves plenty of fun time on the weeks you are off, but you may still be missing 50% of your family’s life if you leave for work before your kids wake up and return after they go to bed.

Whatever your concerns and stressors may be, rest assured, you are not alone, and if enough of the members of your group have similar issues, you may be successful addressing them with your director or hospital administrator. Retaining good hospitalists is vital to the financial success of many hospitals, and being flexible enough to truly meet their reasonable needs can literally make or break a hospitalist team.

Impotence drug could prevent malaria transmission

A gametocyte-infected red blood cell that has stiffened after treatment (© 2015 Ramdani et al.)

The erectile dysfunction drug sildenafil (Viagra) could prevent transmission of the malaria parasite, according to research published in PLOS Pathogens.

Investigators found that sildenafil increases the stiffness of erythrocytes infected by the parasite Plasmodium falciparum.

This allows the cells to be eliminated from the bloodstream and may therefore reduce transmission of the malaria parasite from humans to mosquitoes.

The investigators noted that P falciparum has a complex developmental cycle that is completed partly in humans and partly in mosquitoes. Treatments for malaria target the asexual forms of this parasite that cause symptoms, but not the sexual forms transmitted from a human to a mosquito.

Malaria eradication therefore necessitates new types of treatments against sexual forms of the parasite in order to block transmission and prevent dissemination of the disease.

The sexual forms of P falciparum develop in human erythrocytes sequestered in the bone marrow before they are released into the blood. They are then accessible to mosquitoes, which can ingest them when they bite.

Circulating erythrocytes are deformable, thus preventing their clearance via the spleen. And gametocyte-infected erythrocytes can easily pass through the spleen and persist for several days in the blood circulation.

With this in mind, Ghania Ramdani, of Université Paris Descartes in France, and colleagues sought to stiffen the infected erythrocytes so they would be removed from circulation.

The team found that the deformability of gametocyte-infected erythrocytes is regulated by a signaling pathway that involves cAMP. When cAMP molecules accumulate, an erythrocyte becomes stiffer. And cAMP is degraded by the enzyme phosphodiesterase, which promotes erythrocyte deformability.

Using an in vitro model reproducing filtration by the spleen, the investigators were able to identify several pharmacological agents that inhibit phosphodiesterases and can therefore increase the stiffness of infected erythrocytes.

One of these agents is sildenafil. The team showed that a standard dose of the drug had the potential to increase the stiffness of erythrocytes harboring sexual forms of the parasite and therefore favor the elimination of these infected cells from the circulation.

This discovery could lead to new ways to stop the spread of malaria, the investigators said. They believe that modifying the active substance in sildenafil to block its erectile effect, or testing similar agents devoid of this effect, could indeed result in a treatment to prevent transmission of the parasite from humans to mosquitoes.

Eat slowly to reduce consumed calories

I freely admit I am obsessed with research articles about eating habits. I hold out hope that this will eventually unlock the magic bullet to cure us of the modern plague of obesity. At a certain level, our patients need us to be captivated by such literature. We should feel fairly comfortable with the common knowledge that diets are effective if you stay on them and reducing the caloric density of foods can result in meaningful weight loss.

But what about how quickly we eat? In our fast-paced, heavily caffeinated society, we seem to shovel rather than chew. Ever since I was a medical resident, I have practically inhaled my food. Perchance I am operating under the erroneous and illogical assumption that if I don’t taste the food it won’t register as calories. True science has now enlightened me to the error in my thinking.

Dr. Jon O. Ebbert

Dr. Eric Robinson and his colleagues conducted a brilliant systematic review of the impact of eating rate on energy intake and hunger (Am. J. Clin. Nutr. 2014;100:123-51). They included studies with at least one arm in which participants ate a meal at a significantly slower rate than participants in another arm. Twenty-two studies met the criteria for inclusion.

Available evidence suggests that a slower eating rate is associated with lower intake, compared with faster eating. The effect on caloric intake was observed regardless of the intervention used to modify the eating rate, such as modifying food from soft (fast rate) to hard (slow rate) or verbal instruction. No relationship was observed between eating rate and hunger at the end of the meal or several hours later.

Intriguing to me is the hypothesis that eating rate likely affects intake through the duration and intensity of oral exposure to taste. Previous studies have shown that, when eating rate is held constant, increasing sensory exposure leads to a lower energy intake. This seems to relate to our innate wiring that gives us a “sensory specific satiety.” In my understanding, sensory specific satiety turns off appetitive drive when you have had too much chocolate or too many potato chips and you feel slightly ill. Unfortunately, the food industry is on to this game and has designed foods perfectly balanced so as not to trigger satiety. These foods can tragically be eaten ceaselessly.

Take-home message: If your patients cannot control the bad foods they eat, they should try to eat them more slowly.

Dr. Ebbert is professor of medicine, a general internist at the Mayo Clinic in Rochester, Minn., and a diplomate of the American Board of Addiction Medicine. The opinions expressed are those of the author and do not necessarily represent the views and opinions of the Mayo Clinic. The opinions expressed in this article should not be used to diagnose or treat any medical condition nor should they be used as a substitute for medical advice from a qualified, board-certified practicing clinician.

Inhibitor may benefit certain ALL patients

PHILADELPHIA—Results of preclinical research suggest the BCL-2 inhibitor ABT-199 (venetoclax) may be effective in certain pediatric patients with acute lymphoblastic leukemia (ALL).

In xenograft models of various ALL subtypes, ABT-199 produced an objective response rate below 30%.

However, additional analyses unearthed information that could potentially help identify which ALL patients might respond to the drug.

Santi Suryani, PhD, of the Children’s Cancer Institute in Sydney, New South Wales, Australia, and her colleagues presented this research at the AACR Annual Meeting 2015 (abstract 3276*). The work was supported by AbbVie, one of the companies developing ABT-199.

Dr Suryani and her colleagues decided to investigate ABT-199 in pediatric ALL after observing mixed results with the BCL-2/BCL-W/BCL-XL inhibitor ABT-263 (navitoclax).

ABT-263 delayed ALL progression in nearly all of the xenograft models the team tested and produced a 61% response rate. However, the drug also induced BCL-XL-mediated thrombocytopenia.

As ABT-199 doesn’t target BCL-XL, the researchers thought the drug might produce responses similar to those of ABT-263 without inducing thrombocytopenia.

“When ABT-199 came into the picture, we were very excited,” Dr Suryani said. “We thought, ‘This is a wonder drug. This will cure pediatric ALL.’”

To test this hypothesis, the team compared ABT-199 (100 mg/kg x 21 days) and vehicle control in 19 pediatric ALL patient-derived xenografts, including infant mixed-lineage leukemia (MLL) ALL (n=4), B-cell precursor (BCP) ALL (n=5), BCP-ALL categorized as Ph-like (n=4), T-cell ALL (n=4), and early T-cell precursor (ETP) ALL (n=2).

ABT-199 significantly delayed progression in 12 xenografts (63%) for periods ranging from 0.4 days to 28 days. And the drug produced objective responses in 5 xenografts (26%).

Responses occurred in MLL-ALL, BCP-ALL, and Ph-like BCP ALL, but not T-cell ALL or ETP-ALL. Complete responses were seen in MLL-ALL (n=1) and BCP-ALL (n=2), and partial responses occurred in MLL-ALL (n=1) and Ph-like BCP-ALL (n=1).

As the response rate with ABT-263 was more than double that of ABT-199 (61% vs 26%), the researchers found the results with ABT-199 “a little bit disappointing,” according to Dr Suryani.

“But we thought, ‘That’s okay. That already tells us the science behind it—that pediatric ALL is probably more BCL-XL-dependent, rather than BCL-2-dependent,’” she said. “We wondered if there was any way we could come up with a predictive biomarker so we could select patients who will benefit from this treatment.”

With that in mind, the researchers evaluated the link between protein expression and response. They looked at BCL-2 and BCL-XL, as well as a range of other proteins, including BCL-W, MCL1, BAK1, and BAX, among others.

And they found that high BCL-XL and low BCL-2 expression were significantly associated with ABT-199 resistance.

The researchers are still investigating ways to guide treatment with ABT-199 in ALL. They are also hoping to improve responses by administering the drug in combination with other agents.

*Information in the abstract differs from that presented at the meeting.

AAN: Scheduled daily DBS effective in small Tourette syndrome study

WASHINGTON – Scheduled administration of bilateral deep brain stimulation of the centromedian thalamus for less than 2 hours a day resulted in a significant reduction in tics in several patients with Tourette syndrome over 2 years in a proof-of-concept study presented at the annual meeting of the American Academy of Neurology.

Of the four patients who completed the 24-month study, three experienced significant improvements, said Justin Rossi, an MD-PhD candidate at the University of Florida in Gainesville.

Instead of using the standard continuous deep brain stimulation (DBS), Mr. Rossi and colleagues at the university's Center for Movement Disorders and Neurorestoration evaluated a scheduled, personalized stimulation approach, with stimulation of the centromedian thalamus (bilaterally) tailored to the times of the day when patients experienced the most sequelae from the tics, such as when they were driving, exercising, or working, and when the intensity of the tics was the greatest.

The rationale for investigating this approach is that a tailored schedule might be as effective as the “classical continuous approach” to DBS while extending battery life (delaying the surgical procedure needed to replace the battery) and reducing stimulation-related side effects, Mr. Rossi said.

Many studies have found that DBS is effective in “select medication-refractory cases of Tourette syndrome,” he noted. “However, in contrast to Parkinson’s disease, essential tremor, and other movement disorders for which DBS has been commonly used as a therapy, Tourette syndrome is a paroxysmal disorder,” and the frequency of tics can vary from patient to patient, with individual patients reporting that the intensity of tics “waxes and wanes throughout the day, often predictably.”

The study enrolled five patients; responses were evaluated with two rating scales, the Yale Global Tic Severity Scale (YGTSS) and the Modified Rush Video-Based Tic Rating Scale (MRTRS). The primary outcome was response at 24 months, with a responder defined as a patient whose YGTSS or MRTRS score improved by more than 40% from the preoperative baseline. (One patient was lost to follow-up after 18 months because the center was too far away.) Patients had the opportunity to modify the stimulation schedule at each 6-month visit.
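
As a minimal illustration of that responder criterion, the sketch below computes percent improvement from baseline and applies the more-than-40% rule to either scale; the example scores are invented, and only the threshold and the either-scale logic come from the study description.

# Minimal sketch of the responder rule: >40% improvement from the preoperative
# baseline on either the YGTSS or the MRTRS at 24 months. Example scores are invented.
def percent_improvement(baseline, follow_up):
    return 100.0 * (baseline - follow_up) / baseline

def is_responder(ygtss_baseline, ygtss_24mo, mrtrs_baseline, mrtrs_24mo, threshold=40.0):
    return (percent_improvement(ygtss_baseline, ygtss_24mo) > threshold
            or percent_improvement(mrtrs_baseline, mrtrs_24mo) > threshold)

# A 58% YGTSS improvement alone satisfies the criterion, even if the MRTRS gain is smaller.
print(is_responder(ygtss_baseline=80, ygtss_24mo=33.6, mrtrs_baseline=20, mrtrs_24mo=15))  # True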

At 24 months, the YGTSS total scores improved by 46%, 58%, and 17% and the MRTRS total scores improved by 79%, 81%, and 44% in the three responders. These patients had a mean stimulation time of 1.85 hours a day, ranging from 47 to 186 minutes per day. The one patient who did not meet the primary endpoint – with a 10% response on the YGTSS and a 21% response in the MRTRS – had the greatest amount of stimulation per day (4 hours a day). At 24 months, the responders had statistically significant improvements from baseline in components of the two scales, including the number of phonic tics per minute, motor tic severity, and phonic tic severity, Mr. Rossi said.

This is a proof-of-concept study and the results and conclusions are preliminary, but the results “warrant larger studies,” he concluded.

More research is needed to understand the underlying mechanism at a physiological level, work that is being pursued at his center, he added. The results shed some light on whether DBS in Tourette syndrome works through a cumulative effect of stimulation over time or through an effect around the time the tics occur, and they support the latter explanation, Mr. Rossi speculated.

He had no disclosures. The study was sponsored by the National Institutes of Health.

Vitals

Key clinical point: Promising results of a tailored approach to deep brain stimulation in three patients with Tourette syndrome merit a larger trial.

Major finding: In three of the four patients who completed the study, DBS of the centromedian thalamus for less than 2 hours a day resulted in significant improvements over 24 months.

Data source: A proof-of-concept study in five patients with Tourette syndrome, evaluating DBS of the centromedian thalamus, scheduled for times when tics interfered with activities or were most intense.

Disclosures: The National Institutes of Health sponsored the study. Mr. Rossi had no disclosures.

Obesity increases risk of bleeding on warfarin

Article Type
Changed
Fri, 01/18/2019 - 14:47
Display Headline
Obesity increases risk of bleeding on warfarin

Obese patients on warfarin may be at greater risk of bleeding than those of normal weight, according to a study presented at the American Heart Association’s Arteriosclerosis, Thrombosis, and Vascular Biology/Peripheral Vascular Disease Scientific Sessions 2015.

Researchers followed 863 patients attending an anticoagulation clinic for 1 year and found that obesity (body mass index greater than 30 kg/m2) was associated with a statistically significant 84% increase in the risk of major bleeds, such as gastrointestinal, intracerebral, and retroperitoneal hemorrhage.

The study also showed that bleeding risk rose with the severity of obesity: the risk increase was 30% for patients with class I obesity but 93% for patients with class III obesity.
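
For context, BMI is weight in kilograms divided by the square of height in meters, and the obesity classes cited here are conventionally defined as class I (30 to <35 kg/m2), class II (35 to <40 kg/m2), and class III (40 kg/m2 or higher). The sketch below applies those standard cutoffs, on the assumption that the study used the conventional definitions.

# BMI and conventional obesity classes (class I: 30-<35, class II: 35-<40,
# class III: >=40 kg/m2); assumes the study used these standard cutoffs.
def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

def obesity_class(bmi_value):
    if bmi_value >= 40:
        return "class III"
    if bmi_value >= 35:
        return "class II"
    if bmi_value >= 30:
        return "class I"
    return "not obese"

print(round(bmi(120, 1.75), 1), obesity_class(bmi(120, 1.75)))  # 39.2 class II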

“This result suggests that BMI plays a role in bleeding events in patients on warfarin [and] future studies are needed to understand the mechanism by which obesity increases bleeding risk for patients on warfarin, and whether similar risk exists for the novel oral anticoagulants,” said Dr. Adedotun A. Ogunsua of the University of Massachusetts, Worcester, and coauthors.

There were no conflicts of interest disclosed.

Vitals

Key clinical point: Obesity is associated with an increased risk of major bleeding in patients taking warfarin.

Major finding: Obese patients on warfarin had an 84% higher incidence of major bleeding.

Data source: Observational study of 863 patients attending an anticoagulation clinic.

Disclosures: No conflicts of interest were disclosed.

Trauma center verification

Article Type
Changed
Wed, 01/02/2019 - 09:13
Display Headline
Trauma center verification

Despite the many changes in medicine over the past century, traumatic injury remains a surgical disease.

Traumatic injury is a major public health concern in rural areas, where death rates from unintentional injuries are higher than in metropolitan areas (Am. J. Public Health 2004;10:1689-93). The rural surgeon sees more than his or her fair share of victims of automobile accidents, falls, unintentional firearm injuries, and occupational accidents (think tractor accidents and injuries involving machinery and animals).

Dr. Philip R. Caropreso

Another reality of rural areas of the United States is that the number of broadly trained general surgeons who can treat a wide variety of traumatic injuries is shrinking. The aging and retirement of the “old school rural surgeons” are accelerating, precipitating a crisis in surgical coverage, including trauma coverage, in rural areas (Arch. Surg. 2005;140:74-9).

These well-documented developments have combined to reduce the availability of rural surgeons to manage injured patients in planned and consistent ways. Because of the current training paradigm of increasing subspecialization, injured rural patients may be cared for at rural hospitals with reduced capabilities and by rural surgeons with limited trauma training and experience.

What is the action plan to help counteract these developments and to provide the highest-quality patient care at facilities staffed by surgeons who have sworn to “serve all with skill and fidelity”?

The most straightforward and well-established action plan to achieve those goals is the verification process developed by the American College of Surgeons (ACS) Verification, Review, and Consultation (VRC) Program in 1987 to help hospitals improve trauma care. The process involves a pre-review questionnaire, a site visit, and a report of findings. Verification as a trauma center guarantees that the facility has the required resources listed in the current, evidence-based guide, Resources for Optimal Care of the Injured Patient (2014). If successful, the trauma center receives a certificate of verification that is valid for 3 years.

Most rural hospitals are designated as Level III and IV verified trauma centers on the basis of their available resources. ACS verification confirms that these centers have the commitments and capabilities to manage the initial care of injured patients by providing stabilization and instituting life-saving maneuvers. In addition, verification confirms that protocols and agreements with higher-level trauma centers within a system enable the safe and efficient transfer of injured patients.

During many years of practice in rural hospitals verified as trauma centers, including serving as medical director of a Level II and a Level III facility, I provided care to injured patients who presented to the emergency departments (EDs). My experience confirmed the unequivocal value of practicing in those facilities, and I can attest to the benefits of verification within a system such as Iowa’s state program.

The following case report validates such assertions. A helicopter, unable to complete the transfer of a deteriorating patient with a left chest gunshot wound to a Level I center, landed at my Level III hospital. There was a “Hot Off Load,” followed by a full trauma alert for the patient, who was in profound shock. After a chest tube was placed during a 20-minute ED stay, the patient was taken to the OR for further resuscitation and stabilization, along with the required operative treatment. With the patient stabilized and fully resuscitated, and in accordance with established agreements, I contacted the Level I center from the OR. Three hours later, the patient was returned to the helicopter, and the transfer to the Level I trauma center was completed. The patient survived because of the local trauma team’s commitment, organization, and skill, brought about by trauma center verification.

Most research to date has focused on higher-level trauma centers, but recent studies have shown that ACS verification was an independent predictor of survival of trauma patients at Level II centers (J. Trauma Acute Care Surg. 2013;75:44-9; J. Trauma Acute Care Surg. 2010;69:1362-6).

I have firsthand experience with the verification process. Following my involvement with the ACS Committee on Trauma, I became a national site surveyor for the ACS VRC. I became an Advanced Trauma Life Support (ATLS) instructor and then worked as a course director. ATLS is an essential component for trauma center verification. It supports the rural surgeon by giving the local trauma team a format for consistent, life-saving care for the most severely injured patients. I subsequently completed the ACS Advanced Trauma Operative Management course and elected to become an instructor.

I have made site visits to many rural hospitals as part of the ACS VRC process and have met with a wide range of reactions, from “Let’s show off how good we are” to “We really don’t know why we’re doing this” to “Just give us the merit badge and then get out of our hair.” I am gratified to note that ACS Fellows are uniformly supportive. They understand the need for organization, standards, and performance improvement.

Opposition to the ACS VRC process by hospitals and staff is no doubt rooted in cost concerns and general resistance to change. But, as most of us know, demonstrated benefits for patient care can be highly persuasive to most medical professionals.

It is also worth noting that, in an effort to make the process less stressful, the ACS VRC takes significant steps to support facilities that seek verification: eliminating ambiguity from application to on-site visit, defining criteria deficiencies, and providing evidence for the entire verification process. The complete VRC program, along with an FAQ, is available on the ACS website (facs.org/quality-programs/trauma/vrc).

For me, trauma care has always been about what is best for the injured patient. I often ask colleagues this question: “What care do you want for an injured member of your family?” I then answer my own question: “I want the best care possible. That means organized, efficient, and life-saving [care] if needed.” Fortunately, I experienced these benefits at my verified trauma center hospital when my second son was in a rollover motor vehicle crash. He survived.

Verified rural trauma centers do indeed offer the best opportunities for high-quality patient care and for support of the rural surgeons who render that care to “serve all with skill and fidelity.” I know. I have been there.

Dr. Caropreso is a general surgeon at Keokuk (Iowa) Area Hospital and clinical professor of surgery at the University of Iowa Carver College of Medicine, Iowa City. He has practiced surgery in the rural communities of Mason City, Iowa; Keokuk, Iowa; and Carthage, Ill., for 37 years.
