Is It Ringworm, Herpes— Or Something Else Entirely?


ANSWER

The correct answer is impetigo (choice “c”), a superficial infection usually caused by a combination of staph and strep organisms.

Psoriasis (choice “a”) would have presented with white, tenacious scaling and would not have been acute in onset.

Eczema (choice “b”) is definitely possible, but the patient’s rash has features not seen with this condition; see Discussion for details.

Fungal infection (choice “d”) is also definitely in the differential, but it is unlikely given the negative KOH, the lack of any source for such infection, and the complete lack of response to tolnaftate cream.

DISCUSSION

Impetigo has also been called impetiginized dermatitis because it almost always starts with minor breaks in the skin as a result of conditions such as eczema, acne, contact dermatitis, or insect bite. Thus provided with access to deeper portions of the epithelial surface, bacterial organisms that normally cause no problems on intact skin are able to create a minor but annoying condition we have come to call impetigo.

Mistakenly called infantigo in large parts of the United States, impetigo is quite common but nonetheless alarming. Rarely associated with morbidity, it tends to resolve in two to three weeks at most, even without treatment.

Impetigo has the reputation of being highly contagious; given enough heat and humidity, close living conditions, and lack of regular bathing and/or adequate treatment, it can spread rapidly. Those conditions existed commonly 100 years ago, when bathing was sporadic and often cursory, and multiple family members lived and slept in close quarters. In those days before the introduction of antibiotics, there were no good topical antimicrobial agents, either.

Another factor played a major role in impetigo, bolstering its fearsome reputation. The strains of strep (group A β-hemolytic strep) that caused most impetigo in those days included several so-called nephritogenic strains that could lead to a dreaded complication: acute poststreptococcal glomerulonephritis (APSGN). Also called Bright disease, it could and did lead to fatal renal failure—about which little could be done at the time.

Fortunately, such nephritogenic strains of strep are unusual now, with APSGN occurring at a rate of about 1:1,000,000 in developed countries. In those locations, most people live far different lives today, bathing and changing clothes daily and living in much less cramped quarters.

The patient’s atopy likely played a role, for several reasons. Staph colonization is quite common in atopic persons, making infection more likely to develop. In addition, thinner skin that is easily broken, a host of complicating problems (eg, dry skin, eczema, contact dermatitis, and exaggerated reactions to insect bites), and a lower threshold for itching all make atopic persons more susceptible to infection.

Most likely, our patient had a touch of eczema or dry skin and scratched it. Then, as the condition progressed, she scratched it more. The peroxide she used would have been highly irritating, serving only to worsen matters.

From a diagnostic point of view, the honey-colored crust covering the lesion and the context in which it developed led to a provisional diagnosis of impetiginized dermatitis. She was treated with oral cephalexin (500 mg tid for 7 d), topical mupirocin (applied bid), and topical hydrocortisone cream 2.5% (daily application). At one week’s follow-up, the patient’s skin was almost totally clear. It’s very unlikely she’ll have any residual scarring or blemish.

Had the diagnosis been unclear, or had the patient not responded to treatment, other diagnoses would have been considered. Among them: discoid lupus, psoriasis, contact dermatitis, and Darier disease.

Author and Disclosure Information

Joe R. Monroe, MPAS, PA, practices at Dawkins Dermatology Clinic in Oklahoma City. He is also the founder of the Society of Dermatology Physician Assistants.

Issue
Clinician Reviews - 24(11)
Page Number
8-9
CASE PRESENTATION

A 16-year-old girl is referred to dermatology by her pediatrician for evaluation of a rash on her face. She is currently taking acyclovir (dose unknown) as prescribed by her pediatrician for presumed herpetic infection. Previous treatment attempts with OTC tolnaftate cream and various OTC moisturizers have failed.

The rash manifested several weeks ago with two scaly bumps on her left cheek and temple area, which the patient admits to “picking” at. Initially, the lesions itched a bit, but they became larger and more symptomatic after she applied hydrogen peroxide to them several times. She then began to scrub the lesions vigorously with antibacterial soap while continuing to apply the peroxide. Subsequently, she presented to an urgent care clinic, where she was diagnosed with “ringworm” (and advised to use tolnaftate cream), and then to her pediatrician, with the aforementioned result. Aside from seasonal allergies and periodic episodes of eczema, the patient’s health is excellent. She has no pets.

Examination reveals large, annular, honey-colored crusts focally located on the left side of the patient’s face. Faint pinkness is noted peripherally around the lesions. Modest but palpable adenopathy is detected in the pretragal and submental nodal areas. Though symptomatic, the patient is in no distress. A KOH prep taken from the scaly periphery is negative for fungal elements.

Turning for Ulcer Reduction: A Multi-Site Randomized Clinical Trial in Nursing Homes


Clinical question: Is there a difference between repositioning intervals of two, three, or four hours in pressure ulcer formation in nursing home residents on high-density foam mattresses?

Background: Pressure ulcer formation in nursing home residents is a common problem. Current standard of care requires repositioning every two hours in patients who are at risk for pressure ulcer formation. Few studies have been performed to assess a difference in repositioning interval. This study was conducted to see if there is a difference in pressure ulcer formation among residents on high-density foam mattresses at moderate to high risk (according to the Braden scale).

Study design: Multi-site, randomized clinical trial.

Setting: Twenty U.S. and seven Canadian nursing homes using high-density foam mattresses.

Synopsis: A multi-site, randomized clinical trial was executed in 20 U.S. and seven Canadian nursing homes. More than 900 residents were randomized to two-, three-, or four-hour intervals for repositioning. All participants were at either moderate (13-14) or high (10-12) risk on the Braden scale for pressure ulcer formation. All facilities used high-density foam mattresses. All participants were monitored for pressure ulcer formation on the sacrum/coccyx, heel, or trochanter for three consecutive weeks.

There was no significant difference in pressure ulcer formation between the two-, three-, or four-hour interval repositioning groups. There was no significant difference in pressure ulcer formation between the moderate or high-risk groups. Only 2% of participants developed a pressure ulcer, all stage I or II.

It is not clear if the outcomes were purely related to the repositioning intervals, as this study group had a much lower rate of pressure ulcer formation compared to national averages and previous studies. The high-density foam mattress might have improved outcomes by evenly redistributing pressure so that less frequent repositioning was required. The level of documentation may have led to earlier recognition of early stage pressure ulcers as well. This study also was limited to nursing home residents at moderate to high risk of pressure ulcer development.

Bottom line: There is no significant difference in pressure ulcer formation between repositioning intervals of two, three, or four hours among moderate and high-risk nursing home residents using high-density foam mattresses.

Citation: Bergstrom N, Horn SD, Rapp MP, Stern A, Barrett R, Watkiss M. Turning for ulcer reduction: a multisite randomized clinical trial in nursing homes. J Am Geriatr Soc. 2013;61(10):1705-1713.

Issue
The Hospitalist - 2014(10)

Antibiotic Resistance Threats in the United States, 2013


Clinical question: What antibiotic-resistant bacteria are the greatest threats for the next 10 years?

Background: Each year in the United States, 2 million people develop antibiotic-resistant infections and 23,000 die as a result. Most of these infections occur in the community, but deaths usually occur in healthcare settings. Cost estimates vary but may be as high as $20 billion in excess direct healthcare costs.

Study design: The CDC used several different surveys and databanks, including the National Antimicrobial Resistance Monitoring System, to collect data. The threat level for antibiotic-resistant bacteria was determined using several factors: clinical impact, economic impact, incidence, 10-year projection of incidence, transmissibility, availability of effective antibiotics, and barriers to prevention.

Setting: United States.

Synopsis: The CDC has three classifications of antibiotic-resistant bacteria: urgent, serious, and concerning. Urgent threats are high-consequence, antibiotic-resistant threats because of significant risks identified across several criteria. These threats might not currently be widespread but have the potential to become so and require urgent public health attention to identify infections and to limit transmission. They include carbapenem-resistant Enterobacteriaceae, drug-resistant Neisseria gonorrhoeae, and Clostridium difficile (which is not truly antibiotic resistant but is included because its spread is a consequence of antibiotic overuse).

Serious threats are significant antibiotic-resistant threats. These threats will worsen and might become urgent without ongoing public health monitoring and prevention activities. They include multidrug-resistant Acinetobacter, drug-resistant Campylobacter, fluconazole-resistant Candida (a fungus), extended-spectrum β-lactamase-producing Enterobacteriaceae, vancomycin-resistant Enterococcus, multidrug-resistant Pseudomonas aeruginosa, drug-resistant non-typhoidal Salmonella, drug-resistant Salmonella serotype Typhi, drug-resistant Shigella, methicillin-resistant Staphylococcus aureus, drug-resistant Streptococcus pneumoniae, and drug-resistant tuberculosis.

Concerning threats are bacteria for which the threat of antibiotic resistance is low, and/or there are multiple therapeutic options for resistant infections. These bacterial pathogens cause severe illness. Threats in this category require monitoring and, in some cases, rapid incident or outbreak response. These include vancomycin-resistant Staphylococcus aureus, erythromycin-resistant Group A Streptococcus, and clindamycin-resistant Group B Streptococcus. Research has shown that patients with resistant infections have significantly longer hospital stays, delayed recuperation, long-term disability, and higher mortality. As resistance to current antibiotics occurs, providers are forced to use antibiotics that are more toxic, more expensive, and less effective.

The CDC recommends four core actions to fight antibiotic resistance:

  • Preventing infections from occurring and preventing resistant bacteria from spreading (immunization, infection control, screening, treatment, and education);
  • Tracking resistant bacteria;
  • Improving the use of antibiotics (antibiotic stewardship); and
  • Promoting the development of new antibiotics and new diagnostic tests for resistant bacteria.

Bottom line: Antibiotics are a limited resource. The more antibiotics are used today, the less likely they will continue to be effective in the future. The CDC lists 18 antibiotic-resistant organisms as urgent, serious, or concerning and recommends actions to combat the spread of current organisms and the emergence of new antibiotic-resistant organisms.

Citation: Centers for Disease Control and Prevention. Antibiotic resistance threats in the United States, 2013. CDC website. September 16, 2013. Available at: www.cdc.gov/drugresistance/threat-report-2013. Accessed Nov. 30, 2013.

Issue
The Hospitalist - 2014(10)

Lower Extremity Injuries in Snowboarders


Epidemiology

Studies of lower extremity injuries sustained while skiing and snowboarding have differed markedly with respect to patient demographics. Kim and colleagues1 compared snowboarding and skiing injuries over 18 seasons at a Vermont ski resort and found that the injury rate, assessed as mean number of days between injuries, was 400 for snowboarders and 345 for skiers. However, most snowboarding injuries were wrist injuries and generally of the upper extremity, whereas skiing injuries were mainly lower extremity injuries. Overall, young and inexperienced snowboarders had the highest injury rate. In a study of skiing and snowboarding injuries through 4 Utah seasons, Wasden and colleagues2 found that mean age at injury was 41 years for skiers and 23 years for snowboarders. This corroborates the finding from several studies1-3 that snowboarders tend to be younger, as snowboarding is a newer sport with many beginners. However, Ishimaru and colleagues4 found that lower extremity injuries may be associated with experienced snowboarders, who may be prone to take more risks and tackle more challenging slopes. Experienced snowboarders are also likely to sustain lower extremity injuries from falling because of their risk-taking behavior.5

Although upper extremity injuries account for most snowboarding injuries, lower extremity injuries are a significant issue.6 Modern equipment and more challenging slopes have allowed snowboarders to attain great speeds going down slopes—leading to a surge in lower extremity injuries.7 Lower extremity injuries sustained during snowboarding are more likely to be on the leading side4; the ankle is the most frequent fracture site. Unlike snowboard equipment, modern ski equipment, including new boots and binding systems, is designed to reduce ankle injuries and lower leg fractures.6 The decline in foot, ankle, and tibia fractures can be attributed to taller and stiffer boots, which offer the lower extremities more protection.8

Mechanism of Injury

Talus Fractures

An increasingly common injury among snowboarders is a fracture of the lateral process of the talus; this injury accounts for 32% of snowboarders’ ankle fractures.6 The lateral process of the talus—wedge-shaped and covered in articular cartilage—is involved in the subtalar and ankle joints.9 A fracture here is often misdiagnosed as an ankle sprain (Figures 1–3).6,9,10 The exact mechanism of injury remains controversial, and several biomechanical factors seem to be involved. Funk and colleagues11 conducted a cadaveric study and concluded that eversion of an axially loaded, dorsiflexed ankle may be the primary injury mechanism for fracture. Furthermore, snowboarders have their feet in a position perpendicular to the board, and a fall parallel to the board could increase the eversion force on the ankle of the leading leg. Valderrabano and colleagues9 conducted a clinical study of 26 patients who sustained this injury from snowboarding. All the patients reported they had felt an axial impact from falling, jumping, or unexpectedly hitting a ground object, and 80% reported a rotational movement in the lower leg during the impact. The authors concluded that axial loading and dorsiflexion were not the only factors involved in lateral process talus fractures, and an external moment is necessary to cause this injury from a forward fall.9

Anterior Cruciate Ligament Injuries

Although snowboarders’ lower extremity injuries are primarily ankle injuries, snowboarders are also at risk for serious knee issues when landing from jumps. In skiers, anterior cruciate ligament (ACL) injuries have 5 well-established mechanisms, all involving separation of the feet and a twisting force in the knee (Figures 4, 5): boot-induced anterior drawer mechanism, phantom-foot mechanism, valgus-external rotation, forceful quadriceps muscle contraction, and a combination of internal rotation and extension.8,12 A valgus–external rotation mechanism of knee injury occurs when external rotation of the tibia results from the skier catching the inside edge of the front of the ski. A valgus force acts on the knee as the lower leg is abducted during forward momentum. The torque created on the knee joint is amplified by the length of the ski, which acts as a lever arm, and commonly results in an ACL injury or medial collateral ligament injury.6 Reports indicate that the phantom-foot mechanism is the most common mechanism of ACL injury among skiers.6,13,14 In this situation, internal rotation of the knee results when an off-balance skier falls backward, which causes the knee to hyperflex. The skier catches an inside edge on the snow, which creates a torque that rotates the tibia relative to the femur and results in injury to the ACL.6,14 A boot-induced anterior drawer mechanism occurs during a landing, when the tail of the ski lands first and in an off-balance position, resulting in a load transmitted through the skis to the skier; this load causes an anterior drawer of the ski boot and tibia relative to the femur, straining the ACL and causing ACL rupture.6,13,14 In the forceful quadriceps muscle contraction mechanism of ACL injury, a forceful quadriceps contraction occurs after a jump to prevent a backward fall. With the knee in flexion, this quadriceps contraction causes an anterior translation of the tibia, resulting in ACL rupture.13,14

 

 

The mechanism of injury differs in snowboarding, in which both feet remain attached to the board. Davies and colleagues15 examined 35 snowboarders who sustained ACL injuries after a flat landing from a jump and concluded that snowboarders preparing for a landing exhibit more quadriceps contraction, which increases the loading force on the ACL during landing. Furthermore, the snowboarder’s stance on the board, with the front foot slightly rotated relative to the board, results in a slight internal tibial rotation of the knee and establishes a posture that makes the snowboarder susceptible to injury. However, the lower incidence of knee injuries among snowboarders compared with skiers may be attributable to the fact that there is a limited amount of torque that can be generated on either knee as both feet are fixed to the board.16

The increased quadriceps force in anticipation of a landing, combined with the internal tibial rotation of the knee caused by the snowboarder’s stance, may be the primary mechanism of ACL rupture in snowboarders.15

Injury Prevention Strategies

Prevention strategies require an identification of injury risk factors for snowboarders. Hasler and colleagues7 conducted a study with 306 patients to identify variables that presented a risk for snowboarders. Low readiness for speed, bad weather, and bad visibility, as well as snow conditions, were found to be significant risk factors.

Skiers’ overall injury rate has decreased over the past 60 years, and this decrease has been attributed in part to improved ski technique and instruction.17,18 Improperly adjusted ski bindings are the culprit in many equipment-related lower extremity injuries, and beginners are at much higher risk for such injuries. Lessons and comprehensive safety training could reduce this injury rate.17,19 In 1 study, awareness videos and training programs focusing on injury prevention reduced knee sprains in ski patrollers by 62% compared with controls; a similar program reduced injuries by 30% in nonprofessional skiers.17 A study of injured snowboarders during a winter in Scotland found that 37% of the patients had no formal instruction or training in correct snowboarding and falling technique.20 Training programs for snowboarders could therefore yield meaningful results in injury prevention and in reducing risk-taking behavior.

Advances in equipment have also had an impact on the incidence of skiing injuries. Ski bindings protect skiers in 2 ways. First, the binding keeps the boot attached to the ski and prevents unintended release on difficult terrain. Second, the binding releases the boot from the ski during extreme conditions to prevent the skier from experiencing extreme forces or moments that could result in injury. Functional failure in ski bindings has been implicated in increased incidence of knee injuries and ligament rupture. In a study of injuries sustained by recreational alpine skiers in Japan, Urabe and colleagues21 found that 96% of those injured stated that the ski bindings had not released at time of incident. The effects of binding adjustment and maintenance among snowboarders have not been fully investigated, and there are no set guidelines for individual snowboarders on appropriate binding level. However, as there is a range of binding adjustment options available, snowboarders may have an optimum level that maximizes both mobility and protection from injury.22

Soft-shelled boots may also increase injury risk for snowboarders. Such boots allow for a wider range of ankle motion and offer little protection from extreme joint movements. Soft boots are generally preferred among snowboarders because they allow for increased mobility for sharp turns and maneuvers. However, stiffer boot designs that limit extreme ankle and foot joint motion could reduce the incidence of ankle fractures and sprains among snowboarders.22

Summary

Snowboarding has become increasingly popular worldwide. It attracts a loyal group of amateur athletes and has developed into a billion-dollar industry with growing ranks of professionals. Although most snowboarding injuries are upper extremity injuries, the foot, ankle, and knee represent commonly injured areas among recreational and experienced snowboarders. Advances in ski equipment have significantly reduced the incidence of ankle injuries in skiers, but rising knee ligament injuries continue to pose a challenge. Foot and ankle injuries remain an issue in snowboarders despite advances in equipment and safety. New snowboard designs and boot and binding modifications may hold promise in decreasing the risk for injury in these athletes.

References

1. Kim S, Endres NK, Johnson RJ, Ettlinger CF, Shealy JE. Snowboarding injuries: trends over time and comparisons with alpine skiing injuries. Am J Sports Med. 2012;40(4):770-776.

2. Wasden CC, McIntosh SE, Keith DS, McCowan C. An analysis of skiing and snowboarding injuries on Utah slopes. J Trauma. 2009;67(5):1022-1026.

3. Rust DA, Gilmore CJ, Treme G. Injury patterns at a large western United States ski resort with and without snowboarders: the Taos experience. Am J Sports Med. 2013;41(3):652-656.

4. Ishimaru D, Ogawa H, Sumi H, Sumi Y, Shimizu K. Lower extremity injuries in snowboarding. J Trauma. 2011;70(3):E48-E52.

5. Torjussen J, Bahr R. Injuries among competitive snowboarders at the national elite level. Am J Sports Med. 2005;33(3):370-377.

6. Deady LH, Salonen D. Skiing and snowboarding injuries: a review with a focus on mechanism of injury. Radiol Clin North Am. 2010;48(6):1113-1124.

7. Hasler RM, Berov S, Banneker L, et al. Are there risk factors for snowboard injuries? A case–control multicentre study of 559 snowboarders. Br J Sports Med. 2010;44(11):816-821.

8. St-Onge N, Chevalier Y, Hagemeister N, Van De Putte M, De Guise J. Effect of ski binding parameters on knee biomechanics: a three-dimensional computational study. Med Sci Sports Exerc. 2004;36(7):1218-1225.

9. Valderrabano V, Perren T, Ryf C, Rillmann P, Hintermann B. Snowboarder’s talus fracture: treatment outcome of 20 cases after 3.5 years. Am J Sports Med. 2005;33(6):871-880.

10. von Knoch F, Reckord U, von Knoch M, Sommer C. Fracture of the lateral process of the talus in snowboarders. J Bone Joint Surg Br. 2007;89(6):772-777.

11. Funk JR, Srinivasan SC, Crandall JR. Snowboarder’s talus fractures experimentally produced by eversion and dorsiflexion. Am J Sports Med. 2003;31(6):921-928.

12. Pujol N, Blanchi MP, Chambat P. The incidence of anterior cruciate ligament injuries among competitive alpine skiers: a 25-year investigation. Am J Sports Med. 2007;35(7):1070-1074.

13. Hame SL, Oakes DA, Markolf KL. Injury to the anterior cruciate ligament during alpine skiing: a biomechanical analysis of tibial torque and knee flexion angle. Am J Sports Med. 2002;30(4):537-540.

14. Bere T, Flørenes TW, Krosshaug T, Nordsletten L, Bahr R. Events leading to anterior cruciate ligament injury in World Cup alpine skiing: a systematic video analysis of 20 cases. Br J Sports Med. 2011;45(16):1294-1302.

15. Davies H, Tietjens B, Van Sterkenburg M, Mehgan A. Anterior cruciate ligament injuries in snowboarders: a quadriceps-induced injury. Knee Surg Sports Traumatol Arthrosc. 2009;17(9):1048-1051.

16. Bladin C, McCrory P, Pogorzelski A. Snowboarding injuries: current trends and future directions. Sports Med. 2004;34(2):133-139.

17. Rossi MJ, Lubowitz JH, Guttmann D. The skier’s knee. Arthroscopy. 2003;19(1):75-84.

18. Pressman A, Johnson DH. A review of ski injuries resulting in combined injury to the anterior cruciate ligament and medial collateral ligaments. Arthroscopy. 2003;19(2):194-202.

19. Hildebrandt C, Mildner E, Hotter B, Kirschner W, Höbenreich C, Raschner C. Accident prevention on ski slopes—perceptions of safety and knowledge of existing rules. Accid Anal Prev. 2011;43(4):1421-1426.

20. Langran M, Selvaraj S. Increased injury risk among first-day skiers, snowboarders, and skiboarders. Am J Sports Med. 2004;32(1):96-103.

21. Urabe Y, Ochi M, Onari K, Ikuta Y. Anterior cruciate ligament injury in recreational alpine skiers: analysis of mechanisms and strategy for prevention. J Orthop Sci. 2002;7(1):1-5.

22. McAlpine PR. Biomechanical Analysis of Snowboard Jump Landings: A Focus on the Ankle Joint Complex [doctoral thesis]. Auckland, New Zealand: University of Auckland; 2010.

Author and Disclosure Information

Bilal Mahmood, BA, and Naven Duggal, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

Issue
The American Journal of Orthopedics - 43(11)
Page Number
502-505
Sections
Author and Disclosure Information

Bilal Mahmood, BA, and Naven Duggal, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

Author and Disclosure Information

Bilal Mahmood, BA, and Naven Duggal, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

Article PDF
Article PDF

Epidemiology

The several studies of lower extremity injuries sustained while skiing and snowboarding have differed markedly with respect to patient demographics. Kim and colleagues1 compared snowboarding and skiing injuries over 18 seasons at a Vermont ski resort and found that the injury rate, assessed as mean number of days between injuries, was 400 for snowboarders and 345 for skiers. However, most snowboarding injuries were wrist injuries and generally of the upper extremity, whereas skiing injuries were mainly lower extremity injuries. Overall, young and inexperienced snowboarders had the highest injury rate. In a study on skiing and snowboarding injuries through 4 Utah seasons, Wasden and colleagues2 found that mean age at injury was 41 years for skiers and 23 years for snowboarders. This corroborates the finding from several studies1-3 that snowboarders tend to be younger. Snowboarding is a newer sport with many beginners. However, Ishimaru and colleagues4 found that lower extremity injuries may be associated with experienced snowboarders, who may be prone to take more risks and tackle more challenging slopes. Experienced snowboarders are also likely to sustain lower extremity injuries from falling, because of their risk-taking behavior.5

Although upper extremity injuries account for most snowboarding injuries, lower extremity injuries are a significant issue.6 Modern equipment and more challenging slopes have allowed snowboarders to attain great speeds going down slopes—leading to a surge in lower extremity injuries.7 Lower extremity injuries sustained during snowboarding are more likely to be on the leading side4; the ankle is the most frequent fracture site. Unlike snowboard equipment, modern ski equipment, including new boots and binding systems, is designed to reduce ankle injuries and lower leg fractures.6 The decline in foot, ankle, and tibia fractures can be attributed to taller and stiffer boots, which offer the lower extremities more protection.8

Mechanism of Injury

Talus Fractures

An increasingly common injury among snowboarders is a fracture of the lateral process of the talus; this injury accounts for 32% of snowboarders’ ankle fractures.6 The lateral process of the talus—wedge-shaped and covered in articular cartilage—is involved in the subtalar and ankle joints.9 A fracture here is often misdiagnosed as an ankle sprain (Figures 1–3).6,9,10 The exact mechanism of injury remains controversial, and several biomechanical factors seem to be involved. Funk and colleagues11 conducted a cadaveric study and concluded that eversion of an axially loaded, dorsiflexed ankle may be the primary injury mechanism for fracture. Furthermore, snowboarders have their feet in a position perpendicular to the board, and a fall parallel to the board could increase the eversion force on the ankle of the leading leg. Valderrabano and colleagues9 conducted a clinical study of 26 patients who sustained this injury from snowboarding. All the patients reported they had felt an axial impact from falling, jumping, or unexpectedly hitting a ground object, and 80% reported a rotational movement in the lower leg during the impact. The authors concluded that axial loading and dorsiflexion were not the only factors involved in lateral process talus fractures, and an external moment is necessary to cause this injury from a forward fall.9

Anterior Cruciate Ligament Injuries


Epidemiology

Studies of lower extremity injuries sustained while skiing and snowboarding have differed markedly with respect to patient demographics. Kim and colleagues1 compared snowboarding and skiing injuries over 18 seasons at a Vermont ski resort and found that the injury rate, assessed as mean number of days between injuries, was 400 for snowboarders and 345 for skiers. However, most snowboarding injuries were wrist and other upper extremity injuries, whereas skiing injuries were mainly lower extremity injuries. Overall, young and inexperienced snowboarders had the highest injury rate. In a study of skiing and snowboarding injuries over 4 Utah seasons, Wasden and colleagues2 found that mean age at injury was 41 years for skiers and 23 years for snowboarders, corroborating the finding from several studies1-3 that snowboarders tend to be younger; snowboarding is a newer sport with many beginners. However, Ishimaru and colleagues4 found that lower extremity injuries may also be associated with experienced snowboarders, who tend to take more risks and tackle more challenging slopes, and who are therefore more likely to sustain lower extremity injuries from falls.5
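
The injury rates reported by Kim and colleagues1 are expressed as mean days between injuries, which can be hard to compare at a glance. A minimal sketch, assuming the reported figure represents participant-days per injury (an assumption, not stated explicitly here), converts it into a more familiar rate:

def injuries_per_1000_days(mean_days_between_injuries):
    # Convert "mean days between injuries" into injuries per 1,000 participant-days.
    return 1000.0 / mean_days_between_injuries

for group, mdbi in [("snowboarders", 400), ("skiers", 345)]:
    print(f"{group}: {injuries_per_1000_days(mdbi):.2f} injuries per 1,000 participant-days")

On this reading, the snowboarders' figure of 400 corresponds to roughly 2.5 injuries per 1,000 participant-days and the skiers' figure of 345 to roughly 2.9.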

Although upper extremity injuries account for most snowboarding injuries, lower extremity injuries are a significant issue.6 Modern equipment and more challenging slopes allow snowboarders to attain high speeds, which has contributed to a surge in lower extremity injuries.7 Lower extremity injuries sustained during snowboarding are more likely to occur on the leading side4; the ankle is the most frequent fracture site. Unlike snowboard equipment, modern ski equipment, including new boots and binding systems, is designed to reduce ankle injuries and lower leg fractures.6 The decline in skiers' foot, ankle, and tibia fractures can be attributed to taller, stiffer ski boots, which offer the lower extremities more protection.8

Mechanism of Injury

Talus Fractures

An increasingly common injury among snowboarders is a fracture of the lateral process of the talus, which accounts for 32% of snowboarders' ankle fractures.6 The lateral process of the talus, a wedge-shaped structure covered in articular cartilage, participates in both the subtalar and ankle joints.9 A fracture here is often misdiagnosed as an ankle sprain (Figures 1–3).6,9,10 The exact mechanism of injury remains controversial, and several biomechanical factors seem to be involved. Funk and colleagues11 conducted a cadaveric study and concluded that eversion of an axially loaded, dorsiflexed ankle may be the primary mechanism of this fracture. Furthermore, snowboarders ride with their feet positioned perpendicular to the board, and a fall parallel to the board could increase the eversion force on the ankle of the leading leg. Valderrabano and colleagues9 conducted a clinical study of 26 patients who sustained this injury while snowboarding. All the patients reported that they had felt an axial impact from falling, jumping, or unexpectedly hitting a ground object, and 80% reported a rotational movement of the lower leg during the impact. The authors concluded that axial loading and dorsiflexion are not the only factors involved in lateral process talus fractures and that an external moment is also necessary to produce this injury in a forward fall.9

Anterior Cruciate Ligament Injuries

Although snowboarders' lower extremity injuries are primarily ankle injuries, snowboarders are also at risk for serious knee injuries when landing from jumps. In skiers, anterior cruciate ligament (ACL) injuries have 5 well-established mechanisms, all involving separation of the feet and a twisting force at the knee (Figures 4, 5): the boot-induced anterior drawer mechanism, the phantom-foot mechanism, valgus-external rotation, forceful quadriceps muscle contraction, and a combination of internal rotation and extension.8,12 A valgus-external rotation injury occurs when the skier catches the inside edge of the front of the ski, externally rotating the tibia; a valgus force then acts on the knee as the lower leg is abducted during forward momentum. The ski acts as a long lever arm that amplifies the torque on the knee joint, commonly resulting in an ACL or medial collateral ligament injury.6 Reports indicate that the phantom-foot mechanism is the most common mechanism of ACL injury among skiers.6,13,14 In this mechanism, an off-balance skier falls backward, hyperflexing the knee; the skier then catches an inside edge on the snow, creating a torque that internally rotates the tibia relative to the femur and injures the ACL.6,14 A boot-induced anterior drawer injury occurs during a landing in which the tail of the ski lands first with the skier off balance; the load transmitted through the ski drives the boot and tibia anteriorly relative to the femur, straining the ACL to the point of rupture.6,13,14 In the forceful quadriceps muscle contraction mechanism, the skier contracts the quadriceps forcefully after a jump to prevent a backward fall; with the knee in flexion, this contraction translates the tibia anteriorly and ruptures the ACL.13,14
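
The lever-arm effect described above can be made concrete with a simple moment calculation. This is an illustrative sketch only; the load and lever-arm values are hypothetical and are not drawn from any of the cited studies.

def knee_torque(force_newtons, lever_arm_meters):
    # Torque (moment) at the knee = applied force x lever arm.
    return force_newtons * lever_arm_meters

load = 300.0  # hypothetical lateral load (N) from catching an edge
for site, arm_m in [("near the boot", 0.15), ("at the ski tip", 0.90)]:
    print(f"Load applied {site}: {knee_torque(load, arm_m):.0f} N*m at the knee")

With these hypothetical numbers, the same load produces six times the torque when applied at the ski tip, illustrating why catching an edge far from the boot can generate injurious torque at the knee.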

The mechanism of injury differs in snowboarding, in which both feet remain attached to the board. Davies and colleagues15 examined 35 snowboarders who sustained ACL injuries after a flat landing from a jump and concluded that snowboarders preparing for a landing exhibit greater quadriceps contraction, which increases the load on the ACL at impact. Furthermore, the snowboarder's stance, with the front foot slightly rotated relative to the board, produces a slight internal tibial rotation at the knee and establishes a posture that makes the snowboarder susceptible to injury. However, the lower incidence of knee injuries among snowboarders compared with skiers may be attributable to the limited torque that can be generated at either knee when both feet are fixed to the board.16

The increased quadriceps force in anticipation of a landing, combined with the internal tibial rotation of the knee caused by the snowboarder’s stance, may be the primary mechanism of ACL rupture in snowboarders.15

Injury Prevention Strategies

Prevention strategies require identification of injury risk factors specific to snowboarders. Hasler and colleagues7 studied 306 patients to identify variables that put snowboarders at risk; low readiness for speed, bad weather, poor visibility, and snow conditions were significant risk factors.

Skiers' overall injury rate has decreased over the past 60 years, a decrease attributed in part to improved ski technique and instruction.17,18 Improperly adjusted ski bindings are the culprit in many equipment-related lower extremity injuries, and beginners are at much higher risk for such injuries; lessons and comprehensive safety training could reduce this injury rate.17,19 In 1 study, an awareness video and training program focused on injury prevention reduced knee sprains among ski patrollers by 62% compared with controls; a similar program reduced injuries by 30% in nonprofessional skiers.17 A study of snowboarders injured during a winter in Scotland found that 37% of the patients had no formal instruction or training in correct snowboarding and falling technique.20 Training programs for snowboarders could therefore yield meaningful gains in injury prevention and in reducing risk-taking behavior.
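
Relative reductions such as the 62% and 30% figures cited above translate into absolute numbers only once a baseline rate is assumed. The sketch below uses an invented baseline purely for illustration; only the relative reductions come from the cited study.17

baseline_per_100 = 10.0  # hypothetical knee sprains per 100 participants per season

for group, relative_reduction in [("ski patrollers", 0.62), ("nonprofessional skiers", 0.30)]:
    # Absolute burden after applying the reported relative reduction.
    expected = baseline_per_100 * (1 - relative_reduction)
    print(f"{group}: {expected:.1f} expected sprains per 100 per season")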

Advances in equipment have also had an impact on the incidence of skiing injuries. Ski bindings protect skiers in 2 ways. First, the binding keeps the boot attached to the ski and prevents unintended release on difficult terrain. Second, the binding releases the boot from the ski under extreme conditions to prevent the skier from experiencing forces or moments that could result in injury. Functional failure of ski bindings has been implicated in an increased incidence of knee injuries and ligament rupture: in a study of injuries sustained by recreational alpine skiers in Japan, Urabe and colleagues21 found that 96% of those injured stated that the ski bindings had not released at the time of the incident. The effects of binding adjustment and maintenance among snowboarders have not been fully investigated, and there are no established guidelines on appropriate binding settings for individual snowboarders. However, because a range of binding adjustments is available, snowboarders may have an optimum setting that maximizes both mobility and protection from injury.22

Soft-shelled boots may also increase injury risk for snowboarders. Such boots allow a wide range of ankle motion and offer little protection from extreme joint movements. Soft boots are generally preferred by snowboarders because they allow greater mobility for sharp turns and maneuvers. However, stiffer boots that limit ankle and foot joint mobility could reduce the incidence of ankle fractures and sprains among snowboarders.22

Summary

Snowboarding has become increasingly popular worldwide. It attracts a loyal group of amateur athletes and has developed into a billion-dollar industry with growing ranks of professionals. Although most snowboarding injuries are upper extremity injuries, the foot, ankle, and knee are commonly injured areas among both recreational and experienced snowboarders. Advances in ski equipment have significantly reduced the incidence of ankle injuries in skiers, but rising knee ligament injuries continue to pose a challenge. Foot and ankle injuries remain an issue in snowboarders despite advances in equipment and safety. New snowboard designs and boot and binding modifications may hold promise for decreasing the risk for injury in these athletes.

References

1. Kim S, Endres NK, Johnson RJ, Ettlinger CF, Shealy JE. Snowboarding injuries: trends over time and comparisons with alpine skiing injuries. Am J Sports Med. 2012;40(4):770-776.

2. Wasden CC, McIntosh SE, Keith DS, McCowan C. An analysis of skiing and snowboarding injuries on Utah slopes. J Trauma. 2009;67(5):1022-1026.

3. Rust DA, Gilmore CJ, Treme G. Injury patterns at a large western United States ski resort with and without snowboarders: the Taos experience. Am J Sports Med. 2013;41(3):652-656.

4. Ishimaru D, Ogawa H, Sumi H, Sumi Y, Shimizu K. Lower extremity injuries in snowboarding. J Trauma. 2011;70(3):E48-E52.

5. Torjussen J, Bahr R. Injuries among competitive snowboarders at the national elite level. Am J Sports Med. 2005;33(3):370-377.

6. Deady LH, Salonen D. Skiing and snowboarding injuries: a review with a focus on mechanism of injury. Radiol Clin North Am. 2010;48(6):1113-1124.

7. Hasler RM, Berov S, Banneker L, et al. Are there risk factors for snowboard injuries? A case–control multicentre study of 559 snowboarders. Br J Sports Med. 2010;44(11):816-821.

8. St-Onge N, Chevalier Y, Hagemeister N, Van De Putte M, De Guise J. Effect of ski binding parameters on knee biomechanics: a three-dimensional computational study. Med Sci Sports Exerc. 2004;36(7):1218-1225.

9. Valderrabano V, Perren T, Ryf C, Rillmann P, Hintermann B. Snowboarder’s talus fracture: treatment outcome of 20 cases after 3.5 years. Am J Sports Med. 2005;33(6):871-880.

10. von Knoch F, Reckord U, von Knoch M, Sommer C. Fracture of the lateral process of the talus in snowboarders. J Bone Joint Surg Br. 2007;89(6):772-777.

11. Funk JR, Srinivasan SC, Crandall JR. Snowboarder’s talus fractures experimentally produced by eversion and dorsiflexion. Am J Sports Med. 2003;31(6):921-928.

12. Pujol N, Blanchi MP, Chambat P. The incidence of anterior cruciate ligament injuries among competitive alpine skiers: a 25-year investigation. Am J Sports Med. 2007;35(7):1070-1074.

13. Hame SL, Oakes DA, Markolf KL. Injury to the anterior cruciate ligament during alpine skiing: a biomechanical analysis of tibial torque and knee flexion angle. Am J Sports Med. 2002;30(4):537-540.

14. Bere T, Flørenes TW, Krosshaug T, Nordsletten L, Bahr R. Events leading to anterior cruciate ligament injury in World Cup alpine skiing: a systematic video analysis of 20 cases. Br J Sports Med. 2011;45(16):1294-1302.

15. Davies H, Tietjens B, Van Sterkenburg M, Mehgan A. Anterior cruciate ligament injuries in snowboarders: a quadriceps-induced injury. Knee Surg Sports Traumatol Arthrosc. 2009;17(9):1048-1051.

16. Bladin C, McCrory P, Pogorzelski A. Snowboarding injuries: current trends and future directions. Sports Med. 2004;34(2):133-139.

17. Rossi MJ, Lubowitz JH, Guttmann D. The skier’s knee. Arthroscopy. 2003;19(1):75-84.

18. Pressman A, Johnson DH. A review of ski injuries resulting in combined injury to the anterior cruciate ligament and medial collateral ligaments. Arthroscopy. 2003;19(2):194-202.

19. Hildebrandt C, Mildner E, Hotter B, Kirschner W, Höbenreich C, Raschner C. Accident prevention on ski slopes—perceptions of safety and knowledge of existing rules. Accid Anal Prev. 2011;43(4):1421-1426.

20. Langran M, Selvaraj S. Increased injury risk among first-day skiers, snowboarders, and skiboarders. Am J Sports Med. 2004;32(1):96-103.

21. Urabe Y, Ochi M, Onari K, Ikuta Y. Anterior cruciate ligament injury in recreational alpine skiers: analysis of mechanisms and strategy for prevention. J Orthop Sci. 2002;7(1):1-5.

22. McAlpine PR. Biomechanical Analysis of Snowboard Jump Landings: A Focus on the Ankle Joint Complex [doctoral thesis]. Auckland, New Zealand: University of Auckland; 2010.



Visualization and Reduction of a Meniscal Capsular Junction Tear in the Knee: An Arthroscopic Surgical Technique


The annual incidence of anterior cruciate ligament (ACL) injury in the general US population is estimated at 1 in 3000, or approximately 100,000 ACL injuries per year.1 The incidence of meniscal injuries after ACL tears ranges from 34% to 92%,2 with peripheral posterior horn tears of the medial meniscus accounting for 40% of the meniscal pathology.3

Although several meniscal tear patterns and their treatments have been described in the literature, posterior medial meniscal capsular junction (MCJ) tears have not been adequately addressed. Thijn4 found that the accuracy of routine anterior portal arthroscopy in identifying medial meniscus tears was only 81%. Gillies and Seligson5 found a 25% arthroscopic false-negative rate caused by failure to detect peripheral tears of the posterior horn of the medial meniscus.

We reviewed 781 patients (517 male, 264 female) who underwent ACL reconstruction at our clinic and found a 12.3% incidence of MCJ tear with primary ACL injury and a 23.6% incidence of MCJ tear with revision ACL reconstruction. We believe this is a specific injury pattern that can be missed if it is not deliberately looked for during arthroscopy. Whether this tear pattern behaves differently from a posterior medial meniscus tear is yet to be determined.

To address such tear patterns, with or without ACL reconstruction, we use an arthroscopic repair technique that provides direct visualization of the tear and its reduction.

Materials and Methods

The standard anterior medial and lateral arthroscopic portals are established. A 30° scope is placed in the anterior lateral portal, and an arthroscopic shaver is used to débride the ACL remnants, including the footprint and the femoral insertion site. The camera is then adjusted to look straight down. Next, it is placed between the posterior cruciate ligament (PCL) and the medial femoral condyle and advanced toward the posterior capsule. It is then adjusted to view medially (Figure 1). If there is a tear (Figures 2A, 2B), a posterior medial portal (described by Gillquist and colleagues6) is established using an 18-gauge spinal needle for localization followed by a small stab incision through the skin. The spinal needle is left in position to obtain the correct angle for the suture passer (Figure 3). A 70° Hewson suture passer (Smith & Nephew, Memphis, Tennessee) is passed through the posterior medial portal.

Once inside the joint, the suture passer is passed through the capsule and then through the posterior horn of the meniscus (Figure 4). A loop grasper is used to grab the suture on the end of the passer, which is then brought out through the posterior medial portal and loaded with a No. 2 MaxBraid suture (Biomet, Warsaw, Indiana) (Figure 5). In some cases, the suture passer's wire exits the notch toward the anterior aspect of the knee. If this occurs, the loop grasper can be used to grab the wire from the anterior medial portal and load it with the MaxBraid suture.

Standard arthroscopic knot-tying techniques are used under direct visualization showing the reduction of the capsule to the meniscus (Figure 6). This is done from the posterior medial portal. The excess suture is cut with an arthroscopic suture cutter in the standard fashion. In the rare case of an intact ACL with this same tear pattern, the same technique can be used. If there is difficulty moving past the intact ACL and PCL, a posterior lateral portal can be used as another accessory portal. The arthroscope can then be placed in the posterior lateral portal, while the posterior medial portal can be used as the working portal. Care must be taken in either technique to avoid soft-tissue bridges.

Discussion

Previous biomechanical studies have shown the meniscus to be important to knee stability. In an ACL-deficient knee, the posterior medial meniscus is an important secondary stabilizer, and for that reason it is crucial to identify and repair tears there to avoid placing extra force on the ACL graft.7,8 We think an MCJ tear can potentially compromise knee stability as well, so the posterior aspect of the knee should be examined during every knee arthroscopy. However, biomechanical studies must be performed to validate this theory.

To assess whether orthopedists in general are aware of and concerned about MCJ tears, a survey was e-mailed to members of the Arthroscopy Association of North America (AANA) and the American Sports Medicine Fellowship Society (ASMFS). Sixty-seven orthopedic surgeons who perform ACL reconstruction responded to some or all of the survey questions. Nearly half (48%) said they always assess the posteromedial MCJ by placing the camera between the PCL and the medial femoral condyle. Only 25% said MCJ tears should always be repaired, but another 64% said these tears should sometimes be repaired; thus, 89% responded that at least some MCJ tears should be repaired. Most (88%) said these tears could sometimes or always be a source of chronic pain. Also, 92% said these tears could sometimes or always change the contact pressures in the knee, and 66% said they could sometimes or always change the rotational stability of the knee. Finally, 60% said MCJ tears could sometimes or always affect the risk of ACL graft failure. These data underscore the need for an appropriate surgical technique to treat MCJ tears.

There is a vast amount of literature about the meniscus, but there are few current studies on the specific entity of MCJ tears. We think these tears act similarly to posterior meniscus tears and should be addressed similarly. MCJ tears are easily missed on anterior arthroscopy. In every knee arthroscopy, the posterior aspect of the knee should be checked for these injuries, particularly in ACL-deficient knees. A lesion found within the capsule can be repaired with the technique we have described.

References

1. Fu FH, Cohen SB. Current Concepts in ACL Reconstruction. Thorofare, NJ: Slack; 2008.

2. Simonian PT, Cole BJ, Bach BR. Sports Injuries of the Knee: Surgical Approaches. New York, NY: Thieme; 2006.

3. Smith JP 3rd, Barrett GR. Medial and lateral meniscal tear patterns in anterior cruciate ligament-deficient knees. A prospective analysis of 575 tears. Am J Sports Med. 2001;29(4):415-419.

4. Thijn CJ. Accuracy of double-contrast arthrography and arthroscopy of the knee joint. Skeletal Radiol. 1982;8(3):187-192.

5. Gillies H, Seligson D. Precision in the diagnosis of meniscal lesions: a comparison of clinical evaluation, arthrography, and arthroscopy. J Bone Joint Surg Am. 1979;61(3):343-346.

6. Gillquist J, Hagberg G, Oretorp N. Arthroscopic examination of the posteromedial compartment of the knee joint. Int Orthop. 1979;3(1):13-18.

7. Levy IM, Torzilli PA, Warren RF. The effect of medial meniscectomy on anterior-posterior motion of the knee. J Bone Joint Surg Am. 1982;64(6):883-888.

8. Allen CR, Wong EK, Livesay GA, Sakane M, Fu FH, Woo SL. Importance of the medial meniscus in the anterior cruciate ligament–deficient knee. J Orthop Res. 2000;18(1):109-115.

Author and Disclosure Information

Mickey Plymale, MD, Glenn S. Fleisig, PhD, Stephen M. Kocaj, MD, William P. Cooney, MD, Timothy J. Evans, MS, E. Lyle Cain, MD, and Jeffrey R. Dugas, MD

Authors’ Disclosure Statement: The authors wish to report that the American Sports Medicine Institute receives a research grant and educational support from Smith & Nephew (maker of the suture passer described in this article) and a sports medicine grant and educational support from Biomet (maker of the MaxBraid suture described here).


Timing of lifestyle interventions for obesity


Obesity has become so pervasive that it is now considered a major health concern during pregnancy. Almost 56% of women aged 20-39 years in the United States are overweight or obese, based on the World Health Organization’s criteria for body mass index (BMI) and data from the 2009-2010 National Health and Nutrition Examination Survey (NHANES). Moreover, 7.5% of women in this age group are morbidly obese, with a BMI greater than 40 kg/m2 (JAMA 2012;307:491-7).
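
As a reminder of how the BMI categories cited throughout this discussion are applied, here is a minimal sketch using the standard WHO cutoffs; the patient values are hypothetical.

def bmi(weight_kg, height_m):
    # Body mass index = weight (kg) divided by height (m) squared.
    return weight_kg / height_m ** 2

def who_category(value):
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal weight"
    if value < 30:
        return "overweight"
    if value < 40:
        return "obese"
    return "morbidly obese (BMI > 40 kg/m2)"

example = bmi(110, 1.65)  # hypothetical patient: 110 kg, 1.65 m
print(f"BMI = {example:.1f} kg/m2 -> {who_category(example)}")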

Obesity in pregnancy not only increases the risk of spontaneous abortions and congenital anomalies, it also increases the risk of gestational diabetes (GDM), hypertensive disorders, and other metabolic complications that affect both the mother and fetus.

Dr. Patrick Catalano

Of much concern is the increased risk of fetal overgrowth and long-term health consequences for children of obese mothers. Obesity in early pregnancy has been shown to more than double the risk of obesity in the offspring, which in turn puts these children at risk for developing the metabolic syndrome and, as Dr. Thomas Moore pointed out in September’s Master Class, appears to program these offspring for downstream cardiovascular risk in adulthood.

Mean term birth weights have risen in the United States during the past several decades. In Cleveland, we have seen a significant 116 g increase in mean term birth weight since 1975; this increase encompasses weights from the 5th to the 95th percentiles. Even more concerning is our finding that the ponderal index in our neonatal population has increased because of decreased fetal length over the last decade.
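
The neonatal ponderal index mentioned here relates birth weight to length; by convention it is calculated as 100 times the weight in grams divided by the cube of the length in centimeters. The sketch below uses hypothetical neonates to show how a shorter length at the same birth weight raises the index.

def ponderal_index(weight_g, length_cm):
    # Neonatal (Rohrer) ponderal index: 100 x weight (g) / length (cm)^3.
    return 100.0 * weight_g / length_cm ** 3

# Hypothetical neonates with the same birth weight but different lengths.
for length_cm in (51.0, 49.0):
    print(f"3,400 g at {length_cm} cm: ponderal index = {ponderal_index(3400, length_cm):.2f}")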

Some recent studies have suggested that the increase in birth weight in the United States has reached a plateau, but our analyses of national trends suggest that this apparent plateau is secondary to factors such as earlier gestational age at delivery. Concurrently, an alarming number of children and adolescents – 17% of those aged 2-19 years, according to the 2009-2010 NHANES data – are overweight or obese (JAMA 2012;307:483-90).

How to best treat obesity for improved maternal and fetal health has thus become a focus of research. Studies on lifestyle interventions for obese women during pregnancy have aimed to prevent excessive gestational weight gain and decrease adverse perinatal outcomes – mainly macrosomia, GDM, and hypertensive disorders.

However, the results of this recent body of research have been disappointing. Lifestyle interventions initiated during pregnancy have had only limited success in improving perinatal outcomes. The research tells us that while we may be able to reduce excessive gestational weight gain, it is unlikely that we will be successful in reducing fetal overgrowth, GDM, or preeclampsia in obese women.

Moreover, other studies show that it is a high pregravid BMI – not excessive gestational weight gain or the development of GDM – that plays the biggest role in fetal overgrowth and fetal adiposity.

A paradigm shift is in order. We must think about lifestyle intervention and weight loss before pregnancy, when the woman’s metabolic condition can be improved in time to minimize adverse perinatal outcomes and to maximize benefits relating to fetal body composition and metabolism.

Role of prepregnancy BMI

In 2008, the Institute of Medicine (IOM) and National Research Council reexamined the 1990 guidelines for gestational weight gain. They concluded that excessive weight gain in pregnancy was a primary contributor to the development of obesity in women. In fact, according to the 2009 IOM report, “Weight Gain During Pregnancy: Reexamining the Guidelines” (Washington: National Academy Press, 2009), 38% of normal-weight, 63% of overweight, and 46% of obese women had gained weight in excess of the earlier guidelines.
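
For reference, the 2009 IOM recommendations tie total gestational weight gain to prepregnancy BMI category. The ranges in the sketch below summarize the commonly cited recommendations and should be verified against the report itself; the example gain is hypothetical.

# Recommended total gestational weight gain (lb) by prepregnancy BMI category,
# as commonly summarized from the 2009 IOM report (verify against the report).
IOM_RANGES_LB = {
    "underweight (BMI < 18.5)": (28, 40),
    "normal weight (BMI 18.5-24.9)": (25, 35),
    "overweight (BMI 25-29.9)": (15, 25),
    "obese (BMI >= 30)": (11, 20),
}

def exceeds_guideline(category, total_gain_lb):
    low, high = IOM_RANGES_LB[category]
    return total_gain_lb > high

# Hypothetical example: an overweight woman who gains 32 lb exceeds the upper limit.
print(exceeds_guideline("overweight (BMI 25-29.9)", 32))  # True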

Helping our patients to gain within the guidelines is important. Excessive gestational weight gain is a primary risk factor for maternal postpartum weight retention, which increases the risk for maternal obesity in a subsequent pregnancy. It has also been associated with a modestly increased risk of preterm birth and of development of type 2 diabetes.

Interestingly, however, high gestational weight gain has not been related to an increased risk of fetal overgrowth or macrosomia in many obese women. Increased gestational weight gain is a greater risk for fetal overgrowth in women who are of normal weight prior to pregnancy (J. Clin. Endocrinol. Metab. 2012;97:3648-54).

Our research has found that in overweight and obese women, it is maternal pregravid BMI – and not gestational weight gain – that presents the greatest risk for fetal macrosomia, and more specifically, the greatest risk for fetal obesity. Even when glucose tolerance levels are normal, overweight and obese women have neonates who are heavier and who have significant increases in the percentage of body fat and fat mass (Am. J. Obstet. Gynecol. 2006;195:1100-3).

In an 8-year prospective study of the perinatal risk factors associated with childhood obesity, we similarly found that maternal pregravid BMI – independent of maternal glucose status or gestational weight gain – was the strongest predictor of childhood obesity and metabolic dysfunction (Am. J. Clin. Nutr. 2009;90:1303-13).

Other studies have teased apart the roles of maternal obesity and GDM in long-term health of offspring. This work has found that maternal obesity during pregnancy is associated with metabolic syndrome in the offspring and an increased risk of type 2 diabetes in youth, independent of maternal diabetes during pregnancy. A recent meta-analysis also reported that, although maternal diabetes is associated with an increased BMI z score, this was no longer significant after adjustments were made for prepregnancy BMI (Diabetologia 2011;54:1957-66).

Maternal pregravid obesity, therefore, is not only a risk factor for neonatal adiposity at birth, but also for the longer-term risk of obesity and metabolic dysfunction in the offspring – independent of maternal GDM or excessive gestational weight gain.

Interventions in Pregnancy

Numerous prospective trials have examined lifestyle interventions for obese women during pregnancy. One randomized controlled study of a low glycemic index diet in pregnancy (the ROLO study) involved 800 women in Ireland who had previously delivered an infant weighing more than 4,000 g. Women were randomized at 13 weeks to the low glycemic index diet or to no intervention. Despite a decrease in gestational weight gain in the intervention group, there were no differences in birth weight, birth weight percentile, ponderal index, or macrosomia between the two groups (BMJ 2012;345:e5605).

Another randomized controlled trial reported by a Danish group involved an intervention that consisted of dietary guidance, free membership in a fitness center, and personal coaching initiated between 10 and 14 weeks of gestation. There was a decrease in gestational weight gain in the intervention group, but paradoxically, the infants in the intervention group also had significantly higher birth weight, compared with controls (Diabetes Care 2011;34:2502-7).

Additionally, at least five meta-analyses published in the past 2 years have examined lifestyle interventions during pregnancy. All have concluded that interventions initiated during pregnancy have limited success in reducing excessive gestational weight gain, and not necessarily to within the IOM guidelines. The literature contains scant evidence of further benefits for infant or maternal health, that is, reductions in fetal overgrowth, GDM, or hypertensive disorders.

A recent Cochrane review also concluded that the results of several randomized controlled trials suggest no significant difference in GDM incidence between women receiving exercise intervention versus routine care.

Just this year, three additional randomized controlled trials of lifestyle interventions during pregnancy were published. Only one, the Treatment of Obese Pregnant Women (TOP) study, showed a modest effect in decreasing gestational weight gain. None found a reduction in GDM or fetal overgrowth.

Focus on prepregnancy

Obesity is an inflammatory condition that increases the risk of insulin resistance, impaired beta-cell function, and abnormal adiponectin concentrations. In pregnancy, maternal obesity and hyperinsulinemia can affect placental growth and gene expression.

We have studied lean and obese women recruited prior to a planned pregnancy, as well as lean and obese women scheduled for elective pregnancy termination in the first trimester. Our research, some of which we reported recently in the American Journal of Physiology, has shown increased expression of lipogenic and inflammatory genes in maternal adipose tissue and in the placenta of obese women in the early first trimester, before any phenotypic change becomes apparent (Am. J. Physiol. Endocrinol. Metab. 2012;303:e832-40).

Specifically, hyperinsulinemia and/or defective insulin action in obese women appears to affect the placental programming of genes relating to adipokine expression and lipid metabolism, as well as mitochondrial function. Altered inflammatory and lipid pathways affect the availability of nutrients for the fetus and, consequently, the size and body composition of the fetus. Fetal overgrowth and neonatal adiposity can result.

In addition, our research has shown that obese women have decreased insulin suppression of lipolysis in white adipose tissue, which during pregnancy results in increased lipid availability, promoting fetal fat accretion and lipotoxicity.

When interventions aimed at weight loss and improved insulin sensitivity are undertaken before pregnancy or in the period between pregnancies, we have the opportunity to increase fat oxidation and reduce oxidative stress in early pregnancy. We also may be able to limit placental inflammation and favorably affect placental growth and gene expression. By the second trimester, our research suggests, gene expression in the placenta and early molecular changes in the white adipose tissue have already been programmed and cannot be reversed (Am. J. Physiol. Endocrinol. Metab. 2012;303:e832-40).

In studies by our group and others of interpregnancy weight loss or gain, interpregnancy weight loss has been associated with a lower risk of large-for-gestational-age (LGA) infants, whereas interpregnancy weight gain has been associated with an increased risk of LGA. Preliminary work from our group shows that the decrease in birth weight involves primarily fat and not lean mass.

The 2009 IOM guidelines support weight loss before pregnancy and state that overweight women should receive individualized preconceptional counseling to improve diet quality, increase physical activity, and normalize weight. Multifaceted interventions do work: in obese nonpregnant individuals, lifestyle interventions that include an exercise program, diet, and behavioral modification have been shown to improve insulin sensitivity, inflammation, and overall metabolic function.

According to the IOM report, preconceptional services aimed at achieving a healthy weight before conceiving will represent “a radical change to the care provided to obese women of childbearing age.” With continuing research and accumulating data, however, the concept is gaining traction as a viable paradigm for improving perinatal outcomes, with long-term benefits for both the mother and her baby.

Dr. Catalano reports that he has no disclosures relevant to this Master Class.


Newer blood linked to fewer complications from heart surgery

Article Type
Changed
Tue, 10/28/2014 - 06:00
Display Headline
Newer blood linked to fewer complications from heart surgery

Heart surgery (Credit: University of Ottawa Heart Institute)

VANCOUVER—In a large study, heart surgery patients who received recently donated blood had significantly fewer post-operative complications than those who received blood stored for more than 2 weeks.

Patients who received newer blood had a lower rate of mortality, infection, and renal failure.

They were also less likely to require prolonged ventilation or re-exploration for bleeding.

Ansar Hassan, MD, PhD, of Saint John Regional Hospital in New Brunswick, Canada, and his colleagues presented these results at the Canadian Cardiovascular Congress as abstract 562.

The researchers examined records at the New Brunswick Heart Centre in Saint John for non-emergency heart surgeries performed from January 2005 to September 2013 on patients who received red blood cells during or after surgery and who stayed in the hospital less than 30 days.

Of 2015 patients, slightly more than half (n=1052) received only blood that was donated within 14 days of the transfusion. The rest of the patients received at least some blood that was donated more than 14 days before transfusion. Canadian protocols allow blood to be stored and used up to 6 weeks after donation.

Patients who received newer blood were more likely to be female, to have unstable angina, to have undergone isolated coronary artery bypass graft or valve surgery, to have had shorter bypass and cross-clamp times, and to have left the operating room on inotropes.

After surgery, patients who received newer blood had lower rates of mortality (1.7% vs 3.3%, P=0.02), infection (3.2% vs 5.4%, P=0.02), and renal failure (12.8% vs 17.7%, P=0.0003), as well as a numerically lower rate of atrial fibrillation (43.8% vs 47.3%, P=0.12) that did not reach statistical significance.

In addition, they were less likely to require ventilation for more than 24 hours (3% vs 7.7%, P<0.0001) or re-exploration for bleeding (1.5% vs 3.1%, P=0.02).
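
As a rough illustration of how an unadjusted comparison like the mortality difference above can be checked, the following sketch back-calculates approximate event counts from the reported group sizes and percentages and runs a standard chi-square test. The counts are reconstructed, not taken from the study data, so the result is only an approximation of the published comparison.

# Illustrative sketch only: reconstruct approximate mortality counts from
# the group sizes and percentages quoted above and compare them with a
# chi-square test. The patient-level data are not available here.
from scipy.stats import chi2_contingency

newer_n = 1052               # patients who received only blood donated within 14 days
older_n = 2015 - newer_n     # patients who received at least some older blood

newer_deaths = round(0.017 * newer_n)   # ~1.7% mortality in the newer-blood group
older_deaths = round(0.033 * older_n)   # ~3.3% mortality in the older-blood group

table = [
    [newer_deaths, newer_n - newer_deaths],
    [older_deaths, older_n - older_deaths],
]

# correction=False gives the plain Pearson chi-square statistic.
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")   # p lands near the reported 0.02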

After the researchers adjusted for differences in baseline and intra-operative characteristics, receiving newer blood was associated with a significant reduction in a composite of the aforementioned outcomes (odds ratio=0.79, P=0.01).
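
For readers curious how an adjusted odds ratio of this kind is typically obtained, the sketch below fits a multivariable logistic regression on simulated data, with blood age as the exposure and a few invented covariates standing in for the baseline and intra-operative characteristics mentioned above. The variable names and data are hypothetical; this is not the authors' model or dataset.

# Illustrative sketch (synthetic data, hypothetical variable names): estimating an
# adjusted odds ratio for a composite outcome with multivariable logistic regression.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2015  # same order of magnitude as the cohort described above

df = pd.DataFrame({
    "newer_blood": rng.integers(0, 2, n),      # 1 = all units donated within 14 days
    "female": rng.integers(0, 2, n),
    "unstable_angina": rng.integers(0, 2, n),
    "bypass_time_min": rng.normal(90, 20, n),
})

# Simulate a composite complication that is somewhat less likely with newer blood.
log_odds = -2.0 - 0.24 * df["newer_blood"] + 0.01 * (df["bypass_time_min"] - 90)
df["composite_complication"] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

model = smf.logit(
    "composite_complication ~ newer_blood + female + unstable_angina + bypass_time_min",
    data=df,
).fit(disp=False)

# Exponentiating the coefficient for newer_blood gives its adjusted odds ratio.
print(np.exp(model.params["newer_blood"]))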

“The findings show that we need to pay attention to the age of the blood we give cardiac surgery patients,” Dr Hassan said. “Perhaps more importantly, we need new studies to determine what is driving this relationship between the age of blood and the outcomes we are seeing.”

Dr Hassan noted that previous studies have reached contradictory conclusions on this subject, which was a reason this study was conducted.


Obese ALL patients more likely to have MRD after induction

Article Type
Changed
Tue, 10/28/2014 - 06:00
Display Headline
Obese ALL patients more likely to have MRD after induction

Obese youths with acute lymphoblastic leukemia (ALL) are known to have worse outcomes than their lean counterparts.

To gain more insight into this phenomenon, investigators set out to determine if body mass index (BMI) impacted ALL patients’ responses to initial chemotherapy.

The results showed that, following induction chemotherapy, obese patients were more than twice as likely as non-obese patients to have minimal residual disease (MRD).

“Induction chemotherapy provides a patient’s best chance for remission or a cure,” said principal investigator Steven Mittelman, MD, PhD, of The Saban Research Institute of Children’s Hospital Los Angeles in California.

“Our findings indicate that a patient’s obesity negatively impacts the ability of chemotherapy to kill leukemia cells, reducing the odds of survival.”

The study, which was published in Blood, included 198 patients, aged 1 to 21 years, who were diagnosed with ALL.

Each patient’s BMI was converted to a percentile and classified according to the Centers for Disease Control and Prevention’s thresholds for overweight (85th to 94th percentile) and obese (95th percentile or greater). Patients below the 85th percentile were considered “lean.”
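
To make the classification rule concrete, here is a minimal sketch of how a BMI percentile maps onto these categories. It assumes the age- and sex-specific percentile has already been computed from CDC growth-chart reference data; the function name is an invented stand-in, not code from the study.

# Minimal sketch of the BMI-percentile classification described above.
def classify_bmi_percentile(percentile: float) -> str:
    """Map an age- and sex-specific BMI percentile (0-100) to the study's categories."""
    if not 0 <= percentile <= 100:
        raise ValueError("percentile must be between 0 and 100")
    if percentile >= 95:
        return "obese"       # 95th percentile or greater
    if percentile >= 85:
        return "overweight"  # 85th to 94th percentile
    return "lean"            # below the 85th percentile

# Example: a patient at the 92nd percentile falls into the overweight category.
print(classify_bmi_percentile(92))  # -> overweight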

About one-third of the patients were obese or overweight at the time of diagnosis.

MRD was determined by testing bone marrow specimens at the end of induction therapy, and patients were followed for 2 to 5 years from the time of diagnosis.

The investigators found that lean patients with MRD had similar outcomes to obese patients without evidence of MRD. Obese patients with MRD had the worst outcomes.

Additionally, although nearly a quarter of the patients initially deemed “lean” gained weight and became obese during the first month of treatment, these patients still showed similar outcomes to those who remained lean.

“In addition to increasing a patient’s likelihood of having persistent disease following treatment, obesity appears to add a risk factor that changes the interaction between chemotherapy and residual leukemia cells,” said Hisham Abdel-Azim, MD, also of The Saban Research Institute.

Findings from this study offer new avenues for investigation that include modifying chemotherapy regimens for obese patients and working to change a patient’s weight status beginning at the time of diagnosis.


Armored CAR T cells next on the production line

Article Type
Changed
Tue, 10/28/2014 - 05:00
Display Headline
Armored CAR T cells next on the production line

NEW YORK—Chimeric antigen receptor (CAR) T cells have “remarkable” activity, according to a speaker at the NCCN 9th Annual Congress: Hematologic Malignancies.

“[T]his chimera binds like an antibody, but it acts like a T cell, so it combines the best of both worlds,” said Jae H. Park, MD, of Memorial Sloan Kettering Cancer Center (MSKCC) in New York.

He then traced the evolution of CAR T-cell design, discussed clinical trials using CD19-targeted T cells, and described how investigators are working to build a better T cell.

Researchers found that T-cell activation and proliferation require signaling through a costimulatory receptor, such as CD28, 4-1BB, or OX40. Without costimulation, the T cell becomes unresponsive or undergoes apoptosis.

So based on this observation, Dr Park said, several research groups created second- and third-generation CARs to incorporate the costimulatory signal.

First-generation CARs carried the T-cell activation domain alone, typically fused to the CD8 domain. Second-generation CARs include a costimulatory signaling domain, such as CD28, 4-1BB, or OX40. And the third generation contains signaling domains from 2 costimulatory receptors, CD28 with 4-1BB or CD28 with OX40.

The built-in costimulatory signal proved superior to the first-generation CAR T cells.

In NOD/SCID mice inoculated with NALM-6 lymphoma cells, Dr Park said, about 50% more animals were “cured,” in terms of survival, when CD19-targeted T cells were combined with a CD80 costimulatory ligand than when they were given without the ligand.

Clinical trials

Clinical trials using second-generation CD19-targeted T cells in relapsed B-cell acute lymphoblastic leukemia (ALL) at MSKCC produced an overall complete response (CR) rate of 88%, with a median time to response of 22.1 days. And 72% of the CRs were minimal residual disease (MRD) negative.

So the CAR T cells produce a “very rapid and deep remission,” Dr Park said.

CAR T-cell therapy, however, comes with adverse events, most notably cytokine release syndrome (CRS), which results from T-cell activation. CRS causes fevers, hypotension, and neurologic toxicities, including mental status changes, obtundation, and seizures.

“CRS is not unique to CAR T-cell therapy,” Dr Park said. “Any therapy that activates T cells can have this type of side effect.”

Dr Park noted that CRS is associated with disease burden at the time of treatment. “The larger the disease burden pre T-cell therapy,” he said, “the more likely [patients are] to develop CRS.”

In the MSKCC trial, no patient with very low disease burden (less than 5% blasts in the bone marrow) developed CRS.

However, there is also a correlation between tumor burden and T-cell expansion, he added. T cells expand much better with a larger disease burden, because there is a greater antigen load.

The investigators found that serum C-reactive protein can serve as a surrogate marker for the severity of CRS. Patients with levels above 20 mg/dL are more likely to experience CRS.

And Dr Park pointed out that CRS symptoms respond pretty rapidly to steroids or interleukin-6 receptor blockade.

CAR T-cell therapy has also been used to treat chronic lymphocytic leukemia, but with much more modest response rates than in ALL. Both University of Pennsylvania and MSKCC trials in CLL have produced overall response rates around 40%.

Building a better T cell

Dr Park described efforts underway to develop fourth-generation “armored” CAR T cells to overcome the hostile tumor microenvironment, which contains multiple inhibitory factors that suppress effector T cells.

Armored T cells can actually secrete some of the inflammatory cytokines to change the tumor microenvironment and overcome the inhibitory effect.

Dr Park described a potential scenario: The armored CAR T cells secrete IL-12; enhance the central memory phenotype, cytotoxicity, and persistence; modify the endogenous immune system and T-cell activation; and reactivate tumor-infiltrating lymphocytes.

 

 

He said future studies will focus on translation of these armored CAR T cells to the clinical setting in both hematologic and solid tumor malignancies.


Bacterium could help control malaria, dengue

Article Type
Changed
Tue, 10/28/2014 - 05:00
Display Headline
Bacterium could help control malaria, dengue

Anopheles gambiae mosquito (Credit: CDC)

A bacterium isolated from a mosquito’s gut could aid the fight against malaria and dengue, according to a study published in PLOS Pathogens.

In previous research, scientists isolated Csp_P, a member of the chromobacteria family, from the gut of Aedes aegypti mosquitoes.

Now, the team has found that Csp_P can directly inhibit malaria and dengue pathogens in vitro and shorten the life span of the mosquitoes that transmit both diseases.

George Dimopoulos, PhD, of Johns Hopkins University in Baltimore, Maryland, and his colleagues examined Csp_P’s actions on both mosquitoes and pathogens, and the results suggest that Csp_P might help to fight malaria and dengue at different levels.

The researchers added Csp_P to sugar water fed to mosquitoes and found that the bacterium quickly colonizes the gut of the two most important mosquito disease vectors, Aedes aegypti and Anopheles gambiae.

Moreover, the presence of Csp_P in the gut reduced the susceptibility of the respective mosquitoes to infection with the malaria parasite Plasmodium falciparum or with dengue virus.

Even without gut colonization, exposure to Csp_P through food or breeding water shortened the lifespan of adult mosquitoes and mosquito larvae of both species.

When the researchers tested whether Csp_P could act against the malaria or dengue pathogens directly, they found that the bacterium, likely through the production of toxic metabolites, can inhibit the growth of Plasmodium at various stages during the parasite’s life cycle and also abolish dengue virus infectivity.

The team said these toxic metabolites could potentially be developed into drugs to treat malaria and dengue.

Overall, the researchers concluded that Csp_P’s broad-spectrum antipathogen properties and ability to kill mosquitoes make it a good candidate for the development of novel control strategies for malaria and dengue, so it warrants further study.
