Antibiotic Prophylaxis Might Prevent Recurrent UTIs
Clinical question: Does antibiotic prophylaxis prevent future episodes of urinary tract infections?
Background: Recurrent urinary tract infections (UTIs) in children might be associated with renal scarring and subsequent long-term morbidity. Historically, antibiotic prophylaxis has been recommended for children with risk factors for recurrent infection, most commonly vesicoureteral reflux. However, scarring may be present after a first UTI and in the absence of known risk factors. The efficacy of antibiotic prophylaxis in preventing recurrent UTIs is unclear.
Study design: Randomized, double-blind, placebo-controlled trial.
Setting: Four centers in Australia.
Synopsis: The study enrolled 576 children younger than 18 years with a history of at least one symptomatic UTI. The patients were randomized to receive trimethoprim-sulfamethoxazole (TMP-SMX) or placebo for 12 months. Children with vesicoureteral reflux were included, but those with known neurologic, skeletal, or urologic predispositions were excluded.
Thirteen percent of patients in the antibiotic group developed a UTI compared with 19% of patients in the placebo group (P=0.02). The authors calculate that at 12 months, 14 patients would need to be treated to prevent one UTI.
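For readers who want to see where a number needed to treat (NNT) comes from, the sketch below applies the standard formula NNT = 1/(absolute risk reduction) to the rounded event rates quoted above; because the published figure of 14 reflects the trial's unrounded estimates, the rounded inputs here give a slightly different value.

```python
# Illustrative NNT arithmetic using the rounded event rates quoted above.
# The published NNT of 14 is based on the trial's unrounded estimates, so this
# rounded sketch lands near, but not exactly on, that figure.

placebo_rate = 0.19      # proportion of placebo patients with a UTI by 12 months
antibiotic_rate = 0.13   # proportion of antibiotic patients with a UTI by 12 months

absolute_risk_reduction = placebo_rate - antibiotic_rate   # 0.06
nnt = 1 / absolute_risk_reduction                          # about 17 with rounded inputs

print(f"ARR = {absolute_risk_reduction:.0%}, NNT ≈ {nnt:.0f}")
```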
This study was unable to enroll the planned number of children but remained adequately powered to show a reduction in the primary outcome (rate of symptomatic UTI). However, a substantial proportion of patients (approximately 28%) in each arm stopped taking the medication, the majority for undisclosed reasons. Despite an intention-to-treat analysis, this degree of dropout raises questions about the true effect size. Additionally, this study does not answer the more important clinical question regarding the effect of prophylaxis on potential future renal damage, specifically in children with vesicoureteral reflux.
Bottom line: Antibiotic prophylaxis might be modestly effective in preventing recurrent UTIs.
Citation: Craig JC, Simpson JM, Williams GJ, et al. Antibiotic prophylaxis and recurrent urinary tract infection in children. N Engl J Med. 2009;361(18):1748-1759.
Advanced Dementia Is a Terminal Illness with High Morbidity and Mortality
Clinical question: Does understanding the expected clinical course of advanced dementia influence end-of-life decisions by proxy decision-makers?
Background: Advanced dementia is a leading cause of death in the United States, but the clinical course of advanced dementia has not been described in a rigorous, prospective manner. The lack of information might cause risk to be underestimated, and patients might receive suboptimal palliative care.
Study design: Multicenter prospective cohort study.
Setting: Twenty-two nursing homes in a single U.S. city.
Synopsis: The study followed 323 nursing home residents with advanced dementia. The patients were clinically assessed at baseline and quarterly for 18 months through chart reviews, nursing interviews, and physical examinations. Additionally, their proxies were surveyed regarding their understanding of the subjects’ prognoses.
During the study period, 41.1% of patients developed pneumonia, 52.6% experienced a febrile episode, and 85.8% developed an eating problem; cumulative all-cause mortality was 54.8%. Adjusted for age, sex, and disease duration, the six-month mortality rate was 46.7% for subjects who had pneumonia, 44.5% for those with a febrile episode, and 38.6% for those with an eating problem.
Distressing symptoms, including dyspnea (46.0%) and pain (39.1%), were common. In the last three months of life, 40.7% of subjects underwent at least one burdensome intervention (defined as hospitalization, ED visit, parenteral therapy, or tube feeding).
Subjects whose proxies reported an understanding of the poor prognosis and expected clinical complications of advanced dementia underwent significantly fewer burdensome interventions (adjusted odds ratio 0.12).
Bottom line: Advanced dementia is associated with frequent complications, including infections and eating problems, with high six-month mortality and significant associated morbidity. Patients whose healthcare proxies have a good understanding of the expected clinical course and prognosis receive less-aggressive end-of-life care.
Citation: Mitchell SL, Teno JM, Kiely DK, et al. The clinical course of advanced dementia. N Engl J Med. 2009;361(16):1529-1538.
Adding Basal Insulin to Oral Agents in Type 2 Diabetes Might Offer Best Glycemic Control
Clinical question: When added to oral diabetic agents, which insulin regimen (biphasic, prandial or basal) best achieves glycemic control in patients with Type 2 diabetes?
Background: Most patients with Type 2 diabetes mellitus (DM2) require insulin when oral agents provide suboptimal glycemic control. Little is known about which insulin regimen is most effective.
Study design: Three-year, open-label, multicenter trial.
Setting: Fifty-eight clinical centers in the United Kingdom and Ireland.
Synopsis: The authors randomized 708 insulin-naïve DM2 patients (median age 62 years) with HgbA1c of 7% to 10% on maximum-dose metformin or sulfonylurea to one of three regimens: biphasic insulin twice daily, prandial insulin three times daily, or basal insulin once daily. Outcomes were HgbA1c, hypoglycemia rates, and weight gain. If glycemic control was unacceptable, the sulfonylurea was replaced by a second insulin.
The patients were mostly Caucasian and overweight. At three years of follow-up, median HgbA1c was similar in all groups (7.1% biphasic, 6.8% prandial, 6.9% basal); however, more patients who received prandial or basal insulin achieved an HgbA1c less than 6.5% (45% and 43%, respectively) than in the biphasic group (32%).
Hypoglycemia was significantly less frequent in the basal insulin group (1.7 per patient per year versus 3.0 and 5.5 with biphasic and prandial, respectively). Patients gained weight in all groups; the greatest gain was with prandial insulin. At three years, there were no significant between-group differences in blood pressure, cholesterol, albuminuria, or quality of life.
Bottom line: Adding insulin to oral diabetic regimens improves glycemic control. Basal or prandial insulin regimens achieve glycemic targets more frequently than biphasic dosing.
Citation: Holman RR, Farmer AJ, Davies MJ, et al. Three-year efficacy of complex insulin regimens in type 2 diabetes. N Engl J Med. 2009;361(18):1736-1747.
Initiation of Dialysis Does Not Help Maintain Functional Status in Elderly
Clinical question: Is functional status in the elderly maintained over time after initiating long-term dialysis?
Background: Quality-of-life maintenance often is used as a goal when initiating long-term dialysis in elderly patients with end-stage renal disease. More elderly patients are being offered long-term dialysis treatment. Little is known about the functional status of elderly patients on long-term dialysis.
Study design: Retrospective cohort study.
Setting: U.S. nursing homes.
Synopsis: By cross-linking data from two population-based administrative datasets, this study identified 3,702 nursing home patients (mean age, 73.4 years) who had started long-term dialysis and whose functional status had been assessed. Activities-of-daily-living assessments before and at three-month intervals after dialysis initiation were compared to see whether functional status was maintained.
Within three months of starting dialysis, 61% of patients had a decline in functional status or had died. By one year, only 1 in 8 patients had maintained their pre-dialysis functional status.
Decline in functional status cannot be attributed solely to dialysis because study patients were not compared to patients with chronic kidney disease who were not dialyzed. In addition, these results might not apply to all elderly patients on dialysis, as the functional status of elderly nursing home patients might differ significantly from those living at home.
Bottom line: Functional status is not maintained in most elderly nursing home patients in the first 12 months after long-term dialysis is initiated. Elderly patients considering dialysis treatment should be aware that dialysis might not help maintain functional status and quality of life.
Citation: Kurella Tamura M, Covinsky KE, Chertow GM, Yaffe K, Landefeld CS, McCulloch CE. Functional status of elderly adults before and after initiation of dialysis. N Engl J Med. 2009;361(16):1539-1547.
Inhaled Corticosteroids Decrease Inflammation in Moderate to Severe COPD
Clinical question: Does long-term inhaled corticosteroid therapy, with and without long-acting beta-agonists, decrease airway inflammation and improve lung function in patients with moderate to severe chronic obstructive pulmonary disease (COPD)?
Background: Guideline-recommended treatment of COPD with inhaled corticosteroids and long-acting beta-agonists improves symptoms and exacerbation rates; little is known about the impact of these therapies on inflammation and long-term lung function.
Study design: Randomized, double-blind, placebo-controlled trial.
Setting: Two university medical centers in the Netherlands.
Synopsis: One hundred one steroid-naïve patients, ages 45 to 75, who were current or former smokers with moderate to severe COPD were randomized to one of four regimens: 1) fluticasone for six months, then placebo for 24 months; 2) fluticasone for 30 months; 3) fluticasone and salmeterol for 30 months; or 4) placebo for 30 months. The primary outcome was inflammatory cell counts in bronchial biopsies and induced sputum. Secondary outcomes included postbronchodilator spirometry, methacholine hyperresponsiveness, and self-reported symptoms and health status. Patients with asthma were excluded.
Short-term fluticasone therapy decreased inflammation and improved forced expiratory volume in one second (FEV1). Long-term therapy also decreased the rate of FEV1 decline, reduced dyspnea, and improved health status. Discontinuation of therapy at six months led to inflammation relapse with worsened symptoms and increased rate of FEV1 decline. The addition of long-acting beta-agonists did not provide additional anti-inflammatory benefits, but it did improve FEV1 and dyspnea at six months.
Additional studies are needed to further define clinical outcomes and assess the cost benefit of these therapies.
Bottom line: Inhaled corticosteroids decrease inflammation in steroid-naïve patients with moderate to severe COPD and might decrease the rate of lung function decline. Long-acting beta-agonists do not offer additional anti-inflammatory benefit.
Citation: Lapperre TS, Snoeck-Stroband JB, Gosman MM, et al. Effect of fluticasone with and without salmeterol on pulmonary outcomes in chronic obstructive pulmonary disease: a randomized trial. Ann Intern Med. 2009;151(8):517-527.
Resident Fatigue and Distress Contribute to Perceived Medical Errors
Clinical question: Do resident fatigue and distress contribute to medical errors?
Background: In recent years, such measures as work-hour limitations have been implemented to decrease resident fatigue and, it is presumed, medical errors. However, few studies address the relationship between residents’ well-being and self-reported medical errors.
Study design: Prospective six-year longitudinal cohort study.
Setting: Single academic medical center.
Synopsis: The authors had 380 internal-medicine residents complete quarterly surveys to assess fatigue, quality of life, burnout, symptoms of depression, and frequency of perceived medical errors. In a univariate analysis, fatigue/sleepiness, burnout, depression, and overall quality of life measures correlated significantly with self-reported major medical errors. Fatigue/sleepiness and measures of distress additively increased the risk of self-reported errors. Increases in one or both domains were estimated to increase the risk of self-reported errors by as much as 15% to 28%.
The authors studied only self-reported medical errors, so it is difficult to know whether these errors directly affected patient outcomes. Additionally, the results of this single-center study might not be generalizable.
Bottom line: Fatigue and distress contribute to self-perceived medical errors among residents.
Citation: West CP, Tan AD, Habermann TM, Sloan JA, Shanafelt TD. Association of resident fatigue and distress with perceived medical errors. JAMA. 2009;302(12):1294-1300.
Dabigatran Is Not Inferior to Warfarin in Atrial Fibrillation
Clinical question: Is dabigatran, an oral thrombin inhibitor, an effective and safe alternative to warfarin in patients with atrial fibrillation?
Background: Warfarin reduces the risk of stroke among patients with atrial fibrillation (AF) but requires frequent laboratory monitoring. Dabigatran is an oral direct thrombin inhibitor given in fixed dosages without laboratory monitoring.
Study design: Randomized, multicenter, open-label, noninferiority trial.
Setting: 951 clinical centers in 44 countries.
Synopsis: More than 18,000 patients 65 and older with AF and at least one stroke risk factor were enrolled. The average CHADS2 score was 2.1. Patients were randomized to receive fixed doses of dabigatran (110 mg or 150 mg twice daily) or warfarin adjusted to an INR of 2.0-3.0. The primary outcomes were a) stroke or systemic embolism and b) major hemorrhage. Median follow-up was two years.
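As context for the average CHADS2 score of 2.1, the snippet below is a minimal sketch of the standard CHADS2 scoring scheme (1 point each for congestive heart failure, hypertension, age 75 or older, and diabetes; 2 points for prior stroke or TIA); the example patient values are hypothetical, not taken from the trial.

```python
# Minimal sketch of the standard CHADS2 stroke-risk score for atrial fibrillation.
# 1 point each for CHF, hypertension, age >= 75, and diabetes; 2 points for prior stroke/TIA.

def chads2(chf: bool, hypertension: bool, age_75_or_older: bool,
           diabetes: bool, prior_stroke_or_tia: bool) -> int:
    return (int(chf) + int(hypertension) + int(age_75_or_older)
            + int(diabetes) + 2 * int(prior_stroke_or_tia))

# Hypothetical patient: hypertension and diabetes, no other risk factors -> score of 2,
# close to the trial's average of 2.1.
print(chads2(chf=False, hypertension=True, age_75_or_older=False,
             diabetes=True, prior_stroke_or_tia=False))
```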
The annual rates of stroke or systemic embolism for both doses of dabigatran were noninferior to warfarin (P<0.001); higher-dose dabigatran was statistically superior to warfarin (relative risk (RR)=0.66, P<0.001). The annual rate of major hemorrhage was lowest in the lower-dose dabigatran group (RR=0.80, P=0.003 compared with warfarin); the higher-dose dabigatran and warfarin groups had equivalent rates of major bleeding. No increased risk of liver function abnormalities was noted.
Bottom line: Dabigatran appears to be an effective and safe alternative to warfarin in AF patients. If the drug is approved by the FDA, appropriate patient selection and cost-effectiveness will need to be established.
Citation: Connolly SJ, Ezekowitz MD, Yusuf S, et al. Dabigatran versus warfarin in patients with atrial fibrillation. N Engl J Med. 2009;361(12):1139-1151.
Cardiac Resynchronization Therapy with Implantable Cardioverter Defibrillator Placement Decreases Heart Failure
Clinical question: Does cardiac resynchronization therapy (CRT) with biventricular pacing decrease cardiac events in patients with reduced ejection fraction (EF) and wide QRS complex but only mild cardiac symptoms?
Background: In patients with severely reduced EF, implantable cardioverter defibrillators (ICDs) have been shown to improve survival. Meanwhile, CRT decreases heart-failure-related hospitalizations for patients with advanced heart-failure symptoms, EF less than 35%, and intraventricular conduction delay. It is not as clear whether patients with less-severe symptoms benefit from CRT.
Study design: Randomized, controlled trial.
Setting: 110 medical centers in the U.S., Canada, and Europe.
Synopsis: The Multicenter Automatic Defibrillator Implantation Trial with Cardiac Resynchronization Therapy (MADIT-CRT) randomly assigned 1,820 adults with an EF less than 30%, New York Heart Association Class I or II congestive heart failure, and sinus rhythm with a QRS duration greater than 130 msec to receive an ICD with CRT or an ICD alone. The primary endpoint was all-cause mortality or a nonfatal heart-failure event. Average follow-up was 2.4 years.
A 34% reduction in the primary endpoint was found in the ICD-CRT group when compared with the ICD-only group, primarily due to a 41% reduction in heart-failure events. In a subgroup analysis, women and patients with QRS greater than 150 msec experienced particular benefit. Echocardiography one year after device implantation demonstrated significant reductions in left ventricular end-systolic and end-diastolic volume, and a significant increase in EF with ICD-CRT versus ICD-only (P<0.001).
Bottom line: Compared with ICD alone, CRT in combination with ICD prevented heart-failure events in relatively asymptomatic heart-failure patients with low EF and prolonged QRS.
Citation: Moss AJ, Hall WJ, Cannom DS, et al. Cardiac-resynchronization therapy for the prevention of heart-failure events. N Engl J Med. 2009;361(14):1329-1338.
Fluvastatin Improves Postoperative Cardiac Outcomes in Patients Undergoing Vascular Surgery
Clinical question: Does perioperative fluvastatin decrease adverse cardiac events after vascular surgery?
Background: Patients with atherosclerotic vascular disease who undergo vascular surgery are at high risk for postoperative cardiac events. Studies in nonsurgical populations have shown the beneficial effects of statin therapy on cardiac outcomes. However, no placebo-controlled trials have addressed the effect of statins on postoperative cardiac outcomes.
Study design: Randomized, double-blind, placebo-controlled trial.
Setting: Single large academic medical center in the Netherlands.
Synopsis: The study enrolled 497 statin-naïve patients 40 years or older undergoing noncardiac vascular surgery. The patients were randomized to 80 mg of extended-release fluvastatin daily or placebo; all patients received a beta-blocker. Therapy began preoperatively (a median of 37 days before surgery) and continued for at least 30 days after surgery. Outcomes were assessed at 30 days post-surgery.
Postoperative myocardial infarction (MI) was significantly less common in the fluvastatin group than with placebo (10.8% vs. 19%, hazard ratio (HR) 0.55, P=0.01). In addition, the treatment group had a lower frequency of death from cardiovascular causes (4.8% vs. 10.1%, HR 0.47, P=0.03). Statin therapy was not associated with an increased rate of adverse events.
Notably, all of the patients enrolled in this study were high-risk patients undergoing high-risk (vascular) surgery. Patients already on statins were excluded.
Further studies are needed to determine whether the findings can be extrapolated to other populations, including nonvascular surgery patients.
Bottom line: Perioperative statin therapy resulted in a significant decrease in postoperative MI and death within 30 days of vascular surgery.
Citation: Schouten O, Boersma E, Hoeks SE, et al. Fluvastatin and perioperative events in patients undergoing vascular surgery. N Engl J Med. 2009;361(10):980-989.
New criteria for diagnosing MM could prevent organ damage

The International Myeloma Working Group (IMWG) has published new criteria for diagnosing multiple myeloma (MM) in The Lancet Oncology.
The group has added validated biomarkers to the current clinical symptoms used for MM diagnosis—hypercalcemia, renal failure, anemia, and bone lesions.
This addition will allow physicians to diagnose MM before patients become symptomatic and, therefore, before organ damage occurs, according to the IMWG.
Lead author S. Vincent Rajkumar, MD, of the Mayo Clinic in Rochester, Minnesota, noted that MM is always preceded sequentially by two conditions—monoclonal gammopathy of undetermined significance and smoldering MM. Since both are asymptomatic, most MM patients are not diagnosed until organ damage occurs.
“The new IMWG criteria allow for the diagnosis of myeloma to be made in patients without symptoms and before organ damage occurs, using validated biomarkers that identify patients with [smoldering] MM who have an ‘ultra-high’ risk of progression to multiple myeloma,” Dr Rajkumar said.
“These biomarkers are associated with the near-inevitable development of clinical symptoms and are important for early diagnosis and treatment, which is very important for patients.”
Other updates to the criteria used to diagnose MM include the use of CT and PET-CT scans to identify bone lesions. According to the authors, this will enable more accurate diagnosis and intervention before fractures or other serious problems arise.
“We believe that the new criteria will rectify the situation where we were unable to use the considerable advances in multiple myeloma therapy prior to organ damage,” Dr Rajkumar said. “We can now initiate therapy in some patients early on in the course of their disease.”
The IMWG’s revised diagnostic criteria for MM and smoldering MM are as follows; a simplified illustrative encoding of these rules appears after the definitions.
Definition of MM
Clonal bone marrow plasma cells ≥10% or biopsy-proven bony or extramedullary plasmacytoma* and one or more of the following myeloma defining events:
- Evidence of end organ damage that can be attributed to the underlying plasma cell proliferative disorder, specifically:
- Hypercalcemia: serum calcium >0.25 mmol/L (>1 mg/dL) higher than the upper limit of normal or >2.75 mmol/L (>11 mg/dL).
- Renal insufficiency: creatinine clearance <40 mL per min (measured or estimated by validated equations) or serum creatinine >177 μmol/L (>2 mg/dL).
- Anemia: hemoglobin value of >20 g/L below the lower limit of normal or a hemoglobin value <100 g/L.
- Bone lesions: one or more osteolytic lesions on skeletal radiography, CT, or PET-CT. If the bone marrow has less than 10% clonal plasma cells, more than one bone lesion is required to distinguish from solitary plasmacytoma with minimal marrow involvement.
- One or more of the following biomarkers:
- Clonal bone marrow plasma cell percentage ≥60%.
- Involved:uninvolved serum free light chain ratio ≥100. These values are based on the serum Freelite assay (The Binding Site Group, Birmingham, UK). The involved free light chain must be ≥100 mg/L.
- >1 focal lesion on MRI studies. Each focal lesion must be 5 mm or more in size.
*The IMWG said clonality should be established by showing κ/λ-light-chain restriction on flow cytometry, immunohistochemistry, or immunofluorescence. Bone marrow plasma cell percentage should preferably be estimated from a core biopsy specimen. In case of a disparity between the aspirate and core biopsy, the highest value should be used.
Definition of smoldering MM
Both of the following criteria must be met:
- Serum monoclonal protein (IgG or IgA) ≥30 g/L or urinary monoclonal protein ≥500 mg per 24 hours and/or clonal bone marrow plasma cells 10%–60%.
- Absence of myeloma defining events or amyloidosis.
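For readers who want to operationalize these thresholds, for example in a registry query or chart-review script, a minimal sketch follows. It encodes only the numeric cut-offs listed above; the data structure, field names, and function names are illustrative assumptions, not part of the IMWG publication.

    # Illustrative encoding of the numeric thresholds listed above.
    # Units follow the published criteria (mmol/L, umol/L, g/L, mL/min, mg/L).
    from dataclasses import dataclass

    @dataclass
    class MyelomaWorkup:
        clonal_plasma_cell_pct: float        # % clonal plasma cells in marrow
        calcium_mmol_l: float
        calcium_uln_mmol_l: float            # local upper limit of normal
        creatinine_clearance_ml_min: float
        creatinine_umol_l: float
        hemoglobin_g_l: float
        hemoglobin_lln_g_l: float            # local lower limit of normal
        lytic_lesions: int                   # osteolytic lesions on radiography, CT, or PET-CT
        flc_ratio: float                     # involved:uninvolved serum free light chain ratio
        involved_flc_mg_l: float
        mri_focal_lesions_5mm_or_larger: int

    def myeloma_defining_events(w: MyelomaWorkup) -> list[str]:
        events = []
        # End-organ damage features
        if w.calcium_mmol_l > w.calcium_uln_mmol_l + 0.25 or w.calcium_mmol_l > 2.75:
            events.append("hypercalcemia")
        if w.creatinine_clearance_ml_min < 40 or w.creatinine_umol_l > 177:
            events.append("renal insufficiency")
        if w.hemoglobin_g_l < w.hemoglobin_lln_g_l - 20 or w.hemoglobin_g_l < 100:
            events.append("anemia")
        if w.lytic_lesions >= 1:
            events.append("bone lesions")
        # Biomarkers of malignancy
        if w.clonal_plasma_cell_pct >= 60:
            events.append("clonal plasma cells >=60%")
        if w.flc_ratio >= 100 and w.involved_flc_mg_l >= 100:
            events.append("free light chain ratio >=100")
        if w.mri_focal_lesions_5mm_or_larger > 1:
            events.append(">1 focal MRI lesion")
        return events

    def meets_mm_definition(w: MyelomaWorkup, biopsy_proven_plasmacytoma: bool = False) -> bool:
        entry_criterion = w.clonal_plasma_cell_pct >= 10 or biopsy_proven_plasmacytoma
        return entry_criterion and bool(myeloma_defining_events(w))

The sketch deliberately omits the special case noted above (marrow involvement below 10%, where more than one bone lesion is required) and the smoldering MM definition; it is meant only to make the numeric cut-offs concrete, not to substitute for the full criteria.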

