Team discovers how cerebral malaria kills children


Terrie Taylor examines a child at Queen Elizabeth Hospital. Photo by Jim Peck

After grant money brought magnetic resonance imaging (MRI) to a hospital in Africa, researchers were able to uncover the cause of death in children with cerebral malaria.

MRI scans revealed that, in some children, the brain can become so swollen that it is forced out through the bottom of the skull and compresses the brain stem. This pressure causes the children to stop breathing and die.

The researchers reported these findings in NEJM.

“Because we know now that the brain swelling is what causes death, we can work to find new treatments,” said study author Terrie Taylor, DO, of Michigan State University in East Lansing.

“The next step is to identify what’s causing the swelling and then develop treatments targeting those causes. It’s also possible that using ventilators to keep the children breathing until the swelling subsides might save lives, but ventilators are few and far between in Africa at the moment.”

Scans reveal brain swelling

In 2008, GE Healthcare provided a $1 million MRI to the Queen Elizabeth Hospital in Blantyre, Malawi, where Dr Taylor spends 6 months of every year treating and studying children with malaria.

Dr Taylor and her colleagues used the MRI to view brain images from hundreds of children with cerebral malaria, comparing findings in those who died to those who survived.

The team imaged 168 children with cerebral malaria (as defined by the World Health Organization). Fifteen percent (25/168) of the children died, and 84% of those who died (21/25) had evidence of severe brain swelling at admission.

In contrast, the researchers found evidence of severe brain swelling in 27% (39/143) of children who survived. Serial MRI scans revealed decreasing brain volume in the survivors who initially had brain swelling.

“We found that survivors’ brains were either never swollen or decreased in size after 2 to 3 days,” Dr Taylor said. “This was a triumphant moment. I wanted to say to the parasite, ‘Ha! You never thought we’d get an MRI, did you?’”


A multifaceted hospitalist quality improvement intervention: Decreased frequency of common labs

Waste in US healthcare is a public health threat, with an estimated value of $910 billion per year.[1] It contributes to the relatively high per-discharge healthcare spending seen in the United States compared with other nations.[2] Waste takes many forms, one of which is excessive use of diagnostic laboratory testing.[1] Many hospital providers obtain common labs, such as complete blood counts (CBCs) and basic metabolic panels (BMPs), in an open-ended, daily manner for their hospitalized patients, without regard for the patient's clinical condition or the stability of previous results. Reasons for ordering these tests in a nonpatient-centered manner include provider convenience (such as inclusion in an order set), ease of access, habit, or defensive practice.[3, 4, 5] All of these reasons may represent waste.

Although the potential waste of routine daily labs may seem small, the frequency with which they are ordered results in a substantial real and potential cost, both financially and clinically. Multiple studies have shown a link between excessive diagnostic phlebotomy and hospital‐acquired anemia.[6, 7, 8, 9] Hospital‐acquired anemia itself has been associated with increased mortality.[10] In addition to blood loss and financial cost, patient experience and satisfaction are also detrimentally affected by excessive laboratory testing in the form of pain and inconvenience from the act of phlebotomy.[11]

There are many reports of strategies to decrease excessive diagnostic laboratory testing as a means of addressing this waste in the inpatient setting.[12, 13, 14, 15, 16, 17, 18, 19, 20, 21] All of these studies took place in a traditional academic setting, and many implemented their intervention through a computer-based order entry system. In our literature search on this topic, we found no studies conducted within community-based hospitalist practices. More recently, this issue was highlighted as part of the Choosing Wisely campaign sponsored by the American Board of Internal Medicine Foundation, Consumer Reports, and more than 60 specialty societies. The Society of Hospital Medicine, the professional society for hospitalists, recommended avoiding repetitive common laboratory testing in the face of clinical stability.[22]

Much has been written about quality improvement (QI) by the Institute for Healthcare Improvement, the Society of Hospital Medicine, and others.[23, 24, 25] How best to move from a Choosing Wisely recommendation to highly reliable incorporation into clinical practice in a community setting is not known and likely varies with the care environment. Successful QI interventions are often multifaceted and include academic detailing and provider education, transparent display of data, and regular audit and feedback of performance data.[26, 27, 28, 29] Prior to the publication of the Society of Hospital Medicine's Choosing Wisely recommendations, we chose to implement the recommendation to decrease ordering of daily labs using 3 QI strategies in our community 4-hospital health system.

METHODS

Study Participants

This activity was undertaken as a QI initiative by Swedish Hospital Medicine (SHM), a 53‐provider employed hospitalist group that staffs a total of 1420 beds across 4 inpatient facilities. SHM has a longstanding record of working together as a team on QI projects.

An informal preliminary audit of our common lab ordering by a member of the study team revealed multiple examples of labs ordered every day without medical-record evidence of interventions or management decisions being made based on the results. This preliminary activity suggested to the hospitalist group that the topic was ripe for intervention and improvement. Four common labs (CBC, BMP, nutrition panel [called TPN 2 in our system, consisting of a BMP plus magnesium and phosphorus], and comprehensive metabolic panel [BMP plus liver function tests]) formed the bulk of the repetitively ordered labs and were the focus of our activity. We excluded prothrombin time/International Normalized Ratio, as it was less clear that obtaining these daily represented waste. We then reviewed the medical literature for successful QI strategies and chose academic detailing, transparent display of data, and audit and feedback as our QI tactics.[29]

Using data from our electronic medical record, we chose a convenience preintervention period of 10 months for our baseline data. We allowed for a 2‐month wash‐in period in August 2013, and a convenience period of 7 months was chosen as the intervention period.

Intervention

An introductory email was sent out in mid‐August 2013 to all hospitalist providers describing the waste and potential harm to patients associated with unnecessary common blood tests, in particular those ordered as daily. The email recommended 2 changes: (1) immediate cessation of the practice of ordering common labs as daily, in an open, unending manner and (2) assessing the need for common labs in the next 24 hours, and ordering based on that need, but no further into the future.

Hospitalist providers were additionally informed that the number of common labs ordered daily would be tracked prospectively, with monthly reporting of individual provider ordering. In addition, the 5 members of the hospitalist team who most frequently ordered common labs as daily during January 2013 to March 2013 were sent individual emails informing them of their top‐5 position.

During the 7‐month intervention period, a monthly email was sent to all members of the hospitalist team with 4 basic components: (1) reiteration of the recommendations and reasoning stated in the original email; (2) a list of all members of the hospitalist team and the corresponding frequency of common labs ordered as daily (open ended) per provider for the month; (3) a recommendation to discontinue any common labs ordered as daily; and (4) at least 1 example of a patient cared for during the month by the hospitalist team, who had at least 1 common lab ordered for at least 5 days in a row, with no mention of the results in the progress notes and no apparent contribution to the management of the medical conditions for which the patient was being treated.

The change in number of tests ordered during the intervention was not shared with the team until early January 2014.

Data Elements and Endpoints

The number of common labs ordered as daily, and the total number of common labs per hospital-day ordered at any frequency, for hospitalist patients were abstracted from the electronic medical record. Hospitalist patients were defined as those both admitted and discharged by a hospitalist provider. We chose to compare the 10 months prior to the intervention with the 7 months during the intervention, allowing 1 month as the intervention wash-in period. No other interventions related to lab ordering occurred during the study period. Additional variables collected included duration of hospitalization, mortality, readmission, and transfusion data. Consistency of providers between the preintervention and intervention periods was high: 2 providers were included in some of the preintervention data but not in the intervention data, as both left for other positions; all other providers were consistent between the 2 time periods.

The primary endpoint was chosen a priori as the total number of common labs ordered per hospital‐day. Additionally, we identified a priori potential confounders, including age, sex, and primary discharge diagnosis, as captured by the all‐patient refined diagnosis‐related group (APR‐DRG, hereafter DRG). DRG was chosen as a clinical risk adjustment variable because there does not exist an established method to model the effects of clinical conditions on the propensity to obtain labs, the primary endpoint. Many models used for risk adjustment in patient quality reporting use hospital mortality as the primary endpoint, not the need for laboratory testing.[30, 31] As our primary endpoint was common labs and not mortality, we chose DRG as the best single variable to model changes in the clinical case mix that might affect the number of common labs.

Secondary endpoints were also determined a priori. Out of desire to assess the patient safety implications of an intervention targeting decreased monitoring, we included hospital mortality, duration of hospitalization, and readmission as safety variables. Two secondary endpoints were obtained as possible additional efficacy endpoints to test the hypothesis that the intervention might be associated with a reduction in transfusion burden: red blood cell transfusion and transfusion volume. We also tracked the frequency with which providers ordered common labs as daily in the baseline and intervention periods, as this was the behavior targeted by the interventions.

Costs to the hospital to produce the lab studies were also considered as a secondary endpoint. Median hospital costs were obtained from the first-quarter 2013 Premier dataset, a national dataset of hospital costs (basic metabolic panel $14.69, complete blood count $11.68, comprehensive metabolic panel $18.66). Of note, the Premier data did not include cost data on what our institution calls a TPN 2, so the BMP cost was used as a substitute, given the overlap of the 2 tests' components and a desire to conservatively estimate the cost to produce. Additionally, we factored in an estimate of hospitalist and analyst time, at $150/hour and $75/hour respectively, to conduct the data abstraction and analysis and to manage the program. We did not formally factor in other costs, including electronic medical record acquisition costs.

Statistical Analyses

Descriptive statistics were used to describe the 2 cohorts. To test our primary hypothesis about the association between cohort membership and the number of common labs per patient-day, a clustered multivariable linear regression model was constructed to adjust for the a priori identified potential confounders: sex, age, and principal discharge diagnosis. Each DRG was entered as a categorical variable in the model. Clustering was employed to account for correlation of lab ordering behavior within a given hospitalist. Separate clustered multivariable models were constructed to test the association between cohort and the secondary outcomes (duration of hospitalization, readmission, mortality, transfusion frequency, and transfusion volume) using the same potential confounders. All P values were 2-sided, and P<0.05 was considered statistically significant. All analyses were conducted with Stata 11.2 (StataCorp, College Station, TX). The study was reviewed by the Swedish Health Services Clinical Research Center and determined to be nonhuman subjects research.
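A minimal numerical sketch of such a model follows. The authors used Stata 11.2 and their actual dataset; this illustrative Python/NumPy reconstruction uses simulated data, hypothetical variable names, and a basic cluster-robust "sandwich" variance estimator without finite-sample corrections.

```python
import numpy as np

# Simulated stand-in for the study data (hypothetical variables, not the authors' data)
rng = np.random.default_rng(42)
n = 400
intervention = rng.integers(0, 2, n)      # 0 = baseline cohort, 1 = intervention cohort
age = rng.normal(65, 19, n)
male = rng.integers(0, 2, n)
drg = rng.integers(0, 10, n)              # categorical discharge diagnosis (10 levels)
hospitalist = rng.integers(0, 20, n)      # clustering unit: the ordering provider
# A true effect of -0.3 labs/patient-day is built into the simulation
y = 2.0 - 0.3 * intervention + 0.002 * age + rng.normal(0, 0.3, n)

# Design matrix: intercept, intervention, age, male, DRG dummies (reference = DRG 0)
drg_dummies = (drg[:, None] == np.arange(1, 10)).astype(float)
X = np.column_stack([np.ones(n), intervention, age, male, drg_dummies])

beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS point estimates
resid = y - X @ beta

# Cluster-robust (sandwich) variance, clustering on hospitalist
XtX_inv = np.linalg.inv(X.T @ X)
meat = np.zeros((X.shape[1], X.shape[1]))
for g in np.unique(hospitalist):
    Xg, ug = X[hospitalist == g], resid[hospitalist == g]
    score = Xg.T @ ug
    meat += np.outer(score, score)
V = XtX_inv @ meat @ XtX_inv
se_intervention = np.sqrt(V[1, 1])

print(beta[1], se_intervention)   # estimated cohort effect and its clustered SE
```

With the simulated effect of -0.3, the estimated intervention coefficient recovers a value near -0.3, mirroring how the study's model estimates the adjusted change in labs per patient-day.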

RESULTS

Patient Characteristics

Patient characteristics in the before and after cohorts are shown in Table 1. Neither the proportion of male patients (44.9% vs 44.9%, P=1.0) nor the mean age (64.6 vs 64.8 years, P=0.5) differed significantly between the 2 cohorts. Interestingly, there was a significant change in the distribution of DRGs between the 2 cohorts, with each of the top 10 DRGs becoming more common in the intervention cohort. For example, the percentage of patients with sepsis or severe sepsis, DRGs 871 and 872, increased by 2.2% (8.2% vs 10.4%, P<0.01).

Patient Characteristics by Daily Lab Cohort

Characteristic: Baseline (n=7,832) vs Intervention (n=5,759); P value (a)

Age, y, mean (SD): 64.6 (19.6) vs 64.8; P=0.5
Male, n (%): 3,514 (44.9) vs 2,585 (44.9); P=1.0
Primary discharge diagnosis, DRG no., name, n (%) (b)
  871 and 872, severe sepsis: 641 (8.2) vs 599 (10.4); P<0.01
  885, psychoses: 72 (0.9) vs 141 (2.4); P<0.01
  392, esophagitis, gastroenteritis and miscellaneous intestinal disorders: 171 (2.2) vs 225 (3.9); P<0.01
  313, chest pain: 114 (1.5) vs 123 (2.1); P<0.01
  378, gastrointestinal bleed: 100 (1.3) vs 117 (2.0); P<0.01
  291, congestive heart failure and shock: 83 (1.1) vs 101 (1.8); P<0.01
  189, pulmonary edema and respiratory failure: 69 (0.9) vs 112 (1.9); P<0.01
  312, syncope and collapse: 82 (1.0) vs 119 (2.1); P<0.01
  64, intracranial hemorrhage or cerebral infarction: 49 (0.6) vs 54 (0.9); P=0.04
  603, cellulitis: 96 (1.2) vs 94 (1.6); P=0.05

NOTE: Abbreviations: DRG, diagnosis-related group; SD, standard deviation.
(a) P value determined by χ² or Student t test.
(b) Only the top 10 DRGs are listed.

Primary Endpoint

In the unadjusted comparison, 3 of the 4 common labs showed a similar decrease in the intervention cohort from the baseline (Table 2). For example, the mean number of CBCs ordered per patient-day decreased by 0.15 labs per patient-day (1.06 vs 0.91, P<0.01). The total number of common labs ordered per patient-day decreased by 0.30 labs per patient-day (2.06 vs 1.76, P<0.01) in the unadjusted analysis (Figure 1 and Table 2). Part of our hypothesis was that decreasing the number of labs ordered as daily, in an open-ended manner, would likely decrease the number of common labs obtained per day. We found that the number of labs ordered as daily decreased by 0.71 labs per patient-day (0.87±2.90 vs 0.16±1.01, P<0.01), an 81.6% decrease from the preintervention time period.
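As a quick arithmetic check, the unadjusted reductions quoted above follow directly from the reported means (a reader's verification, not part of the original analysis):

```python
# Reported means (baseline vs intervention)
cbc_drop = 1.06 - 0.91                          # CBCs per patient-day
total_drop = 2.06 - 1.76                        # all common labs per patient-day
daily_drop_pct = (0.87 - 0.16) / 0.87 * 100     # labs ordered "as daily", % decrease

print(round(cbc_drop, 2))         # 0.15
print(round(total_drop, 2))       # 0.3
print(round(daily_drop_pct, 1))   # 81.6
```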

Patient Outcomes by Daily Lab Cohort

Outcome: Baseline vs Intervention; P value (a)

Complete blood count, per patient-day, mean (SD): 1.06 (0.76) vs 0.91 (0.75); P<0.01
Basic metabolic panel, per patient-day, mean (SD): 0.68 (0.71) vs 0.55 (0.60); P<0.01
Nutrition panel, per patient-day, mean (SD) (b): 0.06 (0.24) vs 0.07 (0.32); P<0.01
Comprehensive metabolic panel, per patient-day, mean (SD): 0.27 (0.49) vs 0.23 (0.46); P<0.01
Total no. of basic labs ordered per patient-day, mean (SD): 2.06 (1.40) vs 1.76 (1.37); P<0.01
Transfused, n (%): 414 (5.3) vs 268 (4.7); P=0.1
Transfused volume, mL, mean (SD): 847.3 (644.3) vs 744.9 (472.0); P=0.02
Length of stay, days, mean (SD): 3.79 (4.58) vs 3.81 (4.50); P=0.7
Readmitted, n (%): 1,049 (13.3) vs 733 (12.7); P=0.3
Died, n (%): 173 (2.2) vs 104 (1.8); P=0.1

NOTE: Abbreviations: SD, standard deviation.
(a) P value determined by χ² or Student t test.
(b) Basic metabolic panel plus magnesium and phosphate.
Figure 1. Mean number of total basic labs ordered per day over the 10-month preintervention period (October 2012 to July 2013) and the 7-month intervention period (September 2013 to March 2014). The vertical line marks the excluded wash-in month (August 2013), when the intervention began.

In our multivariable regression model, after adjusting for sex, age, and the primary reason for admission as captured by DRG, the number of common labs ordered per day was reduced by 0.22 (95% CI, −0.34 to −0.11; P<0.01). This represents a 10.7% reduction in common labs ordered per patient-day.

Secondary Endpoints

Table 2 shows secondary outcomes of the study. Patient safety endpoints were unchanged in unadjusted analyses. For example, the hospital length of stay was similar in the baseline and intervention cohorts (3.79±4.58 vs 3.81±4.50 days, P=0.7). There was a nonsignificant reduction in the hospital mortality rate during the intervention period of 0.4% (2.2% vs 1.8%, P=0.1). No significant differences were found when the multivariable model was rerun for each of the 3 secondary safety endpoints individually: readmissions, mortality, and length of stay.

Two secondary efficacy endpoints were also evaluated. The percentage of patients receiving transfusions did not decrease in either the unadjusted or adjusted analysis. However, the volume of blood transfused per patient who received a transfusion decreased by 91.9 mL in the bivariate analysis (836.8±621.4 mL vs 744.9±472.0 mL; P=0.03) (Table 2). The decrease, however, was not significant in the multivariable model (−127.2 mL; 95% CI, −257.9 to 3.6; P=0.06).

Cost Data

Based on the Premier estimate of the cost to the hospital to perform the common lab tests, the intervention likely decreased direct costs by $16.19 per patient (95% CI, $12.95 to $19.43). The cost saving was decreased by the expense of the intervention, which is estimated to be $8000 and was driven by hospitalist and analyst time. Based on the patient volume in our health system, and factoring in the cost of implementation, we estimate that this intervention resulted in annualized savings of $151,682 (95% CI, $119,746 to $187,618).
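The savings arithmetic can be reconstructed as follows. The annual patient volume is not stated in the text; the figure of roughly 9,863 hospitalist patients per year is back-calculated from the reported numbers and should be treated as an assumption.

```python
# Reported inputs
saving_per_patient = 16.19   # $ direct cost saved per patient (point estimate)
intervention_cost = 8000.00  # $ one-time expense: hospitalist and analyst time

# Assumption: annual hospitalist patient volume, back-calculated from
# the reported $151,682 annualized savings (not stated in the article)
annual_patients = 9863

annual_savings = annual_patients * saving_per_patient - intervention_cost
print(round(annual_savings))  # 151682
```

This reproduces the reported point estimate; the confidence interval on the annualized savings follows the same calculation applied to the CI bounds of the per-patient saving.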

DISCUSSION

Ordering common labs daily is a routine practice among providers at many institutions. In fact, at our institution prior to the intervention, 42% of all common labs were ordered as daily, meaning they were obtained each day without regard to the previous value or the patient's clinical condition. The practice is one of convenience or habit and is often not clinically indicated.[5, 32]

We observed a significant reduction in the number of common labs ordered as daily and, more importantly, in the total number of common labs during the intervention period. The rapid change in provider behavior is notable and likely due to several factors. First, there was a general belief among the hospitalists in the merits of the project. Second, there may have been an aversion to the display of lower performance relative to peers in the monthly emails. Third, and perhaps most importantly, our hospitalist team had worked together for many years on projects like this, creating a culture of QI and a willingness to change practice patterns in response to data.[33]

Concerns about decreasing waste and increasing the value of healthcare abound, particularly in the United States.[1] Decreasing the cost to produce equivalent or improved health outcomes for a given episode of care has been proposed as a way to improve value.[34] This intervention resulted in modest waste reduction, the benefits of which are readily apparent in a DRG-based reimbursement model, where the hospital realizes any saving in the cost of producing a hospital stay, as well as in a total-cost-of-care environment such as an Accountable Care Organization.

The previous work in the field of lab reduction has all been performed at university‐affiliated academic institutions. We demonstrated that the QI tactics described in the literature can be successfully employed in a community‐based hospitalist practice. This has broad applicability to increasing the value of healthcare and could serve as a model for future community‐based hospitalist QI projects.

The study has several limitations. First, the follow-up period was only 7 months, and although adoption of the intervention was rapid and effective, provider behavior may regress to previous practice patterns over time. Second, the simple before-after design raises the possibility that environmental influences, rather than the intervention, drove the changes in ordering behavior. Most notably, the Choosing Wisely recommendations for hospitalists were published in September 2013, coinciding with our intervention period,[22] and the reduction in the number of labs ordered may have been partly a result of these recommendations. Third, the 2 cohorts covered different times of the year and, as the shift in the DRG distribution indicates, likely differed in the composition of diagnoses treated. To address this we adjusted for DRG, but there may have been residual confounding, as some diagnoses may be managed with more laboratory tests than others in ways not fully captured by our model. Fourth, the intervention was made possible by the substantial and ongoing investments that our health system has made in our electronic medical record and data analytics capability; the variability of these resources across institutions limits generalizability. Fifth, although we used the QI tools described, we did not create a formal process map or utilize other Lean or Six Sigma tools. As the healthcare industry continues its journey toward high reliability, use of these tools will hopefully become more widespread. We demonstrated that even with these simple tactics, significant progress can be made.

Finally, there exists a concern that decreasing regular laboratory monitoring might be associated with undetected worsening in the patient's clinical status. We did not observe any significant adverse effects on coarse measures of clinical performance, including length of stay, readmission rate, or mortality. However, we did not collect data on all clinical parameters, and it is possible that there could have been an undetected effect on incident renal failure or hemodialysis or intensive care unit transfer. Other studies on this type of intervention have evaluated some of these possible adverse outcomes and have not noted an association.[12, 15, 18, 20, 22] Future studies should evaluate harms associated with implementation of Choosing Wisely and other interventions targeted at waste reduction. Future work is also needed to disseminate more formal and rigorous QI tools and methodologies.

CONCLUSION

We implemented a multifaceted QI intervention including provider education, transparent display of data, and audit and feedback that was associated with a significant reduction in the number of common labs ordered in a large community‐based hospitalist group, without evidence of harm. Further study is needed to understand how hospitalist groups can optimally decrease waste in healthcare.

Disclosures

This work was performed at the Swedish Health System, Seattle, Washington. Dr. Corson served as primary author, designed the study protocol, obtained the data, analyzed all the data and wrote the manuscript and its revisions, and approved the final version of the manuscript. He attests that no undisclosed authors contributed to the manuscript. Dr. Fan designed the study protocol, reviewed the manuscript, and approved the final version of the manuscript. Mr. White reviewed the study protocol, obtained the study data, reviewed the manuscript, and approved the final version of the manuscript. Sean D. Sullivan, PhD, designed the study protocol, obtained study data, reviewed the manuscript, and approved the final version of the manuscript. Dr. Asakura designed the study protocol, reviewed the manuscript, and approved the final version of the manuscript. Dr. Myint reviewed the study protocol and data, reviewed the manuscript, and approved the final version of the manuscript. Dr. Dale designed the study protocol, analyzed the data, reviewed the manuscript, and approved the final version of the manuscript. The authors report no conflicts of interest.

Files
References
  1. Berwick D. Eliminating “waste” in health care. JAMA. 2012;307(14):15131516.
  2. Squires DA. The U.S. health system in perspective: a comparison of twelve industrialized nations. Issue Brief (Commonw Fund). 2011;16:114.
  3. DeKay ML, Asch DA. Is the defensive use of diagnostic tests good for patients, or bad? Med Decis Mak. 1998;18(1):1928.
  4. Epstein AM, McNeil BJ. Physician characteristics and organizational factors influencing use of ambulatory tests. Med Decis Making. 1985;5:401415.
  5. Salinas M, Lopez‐Garrigos M, Uris J; Pilot Group of the Appropriate Utilization of Laboratory Tests (REDCONLAB) Working Group. Differences in laboratory requesting patterns in emergency department in Spain. Ann Clin Biochem. 2013;50:353359.
  6. Wong P, Intragumtornchai T. Hospital‐acquired anemia. J Med Assoc Thail. 2006;89(1):6367.
  7. Thavendiranathan P, Bagai A, Ebidia A, Detsky AS, Choudhry NK. Do blood tests cause anemia in hospitalized patients? The effect of diagnostic phlebotomy on hemoglobin and hematocrit levels. J Gen Intern Med. 2005;20(6):520524.
  8. Smoller BR, Kruskall MS. Phlebotomy for diagnostic laboratory tests in adults. Pattern of use and effect on transfusion requirements. N Engl J Med. 1986;314(19):12331235.
  9. Salisbury AC, Reid KJ, Alexander KP, et al. Diagnostic blood loss from phlebotomy and hospital‐acquired anemia during acute myocardial infarction. Arch Intern Med. 2011;171(18):16461653.
  10. Koch CG, Li L, Sun Z, et al. Hospital‐acquired anemia: prevalence, outcomes, and healthcare implications. J Hosp Med. 2013;8(9):506512.
  11. Howanitz PJ, Cembrowski GS, Bachner P. Laboratory phlebotomy. College of American Pathologists Q‐Probe study of patient satisfaction and complications in 23,783 patients. Arch Pathol Lab Med. 1991;115:867872.
  12. Attali M, Barel Y, Somin M, et al. A cost‐effective method for reducing the volume of laboratory tests in a university‐associated teaching hospital. Mt Sinai J Med. 2006;73(5):787794.
  13. Bareford D, Hayling A. Inappropriate use of laboratory services: long term combined approach to modify request patterns. BMJ. 1990;301(6764):13051307.
  14. Bunting PS, Walraven C. Effect of a controlled feedback intervention on laboratory test ordering by community physicians. Clin Chem. 2004;50(2):321326.
  15. Calderon‐Margalit R, Mor‐Yosef S, Mayer M, Adler B, Shapira SC. An administrative intervention to improve the utilization of laboratory tests within a university hospital. Int J Qual Heal Care. 2005;17(3):243248.
  16. Critique SI. Surgical vampires and rising health care expenditure. Arch Surg. 2011;146(5):524527.
  17. Fowkes FG, Hall R, Jones JH, et al. Trial of strategy for reducing the use of laboratory tests. Br Med J (Clin Res Ed). 1986;292(6524):883-885.
  18. Kroenke K, Hanley JF, Copley JB, et al. Improving house staff ordering of three common laboratory tests. Reductions in test ordering need not result in underutilization. Med Care. 1987;25(10):928-935.
  19. May TA, Clancy M, Critchfield J, et al. Reducing unnecessary inpatient laboratory testing in a teaching hospital. Am J Clin Pathol. 2006;126(2):200-206.
  20. Neilson EG, Johnson KB, Rosenbloom ST, et al. Improving patient care: the impact of peer management on test-ordering behavior. Ann Intern Med. 2004;141(3):196-204.
  21. Novich M, Gillis L, Tauber AI. The laboratory test justified. An effective means to reduce routine laboratory testing. Am J Clin Pathol. 1985;86(6):756-759.
  22. Bulger J, Nickel W, Messler J, et al. Choosing wisely in adult hospital medicine: five opportunities for improved healthcare value. J Hosp Med. 2013;8(9):486-492.
  23. Dale C. Quality improvement in the intensive care unit. In: Scales DC, Rubenfeld GD, eds. The Organization of Critical Care. New York, NY: Humana Press; 2014:279.
  24. Curtis JR, Cook DJ, Wall RJ, et al. Intensive care unit quality improvement: a "how-to" guide for the interdisciplinary team. Crit Care Med. 2006;34:211-218.
  25. Pronovost PJ. Navigating adaptive challenges in quality improvement. BMJ Qual Saf. 2011;20(7):560-563.
  26. Scales DC, Dainty K, Hales B, et al. A multifaceted intervention for quality improvement in a network of intensive care units: a cluster randomized trial. JAMA. 2011;305:363-372.
  27. O'Neill SM. How do quality improvement interventions succeed? Archetypes of success and failure. Available at: http://www.rand.org/pubs/rgs_dissertations/RGSD282.html. Published 2011.
  28. Berwanger O, Guimarães HP, Laranjeira LN, et al. Effect of a multifaceted intervention on use of evidence-based therapies in patients with acute coronary syndromes in Brazil: the BRIDGE-ACS randomized trial. JAMA. 2012;307:2041-2049.
  29. Ivers N, Jamtvedt G, Flottorp S, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;6:CD000259.
  30. Glance LG, Osler TM, Mukamel DB, Dick AW. Impact of the present-on-admission indicator on hospital quality measurement: experience with the Agency for Healthcare Research and Quality (AHRQ) Inpatient Quality Indicators. Med Care. 2008;46:112-119.
  31. Pine M, Jordan HS, Elixhauser A, et al. Enhancement of claims data to improve risk adjustment of hospital mortality. JAMA. 2007;297:71-76.
  32. Salinas M, López-Garrigós M, Tormo C, Uris J. Primary care use of laboratory tests in Spain: measurement through appropriateness indicators. Clin Lab. 2014;60(3):483-490.
  33. Curry LA, Spatz E, Cherlin E, et al. What distinguishes top-performing hospitals in acute myocardial infarction mortality rates? A qualitative study. Ann Intern Med. 2011;154(6):384-390.
  34. Porter ME. What is value in health care? N Engl J Med. 2010;363(26):2477-2481.
Journal of Hospital Medicine. 10(6):390-395.

Waste in US healthcare is a public health threat, with an estimated value of $910 billion per year.[1] It accounts for part of the relatively high per-discharge healthcare spending seen in the United States compared with other nations.[2] Waste takes many forms, one of which is excessive use of diagnostic laboratory testing.[1] Many hospital providers obtain common labs, such as complete blood counts (CBCs) and basic metabolic panels (BMPs), in an open-ended, daily manner for their hospitalized patients, without regard for the patient's clinical condition or despite stability of the previous results. Reasons for ordering these tests in a nonpatient-centered manner include provider convenience (such as inclusion in an order set), ease of access, habit, or defensive practice.[3, 4, 5] All of these reasons may represent waste.

Although the potential waste of routine daily labs may seem small, the frequency with which they are ordered results in a substantial real and potential cost, both financially and clinically. Multiple studies have shown a link between excessive diagnostic phlebotomy and hospital‐acquired anemia.[6, 7, 8, 9] Hospital‐acquired anemia itself has been associated with increased mortality.[10] In addition to blood loss and financial cost, patient experience and satisfaction are also detrimentally affected by excessive laboratory testing in the form of pain and inconvenience from the act of phlebotomy.[11]

There are many reports of strategies to decrease excessive diagnostic laboratory testing as a means of addressing this waste in the inpatient setting.[12, 13, 14, 15, 16, 17, 18, 19, 20, 21] All of these studies took place in traditional academic settings, and many implemented their intervention through a computer-based order entry system. In our literature search on this topic, we found no examples of studies conducted within community-based hospitalist practices. More recently, this issue was highlighted as part of the Choosing Wisely campaign sponsored by the American Board of Internal Medicine Foundation, Consumer Reports, and more than 60 specialty societies. The Society of Hospital Medicine, the professional society for hospitalists, recommended avoidance of repetitive common laboratory testing in the face of clinical stability.[22]

Much has been written about quality improvement (QI) by the Institute for Healthcare Improvement, the Society of Hospital Medicine, and others.[23, 24, 25] How best to move from a Choosing Wisely recommendation to highly reliable incorporation in clinical practice in a community setting is not known and likely varies depending upon the care environment. Successful QI interventions are often multifaceted and include academic detailing and provider education, transparent display of data, and regular audit and feedback of performance data.[26, 27, 28, 29] Prior to the publication of the Society of Hospital Medicine's Choosing Wisely recommendations, we chose to implement the recommendation to decrease ordering of daily labs using 3 QI strategies in our community 4-hospital health system.

METHODS

Study Participants

This activity was undertaken as a QI initiative by Swedish Hospital Medicine (SHM), a 53‐provider employed hospitalist group that staffs a total of 1420 beds across 4 inpatient facilities. SHM has a longstanding record of working together as a team on QI projects.

An informal preliminary audit of our common lab ordering by a member of the study team revealed multiple examples of labs ordered every day without medical-record evidence of intervention or management decisions being made based on the results. This preliminary activity raised the notion within the hospitalist group that this was a topic ripe for intervention and improvement. Four common labs, CBC, BMP, nutrition panel (called TPN 2 in our system, consisting of a BMP plus magnesium and phosphorus), and comprehensive metabolic panel (BMP plus liver function tests), formed the bulk of the repetitively ordered labs and were the focus of our activity. We excluded prothrombin time/International Normalized Ratio, as it was less clear that obtaining these tests daily represented waste. We then reviewed the medical literature for successful QI strategies and chose academic detailing, transparent display of data, and audit and feedback as our QI tactics.[29]

Using data from our electronic medical record, we chose a convenience preintervention period of 10 months for our baseline data. We allowed for a 1-month wash-in period in August 2013, and a convenience period of 7 months was chosen as the intervention period.

Intervention

An introductory email was sent out in mid‐August 2013 to all hospitalist providers describing the waste and potential harm to patients associated with unnecessary common blood tests, in particular those ordered as daily. The email recommended 2 changes: (1) immediate cessation of the practice of ordering common labs as daily, in an open, unending manner and (2) assessing the need for common labs in the next 24 hours, and ordering based on that need, but no further into the future.

Hospitalist providers were additionally informed that the number of common labs ordered daily would be tracked prospectively, with monthly reporting of individual provider ordering. In addition, the 5 members of the hospitalist team who most frequently ordered common labs as daily during January 2013 to March 2013 were sent individual emails informing them of their top‐5 position.

During the 7‐month intervention period, a monthly email was sent to all members of the hospitalist team with 4 basic components: (1) reiteration of the recommendations and reasoning stated in the original email; (2) a list of all members of the hospitalist team and the corresponding frequency of common labs ordered as daily (open ended) per provider for the month; (3) a recommendation to discontinue any common labs ordered as daily; and (4) at least 1 example of a patient cared for during the month by the hospitalist team, who had at least 1 common lab ordered for at least 5 days in a row, with no mention of the results in the progress notes and no apparent contribution to the management of the medical conditions for which the patient was being treated.

The change in number of tests ordered during the intervention was not shared with the team until early January 2014.

Data Elements and Endpoints

Number of common labs ordered as daily, and the total number of common labs per hospital‐day, ordered by any frequency, on hospitalist patients were abstracted from the electronic medical record. Hospitalist patients were defined as those both admitted and discharged by a hospitalist provider. We chose to compare the 10 months prior to the intervention with the 7 months during the intervention, allowing 1 month as the intervention wash‐in period. No other interventions related to lab ordering occurred during the study period. Additional variables collected included duration of hospitalization, mortality, readmission, and transfusion data. Consistency of providers in the preintervention and intervention period was high. Two providers were included in some of the preintervention data, but were not included in the intervention data, as they both left for other positions. Otherwise, all other providers in the data were consistent between the 2 time periods.
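As a sketch of how the primary measure could be computed from EMR order extracts, the following uses a toy, hypothetical record layout (the patient IDs, lab names, and field structure are illustrative, not the actual Swedish Health Services schema):

```python
from collections import defaultdict

# Hypothetical EMR extract: one row per common-lab order.
orders = [  # (patient_id, lab_type)
    ("p1", "CBC"), ("p1", "BMP"), ("p1", "CBC"),
    ("p2", "CBC"), ("p2", "CMP"),
]
hospital_days = {"p1": 2, "p2": 3}               # duration of hospitalization
cohort = {"p1": "baseline", "p2": "intervention"}

def labs_per_patient_day(orders, hospital_days, cohort):
    """Mean number of common labs ordered per hospital-day, by cohort."""
    n_labs = defaultdict(int)
    for pid, _lab in orders:                     # count labs per patient
        n_labs[pid] += 1
    rates = defaultdict(list)
    for pid, days in hospital_days.items():      # normalize by hospital-days
        rates[cohort[pid]].append(n_labs[pid] / days)
    return {c: sum(v) / len(v) for c, v in rates.items()}

rates = labs_per_patient_day(orders, hospital_days, cohort)
print(rates)  # {'baseline': 1.5, 'intervention': 0.666...}
```

With the toy data, the baseline patient had 3 labs over 2 hospital-days (1.5 labs per patient-day) and the intervention patient had 2 labs over 3 days.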

The primary endpoint was chosen a priori as the total number of common labs ordered per hospital‐day. Additionally, we identified a priori potential confounders, including age, sex, and primary discharge diagnosis, as captured by the all‐patient refined diagnosis‐related group (APR‐DRG, hereafter DRG). DRG was chosen as a clinical risk adjustment variable because there does not exist an established method to model the effects of clinical conditions on the propensity to obtain labs, the primary endpoint. Many models used for risk adjustment in patient quality reporting use hospital mortality as the primary endpoint, not the need for laboratory testing.[30, 31] As our primary endpoint was common labs and not mortality, we chose DRG as the best single variable to model changes in the clinical case mix that might affect the number of common labs.

Secondary endpoints were also determined a priori. Out of desire to assess the patient safety implications of an intervention targeting decreased monitoring, we included hospital mortality, duration of hospitalization, and readmission as safety variables. Two secondary endpoints were obtained as possible additional efficacy endpoints to test the hypothesis that the intervention might be associated with a reduction in transfusion burden: red blood cell transfusion and transfusion volume. We also tracked the frequency with which providers ordered common labs as daily in the baseline and intervention periods, as this was the behavior targeted by the interventions.

Costs to the hospital to produce the lab studies were also considered as a secondary endpoint. Median hospital costs were obtained from the first-quarter, 2013 Premier dataset, a national dataset of hospital costs (basic metabolic panel $14.69, complete blood count $11.68, comprehensive metabolic panel $18.66). Of note, the Premier data did not include cost data on what our institution calls a TPN 2, and BMP cost was used as a substitute, given the overlap of the 2 tests' components and a desire to conservatively estimate the effects on cost to produce. Additionally, we factored in an estimate of hospitalist and analyst time at $150/hour and $75/hour, respectively, to conduct the data abstraction and analysis and to manage the program. We did not formally factor in other costs, including electronic medical record acquisition costs.

Statistical Analyses

Descriptive statistics were used to describe the 2 cohorts. To test our primary hypothesis about the association between cohort membership and number of common labs per patient-day, a clustered multivariable linear regression model was constructed to adjust for the a priori identified potential confounders, including sex, age, and principal discharge diagnosis. Each DRG was entered as a categorical variable in the model. Clustering was employed to account for correlation of lab ordering behavior by a given hospitalist. Separate clustered multivariable models were constructed to test the association between cohort and secondary outcomes, including duration of hospitalization, readmission, mortality, transfusion frequency, and transfusion volume, using the same potential confounders. All P values were 2-sided, and a P value <0.05 was considered statistically significant. All analyses were conducted with Stata 11.2 (StataCorp, College Station, TX). The study was reviewed by the Swedish Health Services Clinical Research Center and determined to be nonhuman subjects research.
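A minimal sketch of this kind of clustered analysis, on synthetic data, is shown below. The paper used Stata; this Python/NumPy version hand-rolls ordinary least squares with a cluster-robust (sandwich) variance estimator clustered on the hospitalist. All variable names and simulated effect sizes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
provider = rng.integers(0, 20, n)                 # 20 hospitalists (clusters)
intervention = (rng.random(n) < 0.5).astype(float)
age = rng.normal(65, 19, n)
# Synthetic outcome: ~0.3 fewer labs/day in the intervention cohort, plus a
# provider-level random effect that induces within-cluster correlation.
labs_per_day = (2.05 - 0.30 * intervention + 0.002 * age
                + rng.normal(0, 0.2, 20)[provider] + rng.normal(0, 0.5, n))

X = np.column_stack([np.ones(n), intervention, age])
beta, *_ = np.linalg.lstsq(X, labs_per_day, rcond=None)   # OLS fit
resid = labs_per_day - X @ beta

# Cluster-robust (sandwich) variance: bread @ meat @ bread
bread = np.linalg.inv(X.T @ X)
meat = np.zeros((X.shape[1], X.shape[1]))
for g in np.unique(provider):
    Xg, eg = X[provider == g], resid[provider == g]
    sg = Xg.T @ eg                                # cluster score
    meat += np.outer(sg, sg)
se = np.sqrt(np.diag(bread @ meat @ bread))
print(f"intervention effect: {beta[1]:.2f} (cluster-robust SE {se[1]:.2f})")
```

When ordering behavior is correlated within a hospitalist's patients, the cluster-robust standard error is typically larger than the classical OLS one, which is the rationale for clustering on provider here.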

RESULTS

Patient Characteristics

Patient characteristics in the before and after cohorts are shown in Table 1. Neither the proportion of male patients (44.9% vs 44.9%, P=1.0) nor the mean age (64.6 vs 64.8 years, P=0.5) differed significantly between the 2 cohorts. Interestingly, there was a significant change in the distribution of DRGs between the 2 cohorts, with each of the top 10 DRGs becoming more common in the intervention cohort. For example, the percentage of patients with sepsis or severe sepsis, DRGs 871 and 872, increased by 2.2% (8.2% vs 10.4%, P<0.01).

Table 1. Patient Characteristics by Daily Lab Cohort

                                              Baseline,       Intervention,
                                              n=7,832         n=5,759         P Value(a)
Age, y, mean (SD)                             64.6 (19.6)     64.8            0.5
Male, n (%)                                   3,514 (44.9)    2,585 (44.9)    1.0
Primary discharge diagnosis, DRG no., name, n (%)(b)
  871 and 872, severe sepsis                  641 (8.2)       599 (10.4)      <0.01
  885, psychoses                              72 (0.9)        141 (2.4)       <0.01
  392, esophagitis, gastroenteritis, and
    miscellaneous intestinal disorders        171 (2.2)       225 (3.9)       <0.01
  313, chest pain                             114 (1.5)       123 (2.1)       <0.01
  378, gastrointestinal bleed                 100 (1.3)       117 (2.0)       <0.01
  291, congestive heart failure and shock     83 (1.1)        101 (1.8)       <0.01
  189, pulmonary edema and respiratory
    failure                                   69 (0.9)        112 (1.9)       <0.01
  312, syncope and collapse                   82 (1.0)        119 (2.1)       <0.01
  64, intracranial hemorrhage or cerebral
    infarction                                49 (0.6)        54 (0.9)        0.04
  603, cellulitis                             96 (1.2)        94 (1.6)        0.05

NOTE: Abbreviations: DRG, diagnosis-related group; SD, standard deviation.
(a) P value determined by χ2 or Student t test.
(b) Only the top 10 DRGs are listed.

Primary Endpoint

In the unadjusted comparison, 3 of the 4 common labs showed a similar decrease in the intervention cohort from baseline (Table 2). For example, the mean number of CBCs ordered per patient-day decreased by 0.15 labs per patient-day (1.06 vs 0.91, P<0.01). The total number of common labs ordered per patient-day decreased by 0.30 labs per patient-day (2.06 vs 1.76, P<0.01) in the unadjusted analysis (Figure 1 and Table 2). Part of our hypothesis was that decreasing the number of labs ordered as daily, in an open-ended manner, would likely decrease the number of common labs obtained per day. We found that the number of labs ordered as daily decreased by 0.71 labs per patient-day (0.87±2.90 vs 0.16±1.01, P<0.01), an 81.6% decrease from the preintervention time period.
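The reported relative reduction follows directly from the two means, as a quick arithmetic check shows:

```python
# Check the relative reduction in daily-ordered labs from the two means
# reported above (0.87 vs 0.16 labs per patient-day).
before, after = 0.87, 0.16
absolute_drop = before - after            # 0.71 labs per patient-day
relative_drop = absolute_drop / before    # ~0.816, i.e. an 81.6% decrease
print(f"{absolute_drop:.2f} labs/patient-day ({relative_drop:.1%} decrease)")
```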

Table 2. Patient Outcomes by Daily Lab Cohort

                                                    Baseline        Intervention    P Value(a)
Complete blood count, per patient-day, mean (SD)    1.06 (0.76)     0.91 (0.75)     <0.01
Basic metabolic panel, per patient-day, mean (SD)   0.68 (0.71)     0.55 (0.60)     <0.01
Nutrition panel, mean (SD)(b)                       0.06 (0.24)     0.07 (0.32)     <0.01
Comprehensive metabolic panel, per patient-day,
  mean (SD)                                         0.27 (0.49)     0.23 (0.46)     <0.01
Total no. of basic labs ordered per patient-day,
  mean (SD)                                         2.06 (1.40)     1.76 (1.37)     <0.01
Transfused, n (%)                                   414 (5.3)       268 (4.7)       0.1
Transfused volume, mL, mean (SD)                    847.3 (644.3)   744.9 (472.0)   0.02
Length of stay, d, mean (SD)                        3.79 (4.58)     3.81 (4.50)     0.7
Readmitted, n (%)                                   1,049 (13.3)    733 (12.7)      0.3
Died, n (%)                                         173 (2.2)       104 (1.8)       0.1

NOTE: Abbreviations: SD, standard deviation.
(a) P value determined by χ2 or Student t test.
(b) Basic metabolic panel plus magnesium and phosphate.
Figure 1. Mean number of total basic labs ordered per day over the 10 months of the preintervention period (October 2012 to July 2013) and the 7 months of the intervention period (September 2013 to March 2014). The vertical line denotes the omitted wash-in month during which the intervention began (August 2013).

In our multivariable regression model, after adjusting for sex, age, and the primary reason for admission as captured by DRG, the number of common labs ordered per day was reduced by 0.22 (95% CI, −0.34 to −0.11; P<0.01). This represents a 10.7% reduction in common labs ordered per patient-day.

Secondary Endpoints

Table 2 shows the secondary outcomes of the study. Patient safety endpoints were unchanged in unadjusted analyses. For example, hospital length of stay in days was similar in the baseline and intervention cohorts (3.78±4.58 vs 3.81±4.50, P=0.7). There was a nonsignificant reduction in the hospital mortality rate during the intervention period of 0.4% (2.2% vs 1.8%, P=0.1). No significant differences were found when the multivariable model was rerun for each of the 3 secondary endpoints individually: readmissions, mortality, and length of stay.

Two secondary efficacy endpoints were also evaluated. The percentage of patients receiving transfusions did not decrease in either the unadjusted or adjusted analysis. However, the volume of blood transfused per patient who received a transfusion decreased by 91.9 mL in the bivariate analysis (836.8±621.4 mL vs 744.9±472.0 mL; P=0.03) (Table 2). The decrease, however, was not significant in the multivariable model (−127.2 mL; 95% CI, −257.9 to 3.6; P=0.06).

Cost Data

Based on the Premier estimate of the cost to the hospital to perform the common lab tests, the intervention likely decreased direct costs by $16.19 per patient (95% CI, $12.95 to $19.43). The cost saving was decreased by the expense of the intervention, which is estimated to be $8000 and was driven by hospitalist and analyst time. Based on the patient volume in our health system, and factoring in the cost of implementation, we estimate that this intervention resulted in annualized savings of $151,682 (95% CI, $119,746 to $187,618).
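The annualized figure can be approximately reconstructed from the numbers reported in the text. In the sketch below, the annual patient volume is an assumption inferred from the 7-month intervention cohort (n=5,759), so the result only approximates, rather than exactly reproduces, the published estimate:

```python
# Back-of-the-envelope reconstruction of the annualized savings estimate.
# Per-patient saving and program cost come from the text; the annual
# volume is an assumption extrapolated from the intervention cohort.
saving_per_patient = 16.19             # $ direct cost saved per patient
program_cost = 8000.0                  # $ hospitalist and analyst time
annual_volume = 5759 * 12 / 7          # ~9,873 hospitalist patients/year
annualized_savings = saving_per_patient * annual_volume - program_cost
print(f"${annualized_savings:,.0f}")   # close to the reported $151,682
```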

DISCUSSION

Ordering common labs daily is a routine practice among providers at many institutions. In fact, at our institution, prior to the intervention, 42% of all common labs were ordered as daily, meaning they were obtained each day without regard to the previous value or the patient's clinical condition. The practice is one of convenience or habit, and many times not clinically indicated.[5, 32]

We observed a significant reduction in the number of common labs ordered as daily and, more importantly, in the total number of common labs during the intervention period. The rapid change in provider behavior is notable and likely due to several factors. First, there was general agreement among the hospitalists about the merits of the project. Second, there may have been an aversion to the display of lower performance relative to peers in the monthly emails. Third, and perhaps most importantly, our hospitalist team had worked together for many years on projects like this, creating a culture of QI and a willingness to change practice patterns in response to data.[33]

Concerns about decreasing waste and increasing the value of healthcare abound, particularly in the United States.[1] Decreasing the cost to produce equivalent or improved health outcomes for a given episode of care has been proposed as a way to improve value.[34] This intervention produced a modest waste reduction, the benefits of which are readily apparent in a DRG-based reimbursement model, where the hospital realizes any saving in the cost of producing a hospital stay, as well as in a total-cost-of-care environment, such as an Accountable Care Organization.

The previous work in the field of lab reduction has all been performed at university‐affiliated academic institutions. We demonstrated that the QI tactics described in the literature can be successfully employed in a community‐based hospitalist practice. This has broad applicability to increasing the value of healthcare and could serve as a model for future community‐based hospitalist QI projects.

The study has several limitations. First, the length of follow-up is only 7 months, and although there was rapid and effective adoption of the intervention, provider behavior may regress to previous practice patterns over time. Second, the simple before-after nature of our trial design raises the possibility that environmental influences exist and that changes in ordering behavior may have been the result of something other than the intervention. Most notably, the Choosing Wisely recommendation for hospitalists was published in September 2013, coinciding with our intervention period.[22] The reduction in the number of labs ordered may have been partially a result of these recommendations. Third, the 2 cohorts covered different times of the year and, as the distribution of DRGs indicates, likely differed in the composition of diagnoses being treated. To address this we adjusted for DRG, but there may have been some residual confounding, as some diagnoses may be managed with more laboratory tests than others in a way that was not fully adjusted for in our model. Fourth, the intervention was made possible because of the substantial and ongoing investments that our health system has made in our electronic medical record and data analytics capability. The variability of these resources across institutions limits generalizability. Fifth, although we used the QI tools described, we did not create a formal process map or utilize other Lean or Six Sigma tools. As the healthcare industry continues on its journey to high reliability, use of these tools will hopefully become more widespread. We demonstrated that even with these simple tactics, significant progress can be made.

Finally, there exists a concern that decreasing regular laboratory monitoring might be associated with undetected worsening in the patient's clinical status. We did not observe any significant adverse effects on coarse measures of clinical performance, including length of stay, readmission rate, or mortality. However, we did not collect data on all clinical parameters, and it is possible that there could have been an undetected effect on incident renal failure or hemodialysis or intensive care unit transfer. Other studies on this type of intervention have evaluated some of these possible adverse outcomes and have not noted an association.[12, 15, 18, 20, 22] Future studies should evaluate harms associated with implementation of Choosing Wisely and other interventions targeted at waste reduction. Future work is also needed to disseminate more formal and rigorous QI tools and methodologies.

CONCLUSION

We implemented a multifaceted QI intervention including provider education, transparent display of data, and audit and feedback that was associated with a significant reduction in the number of common labs ordered in a large community‐based hospitalist group, without evidence of harm. Further study is needed to understand how hospitalist groups can optimally decrease waste in healthcare.

Disclosures

This work was performed at the Swedish Health System, Seattle, Washington. Dr. Corson served as primary author, designed the study protocol, obtained the data, analyzed all the data and wrote the manuscript and its revisions, and approved the final version of the manuscript. He attests that no undisclosed authors contributed to the manuscript. Dr. Fan designed the study protocol, reviewed the manuscript, and approved the final version of the manuscript. Mr. White reviewed the study protocol, obtained the study data, reviewed the manuscript, and approved the final version of the manuscript. Sean D. Sullivan, PhD, designed the study protocol, obtained study data, reviewed the manuscript, and approved the final version of the manuscript. Dr. Asakura designed the study protocol, reviewed the manuscript, and approved the final version of the manuscript. Dr. Myint reviewed the study protocol and data, reviewed the manuscript, and approved the final version of the manuscript. Dr. Dale designed the study protocol, analyzed the data, reviewed the manuscript, and approved the final version of the manuscript. The authors report no conflicts of interest.

Waste in US healthcare is a public health threat, with an estimated value of $910 billion per year.[1] It constitutes some of the relatively high per‐discharge healthcare spending seen in the United States when compared to other nations.[2] Waste takes many forms, one of which is excessive use of diagnostic laboratory testing.[1] Many hospital providers obtain common labs, such as complete blood counts (CBCs) and basic metabolic panels (BMPs), in an open‐ended, daily manner for their hospitalized patients, without regard for the patient's clinical condition or despite stability of the previous results. Reasons for ordering these tests in a nonpatient‐centered manner include provider convenience (such as inclusion in an order set), ease of access, habit, or defensive practice.[3, 4, 5] All of these reasons may represent waste.

Although the potential waste of routine daily labs may seem small, the frequency with which they are ordered results in a substantial real and potential cost, both financially and clinically. Multiple studies have shown a link between excessive diagnostic phlebotomy and hospital‐acquired anemia.[6, 7, 8, 9] Hospital‐acquired anemia itself has been associated with increased mortality.[10] In addition to blood loss and financial cost, patient experience and satisfaction are also detrimentally affected by excessive laboratory testing in the form of pain and inconvenience from the act of phlebotomy.[11]

There are many reports of strategies to decrease excessive diagnostic laboratory testing as a means of addressing this waste in the inpatient setting.[12, 13, 14, 15, 16, 17, 18, 19, 20, 21] All of these studies have taken place in a traditional academic setting, and many implemented their intervention through a computer‐based order entry system. Based on the literature search regarding this topic, we found no examples of studies conducted among and within community‐based hospitalist practices. More recently, this issue was highlighted as part of the Choosing Wisely campaign sponsored by the American Board of Internal Medicine Foundation, Consumer Reports, and more than 60 specialty societies. The Society of Hospital Medicine, the professional society for hospitalists, recommended avoidance of repetitive common laboratory testing in the face of clinical stability.[22]

Much has been written about quality improvement (QI) by the Institute for Healthcare Improvement, the Society of Hospitalist Medicine, and others.[23, 24, 25] How best to move from a Choosing Wisely recommendation to highly reliable incorporation in clinical practice in a community setting is not known and likely varies depending upon the care environment. Successful QI interventions are often multifaceted and include academic detailing and provider education, transparent display of data, and regular audit and feedback of performance data.[26, 27, 28, 29] Prior to the publication of the Society of Hospital Medicine's Choosing Wisely recommendations, we chose to implement the recommendation to decrease ordering of daily labs using 3 QI strategies in our community 4‐hospital health system.

METHODS

Study Participants

This activity was undertaken as a QI initiative by Swedish Hospital Medicine (SHM), a 53‐provider employed hospitalist group that staffs a total of 1420 beds across 4 inpatient facilities. SHM has a longstanding record of working together as a team on QI projects.

An informal preliminary audit of our common lab ordering by a member of the study team revealed multiple examples of labs ordered every day without medical‐record evidence of intervention or management decisions being made based on the results. This preliminary activity raised the notion within the hospitalist group that this was a topic ripe for intervention and improvement. Four common labs, CBC, BMP, nutrition panel (called TPN 2 in our system, consisting of a BMP and magnesium and phosphorus) and comprehensive metabolic panel (BMP and liver function tests), formed the bulk of the repetitively ordered labs and were the focus of our activity. We excluded prothrombin time/International Normalized Ratio, as it was less clear that obtaining these daily clearly represented waste. We then reviewed medical literature for successful QI strategies and chose academic detailing, transparent display of data, and audit and feedback as our QI tactics.[29]

Using data from our electronic medical record, we chose a convenience preintervention period of 10 months for our baseline data. We allowed for a 2‐month wash‐in period in August 2013, and a convenience period of 7 months was chosen as the intervention period.

Intervention

An introductory email was sent out in mid‐August 2013 to all hospitalist providers describing the waste and potential harm to patients associated with unnecessary common blood tests, in particular those ordered as daily. The email recommended 2 changes: (1) immediate cessation of the practice of ordering common labs as daily, in an open, unending manner and (2) assessing the need for common labs in the next 24 hours, and ordering based on that need, but no further into the future.

Hospitalist providers were additionally informed that the number of common labs ordered daily would be tracked prospectively, with monthly reporting of individual provider ordering. In addition, the 5 members of the hospitalist team who most frequently ordered common labs as daily during January 2013 to March 2013 were sent individual emails informing them of their top‐5 position.

During the 7‐month intervention period, a monthly email was sent to all members of the hospitalist team with 4 basic components: (1) reiteration of the recommendations and reasoning stated in the original email; (2) a list of all members of the hospitalist team and the corresponding frequency of common labs ordered as daily (open ended) per provider for the month; (3) a recommendation to discontinue any common labs ordered as daily; and (4) at least 1 example of a patient cared for during the month by the hospitalist team, who had at least 1 common lab ordered for at least 5 days in a row, with no mention of the results in the progress notes and no apparent contribution to the management of the medical conditions for which the patient was being treated.

The change in number of tests ordered during the intervention was not shared with the team until early January 2014.

Data Elements and Endpoints

The number of common labs ordered as daily, and the total number of common labs ordered at any frequency per hospital-day, were abstracted from the electronic medical record for hospitalist patients, defined as those both admitted and discharged by a hospitalist provider. We chose to compare the 10 months prior to the intervention with the 7 months during the intervention, allowing 1 month as the intervention wash-in period. No other interventions related to lab ordering occurred during the study period. Additional variables collected included duration of hospitalization, mortality, readmission, and transfusion data. Provider consistency between the preintervention and intervention periods was high: 2 providers appeared in some of the preintervention data but not in the intervention data, as both left for other positions; all other providers were the same in the 2 time periods.
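The abstracted endpoint above reduces to a simple ratio: total common labs divided by total hospital-days across the cohort. A minimal sketch with toy records (the patient identifiers and counts below are illustrative only, not study data):

```python
# Toy abstraction: (patient_id, hospital_days, common_labs_ordered).
# Values are illustrative, not taken from the study.
records = [
    ("pt1", 3, 7),
    ("pt2", 5, 9),
    ("pt3", 2, 4),
]

total_labs = sum(labs for _, _, labs in records)
total_days = sum(days for _, days, _ in records)

# Primary endpoint: common labs per hospital-day across the cohort
labs_per_patient_day = total_labs / total_days
print(labs_per_patient_day)  # 20 labs / 10 hospital-days = 2.0
```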

The primary endpoint was chosen a priori as the total number of common labs ordered per hospital-day. We also identified a priori potential confounders, including age, sex, and primary discharge diagnosis, as captured by the all-patient refined diagnosis-related group (APR-DRG, hereafter DRG). DRG was chosen as a clinical risk-adjustment variable because no established method exists to model the effects of clinical conditions on the propensity to obtain labs. Many models used for risk adjustment in patient quality reporting use hospital mortality as the primary endpoint, not the need for laboratory testing.[30, 31] As our primary endpoint was common labs and not mortality, we chose DRG as the best single variable to model changes in the clinical case mix that might affect the number of common labs.

Secondary endpoints were also determined a priori. Out of desire to assess the patient safety implications of an intervention targeting decreased monitoring, we included hospital mortality, duration of hospitalization, and readmission as safety variables. Two secondary endpoints were obtained as possible additional efficacy endpoints to test the hypothesis that the intervention might be associated with a reduction in transfusion burden: red blood cell transfusion and transfusion volume. We also tracked the frequency with which providers ordered common labs as daily in the baseline and intervention periods, as this was the behavior targeted by the interventions.

Costs to the hospital to produce the lab studies were also considered as a secondary endpoint. Median hospital costs were obtained from the first-quarter 2013 Premier dataset, a national dataset of hospital costs (basic metabolic panel $14.69, complete blood count $11.68, comprehensive metabolic panel $18.66). Of note, the Premier data did not include cost data on what our institution calls a TPN 2, so the BMP cost was used as a substitute, given the overlap of the 2 tests' components and a desire to conservatively estimate the effect on cost to produce. Additionally, we factored in an estimate of hospitalist and analyst time, at $150/hour and $75/hour, respectively, to conduct the data abstraction and analysis and to manage the program. We did not formally factor in other costs, including electronic medical record acquisition costs.

Statistical Analyses

Descriptive statistics were used to describe the 2 cohorts. To test our primary hypothesis about the association between cohort membership and number of common labs per patient-day, a clustered multivariable linear regression model was constructed to adjust for the a priori identified potential confounders, including sex, age, and principal discharge diagnosis. Each DRG was entered as a categorical variable in the model. Clustering was employed to account for correlation of lab ordering behavior by a given hospitalist. Separate clustered multivariable models were constructed to test the association between cohort and secondary outcomes, including duration of hospitalization, readmission, mortality, transfusion frequency, and transfusion volume, using the same potential confounders. All P values were 2-sided, and a P value of less than 0.05 was considered statistically significant. All analyses were conducted with Stata 11.2 (StataCorp, College Station, TX). The study was reviewed by the Swedish Health Services Clinical Research Center and determined to be nonhuman subjects research.
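The clustered regression described above can be illustrated without the Stata implementation. The sketch below runs OLS with cluster-robust (sandwich) standard errors, clustering on a hypothetical provider identifier; the simulated data, seed, and effect size are assumptions for illustration only, not the study's data or model specification:

```python
import numpy as np

def ols_cluster(X, y, groups):
    """OLS with cluster-robust (sandwich) standard errors.
    X: (n, k) design matrix including intercept; groups: cluster labels."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    k = X.shape[1]
    meat = np.zeros((k, k))
    for g in np.unique(groups):
        mask = groups == g
        score = X[mask].T @ resid[mask]   # per-cluster score vector
        meat += np.outer(score, score)
    cov = XtX_inv @ meat @ XtX_inv        # sandwich covariance
    return beta, np.sqrt(np.diag(cov))

rng = np.random.default_rng(0)
n = 400
provider = rng.integers(0, 20, n)       # clustering unit: ordering hospitalist
intervention = rng.integers(0, 2, n)    # 0 = baseline period, 1 = intervention
age = rng.normal(65, 19, n)
# Simulated outcome: ~2.06 labs/day at baseline, assumed true effect -0.22
y = 2.06 - 0.22 * intervention + 0.001 * age + rng.normal(0, 1.3, n)
X = np.column_stack([np.ones(n), intervention, age])
beta, se = ols_cluster(X, y, provider)
print(beta[1], se[1])   # estimated intervention effect and clustered SE
```

In practice the DRG categories would enter as additional indicator columns of `X`; they are omitted here to keep the sketch short.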

RESULTS

Patient Characteristics

Patient characteristics in the before and after cohorts are shown in Table 1. Neither the proportion of male patients (44.9% vs 44.9%, P=1.0) nor the mean age (64.6 vs 64.8 years, P=0.5) differed significantly between the 2 cohorts. Interestingly, there was a significant change in the distribution of DRGs between the 2 cohorts, with each of the top 10 DRGs becoming more common in the intervention cohort. For example, the percentage of patients with sepsis or severe sepsis, DRGs 871 and 872, increased by 2.2 percentage points (8.2% vs 10.4%, P<0.01).
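The reported significance of the sepsis shift can be checked from the counts in Table 1 with a two-proportion z-test (our choice for the check; for a 2x2 table it is equivalent to the χ² test the paper used, since z² = χ²):

```python
from math import sqrt

# Severe sepsis (DRGs 871 and 872) counts from Table 1
x1, n1 = 641, 7832   # baseline cohort
x2, n2 = 599, 5759   # intervention cohort

p1, p2 = x1 / n1, x2 / n2
pooled = (x1 + x2) / (n1 + n2)
se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se
print(round(p1, 3), round(p2, 3), round(z, 2))  # 0.082 0.104 4.43
```

A z of about 4.4 is well beyond the 2.58 threshold for two-sided P<0.01, consistent with the table.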

Table 1. Patient Characteristics by Daily Lab Cohort

| Characteristic | Baseline, n=7,832 | Intervention, n=5,759 | P Value* |
|---|---|---|---|
| Age, y, mean (SD) | 64.6 (19.6) | 64.8 | 0.5 |
| Male, n (%) | 3,514 (44.9) | 2,585 (44.9) | 1.0 |
| Primary discharge diagnosis, DRG no., name, n (%)† | | | |
| 871 and 872, severe sepsis | 641 (8.2) | 599 (10.4) | <0.01 |
| 885, psychoses | 72 (0.9) | 141 (2.4) | <0.01 |
| 392, esophagitis, gastroenteritis, and miscellaneous intestinal disorders | 171 (2.2) | 225 (3.9) | <0.01 |
| 313, chest pain | 114 (1.5) | 123 (2.1) | <0.01 |
| 378, gastrointestinal bleed | 100 (1.3) | 117 (2.0) | <0.01 |
| 291, congestive heart failure and shock | 83 (1.1) | 101 (1.8) | <0.01 |
| 189, pulmonary edema and respiratory failure | 69 (0.9) | 112 (1.9) | <0.01 |
| 312, syncope and collapse | 82 (1.0) | 119 (2.1) | <0.01 |
| 64, intracranial hemorrhage or cerebral infarction | 49 (0.6) | 54 (0.9) | 0.04 |
| 603, cellulitis | 96 (1.2) | 94 (1.6) | 0.05 |

NOTE: Abbreviations: DRG, diagnosis-related group; SD, standard deviation. *P value determined by χ² or Student t test. †Only the top 10 DRGs are listed.

Primary Endpoint

In the unadjusted comparison, 3 of the 4 common labs showed a significant decrease in the intervention cohort from baseline (Table 2). For example, the mean number of CBCs ordered per patient-day decreased by 0.15 (1.06 vs 0.91, P<0.01). The total number of common labs ordered per patient-day decreased by 0.30 (2.06 vs 1.76, P<0.01) in the unadjusted analysis (Figure 1 and Table 2). Part of our hypothesis was that decreasing the number of labs ordered as daily, in an open-ended manner, would decrease the number of common labs obtained per day. We found that the number of labs ordered as daily decreased by 0.71 per patient-day (0.87±2.90 vs 0.16±1.01, P<0.01), an 81.6% decrease from the preintervention time period.
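The unadjusted reductions follow directly from the reported means; a quick arithmetic check:

```python
# Means reported in Table 2 and the text above
total_baseline, total_intervention = 2.06, 1.76   # all common labs, per patient-day
daily_baseline, daily_intervention = 0.87, 0.16   # labs ordered "as daily"

total_drop = total_baseline - total_intervention
daily_drop = daily_baseline - daily_intervention
pct_daily = 100 * daily_drop / daily_baseline

print(round(total_drop, 2))   # 0.3 labs per patient-day
print(round(daily_drop, 2))   # 0.71 labs per patient-day
print(round(pct_daily, 1))    # 81.6 (% decrease in daily-ordered labs)
```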

Table 2. Patient Outcomes by Daily Lab Cohort

| Outcome | Baseline | Intervention | P Value* |
|---|---|---|---|
| Complete blood count, per patient-day, mean (SD) | 1.06 (0.76) | 0.91 (0.75) | <0.01 |
| Basic metabolic panel, per patient-day, mean (SD) | 0.68 (0.71) | 0.55 (0.60) | <0.01 |
| Nutrition panel, per patient-day, mean (SD)† | 0.06 (0.24) | 0.07 (0.32) | <0.01 |
| Comprehensive metabolic panel, per patient-day, mean (SD) | 0.27 (0.49) | 0.23 (0.46) | <0.01 |
| Total no. of basic labs ordered per patient-day, mean (SD) | 2.06 (1.40) | 1.76 (1.37) | <0.01 |
| Transfused, n (%) | 414 (5.3) | 268 (4.7) | 0.1 |
| Transfused volume, mL, mean (SD) | 847.3 (644.3) | 744.9 (472.0) | 0.02 |
| Length of stay, d, mean (SD) | 3.79 (4.58) | 3.81 (4.50) | 0.7 |
| Readmitted, n (%) | 1,049 (13.3) | 733 (12.7) | 0.3 |
| Died, n (%) | 173 (2.2) | 104 (1.8) | 0.1 |

NOTE: Abbreviations: SD, standard deviation. *P value determined by χ² or Student t test. †Basic metabolic panel plus magnesium and phosphate.
Figure 1
Mean number of total basic labs ordered per day over the 10 months of the preintervention period (October 2012 to July 2013) and the 7 months of the intervention period (September 2013 to March 2014). The vertical line denotes the excluded wash-in month (August 2013), when the intervention began.

In our multivariable regression model, after adjusting for sex, age, and the primary reason for admission as captured by DRG, the number of common labs ordered per day was reduced by 0.22 (95% CI, −0.34 to −0.11; P<0.01). This represents a 10.7% reduction in common labs ordered per patient-day.

Secondary Endpoints

Table 2 shows secondary outcomes of the study. Patient safety endpoints were not changed in unadjusted analyses. For example, the hospital length of stay in days was similar in the baseline and intervention cohorts (3.78±4.58 vs 3.81±4.50, P=0.7). There was a nonsignificant reduction in the hospital mortality rate during the intervention period by 0.4% (2.2% vs 1.8%, P=0.1). No significant differences were found when the multivariable model was rerun for each of the 3 secondary endpoints individually: readmissions, mortality, and length of stay.

Two secondary efficacy endpoints were also evaluated. The percentage of patients receiving transfusions did not decrease in either the unadjusted or adjusted analysis. However, the volume of blood transfused per patient who received a transfusion decreased by 91.9 mL in the bivariate analysis (836.8±621.4 mL vs 744.9±472.0 mL; P=0.03) (Table 2). The decrease, however, was not significant in the multivariable model (−127.2 mL; 95% CI, −257.9 to 3.6; P=0.06).

Cost Data

Based on the Premier estimate of the cost to the hospital to perform the common lab tests, the intervention likely decreased direct costs by $16.19 per patient (95% CI, $12.95 to $19.43). The cost saving was offset by the expense of the intervention, estimated at $8,000 and driven by hospitalist and analyst time. Based on the patient volume in our health system, and factoring in the cost of implementation, we estimate that this intervention resulted in annualized savings of $151,682 (95% CI, $119,746 to $187,618).
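The annualized figure can be approximated from the per-patient saving and cohort volume. The annual patient count below is our assumption (the 7-month intervention cohort scaled to 12 months), so the result only approximates, rather than reproduces, the reported $151,682:

```python
per_patient_saving = 16.19        # $ per patient, from Premier cost estimates
intervention_cost = 8000.0        # $ one-time hospitalist + analyst time
cohort_patients, cohort_months = 5759, 7

# Assumption: annual volume = intervention-cohort volume scaled to 12 months
annual_patients = cohort_patients / cohort_months * 12
net_annual_saving = per_patient_saving * annual_patients - intervention_cost
print(round(net_annual_saving))   # 151837, close to the reported $151,682
```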

DISCUSSION

Ordering common labs daily is a routine practice among providers at many institutions. In fact, at our institution, prior to the intervention, 42% of all common labs were ordered as daily, meaning they were obtained each day without regard to the previous value or the patient's clinical condition. The practice is one of convenience or habit and is often not clinically indicated.[5, 32]

We observed a significant reduction in the number of common labs ordered as daily and, more importantly, in the total number of common labs in the intervention period. The rapid change in provider behavior is notable and likely due to several factors. First, there was general belief among the hospitalists in the merits of the project. Second, there may have been an aversion to the display of lower performance relative to peers in the monthly emails. Third, and perhaps most importantly, our hospitalist team had worked together for many years on projects like this, creating a culture of QI and a willingness to change practice patterns in response to data.[33]

Concerns about decreasing waste and increasing the value of healthcare abound, particularly in the United States.[1] Decreasing the cost to produce equivalent or improved health outcomes for a given episode of care has been proposed as a way to improve value.[34] This intervention produced a modest waste reduction, the benefits of which are readily apparent in a DRG-based reimbursement model, where the hospital realizes any saving in the cost of producing a hospital stay, as well as in a total-cost-of-care environment such as an Accountable Care Organization.

The previous work in the field of lab reduction has all been performed at university‐affiliated academic institutions. We demonstrated that the QI tactics described in the literature can be successfully employed in a community‐based hospitalist practice. This has broad applicability to increasing the value of healthcare and could serve as a model for future community‐based hospitalist QI projects.

The study has several limitations. First, the length of follow-up is only 7 months, and although there was rapid and effective adoption of the intervention, provider behavior may regress to previous practice patterns over time. Second, the simple before-after nature of our trial design raises the possibility that environmental influences exist and that changes in ordering behavior may have been the result of something other than the intervention. Most notably, the Choosing Wisely recommendations for hospitalists were published in September 2013, coinciding with our intervention period.[22] The reduction in the number of labs ordered may have been partly a result of these recommendations. Third, the 2 cohorts covered different times of the year and, based on the distribution of DRGs, likely had a different composition of diagnoses being treated. To address this we adjusted for DRG, but some residual confounding may remain, as some diagnoses may be managed with more laboratory tests than others in a way not fully captured by our model. Fourth, the intervention was made possible by the substantial and ongoing investments that our health system has made in our electronic medical record and data analytics capability. The variability of these resources across institutions limits generalizability. Fifth, although we used the QI tools described, we did not create a formal process map or utilize other Lean or Six Sigma tools. As the healthcare industry continues on its journey to high reliability, these tools will hopefully become more widespread. We demonstrated that even with these simple tactics, significant progress can be made.

Finally, there exists a concern that decreasing regular laboratory monitoring might be associated with undetected worsening in the patient's clinical status. We did not observe any significant adverse effects on coarse measures of clinical performance, including length of stay, readmission rate, or mortality. However, we did not collect data on all clinical parameters, and it is possible that there could have been an undetected effect on incident renal failure, hemodialysis, or intensive care unit transfer. Other studies of this type of intervention have evaluated some of these possible adverse outcomes and have not noted an association.[12, 15, 18, 20, 22] Future studies should evaluate harms associated with implementation of Choosing Wisely and other interventions targeted at waste reduction. Future work is also needed to disseminate more formal and rigorous QI tools and methodologies.

CONCLUSION

We implemented a multifaceted QI intervention including provider education, transparent display of data, and audit and feedback that was associated with a significant reduction in the number of common labs ordered in a large community‐based hospitalist group, without evidence of harm. Further study is needed to understand how hospitalist groups can optimally decrease waste in healthcare.

Disclosures

This work was performed at the Swedish Health System, Seattle, Washington. Dr. Corson served as primary author, designed the study protocol, obtained the data, analyzed all the data and wrote the manuscript and its revisions, and approved the final version of the manuscript. He attests that no undisclosed authors contributed to the manuscript. Dr. Fan designed the study protocol, reviewed the manuscript, and approved the final version of the manuscript. Mr. White reviewed the study protocol, obtained the study data, reviewed the manuscript, and approved the final version of the manuscript. Sean D. Sullivan, PhD, designed the study protocol, obtained study data, reviewed the manuscript, and approved the final version of the manuscript. Dr. Asakura designed the study protocol, reviewed the manuscript, and approved the final version of the manuscript. Dr. Myint reviewed the study protocol and data, reviewed the manuscript, and approved the final version of the manuscript. Dr. Dale designed the study protocol, analyzed the data, reviewed the manuscript, and approved the final version of the manuscript. The authors report no conflicts of interest.

References
  1. Berwick D. Eliminating "waste" in health care. JAMA. 2012;307(14):1513-1516.
  2. Squires DA. The U.S. health system in perspective: a comparison of twelve industrialized nations. Issue Brief (Commonw Fund). 2011;16:1-14.
  3. DeKay ML, Asch DA. Is the defensive use of diagnostic tests good for patients, or bad? Med Decis Making. 1998;18(1):19-28.
  4. Epstein AM, McNeil BJ. Physician characteristics and organizational factors influencing use of ambulatory tests. Med Decis Making. 1985;5:401-415.
  5. Salinas M, Lopez-Garrigos M, Uris J; Pilot Group of the Appropriate Utilization of Laboratory Tests (REDCONLAB) Working Group. Differences in laboratory requesting patterns in emergency department in Spain. Ann Clin Biochem. 2013;50:353-359.
  6. Wong P, Intragumtornchai T. Hospital-acquired anemia. J Med Assoc Thai. 2006;89(1):63-67.
  7. Thavendiranathan P, Bagai A, Ebidia A, Detsky AS, Choudhry NK. Do blood tests cause anemia in hospitalized patients? The effect of diagnostic phlebotomy on hemoglobin and hematocrit levels. J Gen Intern Med. 2005;20(6):520-524.
  8. Smoller BR, Kruskall MS. Phlebotomy for diagnostic laboratory tests in adults. Pattern of use and effect on transfusion requirements. N Engl J Med. 1986;314(19):1233-1235.
  9. Salisbury AC, Reid KJ, Alexander KP, et al. Diagnostic blood loss from phlebotomy and hospital-acquired anemia during acute myocardial infarction. Arch Intern Med. 2011;171(18):1646-1653.
  10. Koch CG, Li L, Sun Z, et al. Hospital-acquired anemia: prevalence, outcomes, and healthcare implications. J Hosp Med. 2013;8(9):506-512.
  11. Howanitz PJ, Cembrowski GS, Bachner P. Laboratory phlebotomy. College of American Pathologists Q-Probe study of patient satisfaction and complications in 23,783 patients. Arch Pathol Lab Med. 1991;115:867-872.
  12. Attali M, Barel Y, Somin M, et al. A cost-effective method for reducing the volume of laboratory tests in a university-associated teaching hospital. Mt Sinai J Med. 2006;73(5):787-794.
  13. Bareford D, Hayling A. Inappropriate use of laboratory services: long term combined approach to modify request patterns. BMJ. 1990;301(6764):1305-1307.
  14. Bunting PS, van Walraven C. Effect of a controlled feedback intervention on laboratory test ordering by community physicians. Clin Chem. 2004;50(2):321-326.
  15. Calderon-Margalit R, Mor-Yosef S, Mayer M, Adler B, Shapira SC. An administrative intervention to improve the utilization of laboratory tests within a university hospital. Int J Qual Health Care. 2005;17(3):243-248.
  16. Stuebing EA, Miner TJ. Surgical vampires and rising health care expenditure. Arch Surg. 2011;146(5):524-527.
  17. Fowkes FG, Hall R, Jones JH, et al. Trial of strategy for reducing the use of laboratory tests. Br Med J (Clin Res Ed). 1986;292(6524):883-885.
  18. Kroenke K, Hanley JF, Copley JB, et al. Improving house staff ordering of three common laboratory tests. Reductions in test ordering need not result in underutilization. Med Care. 1987;25(10):928-935.
  19. May TA, Clancy M, Critchfield J, et al. Reducing unnecessary inpatient laboratory testing in a teaching hospital. Am J Clin Pathol. 2006;126(2):200-206.
  20. Neilson EG, Johnson KB, Rosenbloom ST, et al. Improving patient care: the impact of peer management on test-ordering behavior. Ann Intern Med. 2004;141(3):196-204.
  21. Novich M, Gillis L, Tauber AI. The laboratory test justified: an effective means to reduce routine laboratory testing. Am J Clin Pathol. 1985;86(6):756-759.
  22. Bulger J, Nickel W, Messler J, et al. Choosing wisely in adult hospital medicine: five opportunities for improved healthcare value. J Hosp Med. 2013;8(9):486-492.
  23. Dale C. Quality improvement in the intensive care unit. In: Scales DC, Rubenfeld GD, eds. The Organization of Critical Care. New York, NY: Humana Press; 2014:279.
  24. Curtis JR, Cook DJ, Wall RJ, et al. Intensive care unit quality improvement: a "how-to" guide for the interdisciplinary team. Crit Care Med. 2006;34:211-218.
  25. Pronovost PJ. Navigating adaptive challenges in quality improvement. BMJ Qual Saf. 2011;20(7):560-563.
  26. Scales DC, Dainty K, Hales B, et al. A multifaceted intervention for quality improvement in a network of intensive care units: a cluster randomized trial. JAMA. 2011;305:363-372.
  27. O'Neill SM. How do quality improvement interventions succeed? Archetypes of success and failure. Available at: http://www.rand.org/pubs/rgs_dissertations/RGSD282.html. Published 2011.
  28. Berwanger O, Guimarães HP, Laranjeira LN, et al. Effect of a multifaceted intervention on use of evidence-based therapies in patients with acute coronary syndromes in Brazil: the BRIDGE-ACS randomized trial. JAMA. 2012;307:2041-2049.
  29. Ivers N, Jamtvedt G, Flottorp S, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;6:CD000259.
  30. Glance LG, Osler TM, Mukamel DB, Dick AW. Impact of the present-on-admission indicator on hospital quality measurement: experience with the Agency for Healthcare Research and Quality (AHRQ) Inpatient Quality Indicators. Med Care. 2008;46:112-119.
  31. Pine M, Jordan HS, Elixhauser A, et al. Enhancement of claims data to improve risk adjustment of hospital mortality. JAMA. 2007;297:71-76.
  32. Salinas M, López-Garrigós M, Tormo C, Uris J. Primary care use of laboratory tests in Spain: measurement through appropriateness indicators. Clin Lab. 2014;60(3):483-490.
  33. Curry LA, Spatz E, Cherlin E, et al. What distinguishes top-performing hospitals in acute myocardial infarction mortality rates? A qualitative study. Ann Intern Med. 2011;154(6):384-390.
  34. Porter ME. What is value in health care? N Engl J Med. 2010;363(26):2477-2481.
Issue
Journal of Hospital Medicine - 10(6)
Page Number
390-395
Display Headline
A multifaceted hospitalist quality improvement intervention: Decreased frequency of common labs
Article Source
© 2015 Society of Hospital Medicine
Correspondence Location
Address for correspondence and reprint requests: Adam Corson, MD, Swedish Medical Center, 747 Broadway, Seattle, WA 98122; Telephone: 206‐215‐2520; Fax: 206‐215‐6364; E‐mail: [email protected]

Group traces clonal evolution of B-ALL


Micrograph showing B-ALL

In tracing the clonal evolution of B-cell acute lymphoblastic leukemia (B-ALL) from diagnosis to relapse, researchers discovered that clonal diversity is comparable in both states.

They also identified mutations associated with B-ALL relapse and found that clonal survival is not dependent upon mutation burden.

In most of the cases the researchers analyzed, a single, minor clone survived therapy, acquired additional mutations, and drove disease relapse.

Jinghui Zhang, PhD, of St. Jude Children’s Research Hospital in Memphis, Tennessee, and her colleagues recounted these findings in Nature Communications.

The researchers performed deep, whole-exome sequencing on cell samples from 20 young patients (ages 2 to 19) with relapsed B-ALL. The samples were collected at diagnosis, remission, and relapse.

“[W]e wanted to find out the underlying mechanism leading to cancer relapse,” Dr Zhang said. “When the cancer recurs, is it a completely different cancer, or is it an extension, or change, arising from pre-existing cancer?”

The researchers were able to detect the mutations in both the “rising” and “falling” clones—those that survive therapy and those that don’t—at the different disease stages and pinpoint the mutations that drove the leukemia.

Seven genes were highly likely to be mutated in relapsed disease—NT5C2, CREBBP, WHSC1, TP53, USH2A, NRAS, and IKZF1.

The researchers also characterized how diverse those mutations were at diagnosis and relapse. They found that B-ALL cells were mutating just as wildly and diversely in one phase of disease as in the other.

“This finding was interesting, because most people think that the clone that has the most mutations is more likely to survive therapy and evolve, but that doesn’t seem to be the case,” Dr Zhang said.

In most cases, relapse was driven by a minor subclone that had survived therapy and was present at an extremely low level. The researchers said this finding suggests a need to change the way we assess patients after treatment to determine the likelihood of relapse.

“When we are analyzing for the level of minimal residual disease in monitoring remission in patients, we should not only pay attention to the mutations in the predominant clone,” Dr Zhang said. “We should also be tracking what kinds of mutations exist in the minor subclones.”

Publications
Topics
Article Type
Display Headline
Group traces clonal evolution of B-ALL

Malaria interventions prove insufficient

Article Type
Changed
Display Headline
Malaria interventions prove insufficient

Malaria-carrying mosquito

Photo by James Gathany

Current malaria interventions are failing to control the disease in high-transmission areas of sub-Saharan Africa, according to research published in The American Journal of Tropical Medicine & Hygiene.

A 2-year surveillance study revealed that the incidence of malaria in rural Uganda is high and continues to rise.

Researchers said this study offers the most accurate, comprehensive, and up-to-date measurement of the malaria disease burden in Uganda.

“Our findings suggest that current efforts at controlling malaria may not be as effective as previously believed in high-transmission areas, where the disease is the biggest threat,” said Grant Dorsey, MD, PhD, of the University of California, San Francisco.

“It’s important to tell the less happy story that we have not yet seen advances in more rural areas, including at least 2 sites in Uganda, where transmission has been historically high.”

To reach an accurate assessment of the malaria incidence in Uganda, Dr Dorsey and his colleagues gathered comprehensive surveillance data over 24 months, from August 2011 to September 2013.

Ultimately, the team evaluated 703 children between the ages of 6 months and 10 years. The children were randomly selected from 3 areas of Uganda with differing malaria characteristics.

The researchers found the incidence of malaria infection decreased in the relatively low-transmission, peri-urban Walukuba area during the study period—from an average of 0.51 to 0.31 episodes of malaria per person per year (P=0.001).

However, the incidence increased in the 2 rural areas. Episodes of malaria per person per year rose from an average of 0.97 to 1.93 (P<0.001) in the moderate-transmission area of Kihihi and rose from an average of 2.33 to 3.30 (P<0.001) in Nagongera, a high-transmission rural area near the southeastern border with Kenya.

Throughout the study period, families were provided with bednets and had access to 24-hour medical care free of charge at a designated study clinic for episodes of fever. The children were also routinely tested for malaria every 3 months, whether they had symptoms or not.

In addition, the researchers collected mosquito specimens monthly from light traps that were strategically placed in each house to estimate the percentages of malaria-carrying mosquitoes in the study areas.

Healthcare workers provided over 2500 treatments for malaria over the course of the study.

“Children in our study experienced a significantly high rate of infection, and that rate increased in the 2 rural areas,” Dr Dorsey said. “Based on prior data, our higher transmission sites are very likely to be representative of most of Uganda and perhaps of most other rural areas in sub-Saharan Africa as well.”

The researchers said these results suggest a need to further scale up campaigns to distribute insecticide-treated bednets and spray homes with insecticides. And high-transmission countries like Uganda may also require new interventions, such as using malaria drugs for prevention and controlling mosquito larvae, in order to match the malaria reduction successes seen elsewhere in the world.

In a related editorial, Steven Meshnick, MD, PhD, of the University of North Carolina, Chapel Hill, wrote, “The real take-home message of this study may be that malaria control in Africa requires sustained and consistent efforts over much more than 2 years.”

Publications
Topics

Drug appears feasible for hard-to-treat myelofibrosis

Article Type
Changed
Display Headline
Drug appears feasible for hard-to-treat myelofibrosis

Micrograph showing myelofibrosis

Results of a phase 2 study suggest the JAK2/FLT3 inhibitor pacritinib is a feasible treatment option for patients with myelofibrosis who cannot receive or do not respond well to standard therapies.

The drug reduced patients’ spleen volume and improved disease-related symptoms without causing clinically significant myelosuppression.

And pacritinib was considered well-tolerated, even in patients with disease-related anemia and thrombocytopenia.

“Currently, myelofibrosis patients with anemia and thrombocytopenia have limited treatment options for splenomegaly and constitutional symptoms, and these data show that pacritinib has potential to help patients that are sub-optimally managed on currently available treatments,” said study author Rami S. Komrokji, MD, of the Moffitt Cancer Center in Tampa, Florida.

Dr Komrokji and his colleagues reported these results in Blood. The study was sponsored by CTI Biopharma, the company developing pacritinib.

The researchers evaluated the safety and efficacy of pacritinib in myelofibrosis patients who had clinical splenomegaly that was poorly controlled with standard therapies or who were newly diagnosed with intermediate- or high-risk disease and not considered candidates for standard therapy.

Patients were allowed to enroll irrespective of their degree of thrombocytopenia or anemia. At study entry, 40% of patients had hemoglobin levels below 10 g/dL, and 43% had platelet counts less than 100,000/µL.

A total of 35 patients were enrolled and treated with pacritinib administered at 400 mg once daily in 28-day cycles. The patients’ median age was 69 years.

The primary endpoint was the spleen response rate, defined as the proportion of patients achieving a 35% or greater reduction in spleen volume from baseline up to week 24, as measured by MRI or CT.
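
The endpoint arithmetic can be sketched in code (a minimal illustration only; the function names and the example volumes are hypothetical, not taken from the study):

```python
def spleen_response(baseline_ml, week24_ml, threshold=0.35):
    """True if spleen volume fell from baseline by at least the threshold fraction (35%)."""
    reduction = (baseline_ml - week24_ml) / baseline_ml
    return reduction >= threshold

def response_rate(volume_pairs, threshold=0.35):
    """Proportion of evaluable patients meeting the response threshold."""
    responders = sum(spleen_response(b, w, threshold) for b, w in volume_pairs)
    return responders / len(volume_pairs)

# A 40% reduction (1000 mL -> 600 mL) counts as a response;
# a 30% reduction (1000 mL -> 700 mL) does not.
```

Under this definition, the reported 30.8% response rate corresponds to 8 responders among 26 evaluable patients.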

A secondary endpoint was the proportion of patients with a 50% or greater reduction in spleen size as determined by physical exam.

The researchers also assessed the proportion of patients with a 50% or greater reduction in total symptom score, which included symptoms of abdominal pain, bone pain, early satiety, fatigue, inactivity, night sweats, and pruritus, from baseline up to week 24.

Results showed that 30.8% of the evaluable patients (8/26) had a 35% or greater reduction in spleen volume by CT or MRI scan, with 42% of patients reaching a 35% or greater reduction by the end of treatment.

In addition, 42.4% of evaluable patients (14/33) achieved a 50% or greater reduction in spleen size by physical exam. And 48.4% of evaluable patients (15/31) achieved a 50% or greater reduction in total symptom score.

The most common treatment-emergent adverse events were grade 1-2 diarrhea (69%) and nausea (49%). Anemia and thrombocytopenia adverse events were reported in 12 (34.3%) and 8 (22.9%) patients, respectively.

Nine patients (26%) stopped taking pacritinib due to adverse events, but 3 of the events were deemed unrelated to the drug.

There were 5 deaths, 3 of which were due to serious adverse events. Of those, 1 (subdural hematoma) was considered possibly related to pacritinib treatment.

Publications
Topics

Experts urge review of global sepsis guidelines

Article Type
Changed
Display Headline
Experts urge review of global sepsis guidelines

Red blood cells

Investigators are calling for a global review of guidelines used to diagnose sepsis, after a study showed that 1 in 8 patients with infections severe enough to necessitate admission to an intensive care unit did not meet current diagnostic criteria.

The researchers identified 109,663 patients with possible sepsis who had infection and organ failure. However, more than 13,000 patients from this group did not meet the classic criteria used to diagnose sepsis.

“To be diagnosed with sepsis, a patient must be thought to have an infection and exhibit at least 2 of the following criteria: abnormal body temperature or white blood cell count, high heart rate, high respiratory rate, or low carbon dioxide level in the blood,” said Maija Kaukonen, MD, PhD, of Monash University in Melbourne, Victoria, Australia.

“But our study found that many patients—for example, the elderly or those on medications that affect heart rate or the immune system—may not meet the classic criteria to diagnose sepsis, despite having severe infections and organ failure. If we continue to use these criteria, we may miss the opportunity to identify many critically ill patients with sepsis.”

The study was published in NEJM.

The investigators studied 1,171,797 patients from 172 intensive care units in New Zealand and Australia.

The team identified patients with infection and organ failure and categorized them according to whether they met 2 or more systemic inflammatory response syndrome (SIRS) criteria (SIRS-positive severe sepsis) or fewer than 2 SIRS criteria (SIRS-negative severe sepsis).
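
The SIRS tally underlying this classification can be sketched as follows (a simplified illustration using the standard adult thresholds; it omits the alternative PaCO2 and immature-band criteria and is not the investigators' actual algorithm):

```python
def sirs_criteria_met(temp_c, heart_rate, resp_rate, wbc_per_ul):
    """Count how many of the 4 classic SIRS criteria a patient meets."""
    criteria = [
        temp_c > 38.0 or temp_c < 36.0,              # abnormal body temperature
        heart_rate > 90,                             # tachycardia
        resp_rate > 20,                              # tachypnea
        wbc_per_ul > 12_000 or wbc_per_ul < 4_000,   # abnormal white cell count
    ]
    return sum(criteria)

def classify(n_criteria):
    """2 or more criteria -> SIRS-positive; otherwise SIRS-negative."""
    return "SIRS-positive" if n_criteria >= 2 else "SIRS-negative"
```

A febrile, tachycardic, tachypneic patient with a normal white count meets 3 criteria and is SIRS-positive; a patient with severe infection but none of the 4 signs would be SIRS-negative despite organ failure, which is the gap the study highlights.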

Of the 109,663 patients who had infection and organ failure, 96,385 (87.9%) had SIRS-positive severe sepsis and 13,278 (12.1%) had SIRS-negative severe sepsis.

Over 14 years, the 2 patient groups had similar characteristics and changes in mortality. Mortality decreased from 36.1% (829/2296) to 18.3% (2037/11,119) in the SIRS-positive group (P<0.001) and from 27.7% (100/361) to 9.3% (122/1315) in the SIRS-negative group (P<0.001).

This similarity between the groups remained after the researchers adjusted their analysis for baseline characteristics. In both groups, the odds ratio was 0.96 (P=0.12).

The investigators also noted that, in the adjusted analysis, mortality increased linearly with each additional SIRS criterion (P<0.001), without any transitional increase in risk at a threshold of 2 SIRS criteria.

Rinaldo Bellomo, MD, PhD, also of Monash University, conceived this study. He said that although the classic definition of sepsis has been widely used throughout the world, he believed that, after 20 years, it was time for it to be reviewed.

“There are clear signs from this study that if we continue to use these criteria, we may fail to identify septic patients and, therefore, potentially delay their treatment,” he said.

Publications
Topics

red blood cells

Red blood cells

Investigators are calling for a global review of guidelines used to diagnose sepsis, after a study showed that 1 in 8 patients with infections severe enough to necessitate admission to an intensive care unit did not meet current diagnostic criteria.

The researchers identified 109,663 patients with possible sepsis who had infection and organ failure. However, more than 13,000 patients from this group did not meet the classic criteria used to diagnose sepsis.

“To be diagnosed with sepsis, a patient must be thought to have an infection and exhibit at least 2 of the following criteria: abnormal body temperature or white blood cell count, high heart rate, high respiratory rate, or low carbon dioxide level in the blood,” said Maija Kaukonen, MD, PhD, of Monash University in Melbourne, Victoria, Australia.

“But our study found that many patients—for example, the elderly or those on medications that affect heart rate or the immune system—may not meet the classic criteria to diagnose sepsis, despite having severe infections and organ failure. If we continue to use these criteria, we may miss the opportunity to identify many critically ill patients with sepsis.”

The study was published in NEJM.

The investigators studied 1,171,797 patients from 172 intensive care units in New Zealand and Australia.

The team identified patients with infection and organ failure and categorized them according to whether they had signs meeting 2 or more systemic inflammatory response syndrome (SIRS) criteria (SIRS-positive severe sepsis) or less than 2 SIRS criteria (SIRS-negative severe sepsis).

Of the 109,663 patients who had infection and organ failure, 96,385 (87.9%) had SIRS-positive severe sepsis and 13,278 (12.1%) had SIRS-negative severe sepsis.

Over 14 years, the 2 patient groups had similar characteristics and changes in mortality. Mortality decreased from 36.1% (829/2296) to 18.3% (2037/11,119) in the SIRS-positive group (P<0.001) and from 27.7% (100/361) to 9.3% (122/1315) in the SIRS-negative group (P<0.001).

This similarity between the groups remained after the researchers adjusted their analysis for baseline characteristics. In both groups, the odds ratio was 0.96 (P=0.12).

The investigators also noted that, in the adjusted analysis, mortality increased linearly with each additional SIRS criterion (P<0.001), without any transitional increase in risk at a threshold of 2 SIRS criteria.

Rinaldo Bellomo, MD, PhD, also of Monash University, conceived this study. He said that although the classic definition of sepsis has been widely used throughout the world, he believed that, after 20 years, it was time for it to be reviewed.

“There are clear signs from this study that if we continue to use these criteria, we may fail to identify septic patients and, therefore, potentially delay their treatment,” he said.

red blood cells

Red blood cells

Investigators are calling for a global review of guidelines used to diagnose sepsis, after a study showed that 1 in 8 patients with infections severe enough to necessitate admission to an intensive care unit did not meet current diagnostic criteria.

The researchers identified 109,663 patients with possible sepsis who had infection and organ failure. However, more than 13,000 patients from this group did not meet the classic criteria used to diagnose sepsis.

“To be diagnosed with sepsis, a patient must be thought to have an infection and exhibit at least 2 of the following criteria: abnormal body temperature or white blood cell count, high heart rate, high respiratory rate, or low carbon dioxide level in the blood,” said Maija Kaukonen, MD, PhD, of Monash University in Melbourne, Victoria, Australia.

“But our study found that many patients—for example, the elderly or those on medications that affect heart rate or the immune system—may not meet the classic criteria to diagnose sepsis, despite having severe infections and organ failure. If we continue to use these criteria, we may miss the opportunity to identify many critically ill patients with sepsis.”

The study was published in NEJM.

The investigators studied 1,171,797 patients from 172 intensive care units in New Zealand and Australia.

The team identified patients with infection and organ failure and categorized them according to whether they had signs meeting 2 or more systemic inflammatory response syndrome (SIRS) criteria (SIRS-positive severe sepsis) or less than 2 SIRS criteria (SIRS-negative severe sepsis).

Of the 109,663 patients who had infection and organ failure, 96,385 (87.9%) had SIRS-positive severe sepsis and 13,278 (12.1%) had SIRS-negative severe sepsis.

Over the 14-year study period, the 2 patient groups had similar characteristics and similar declines in mortality. Mortality decreased from 36.1% (829/2296) to 18.3% (2037/11,119) in the SIRS-positive group (P<0.001) and from 27.7% (100/361) to 9.3% (122/1315) in the SIRS-negative group (P<0.001).
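As a quick arithmetic check, the mortality rates above can be recomputed from the raw counts reported in the study (a sketch; only the numbers given in the text are used):

```python
# Recompute the reported mortality rates from the raw counts in the text.
def mortality_rate(deaths, patients):
    """Return mortality as a percentage, rounded to one decimal place."""
    return round(100 * deaths / patients, 1)

# SIRS-positive severe sepsis: 36.1% (829/2296) falling to 18.3% (2037/11,119)
print(mortality_rate(829, 2296))     # 36.1
print(mortality_rate(2037, 11119))   # 18.3

# SIRS-negative severe sepsis: 27.7% (100/361) falling to 9.3% (122/1315)
print(mortality_rate(100, 361))      # 27.7
print(mortality_rate(122, 1315))     # 9.3
```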

This similarity between the groups remained after the researchers adjusted their analysis for baseline characteristics. In both groups, the odds ratio for mortality per year was 0.96 (P=0.12 for the between-group comparison).

The investigators also noted that, in the adjusted analysis, mortality increased linearly with each additional SIRS criterion (P<0.001), without any transitional increase in risk at a threshold of 2 SIRS criteria.

Rinaldo Bellomo, MD, PhD, also of Monash University, conceived this study. He said that although the classic definition of sepsis has been widely used throughout the world, he believed that, after 20 years, it was time for it to be reviewed.

“There are clear signs from this study that if we continue to use these criteria, we may fail to identify septic patients and, therefore, potentially delay their treatment,” he said.

Display Headline
Experts urge review of global sepsis guidelines

Transradial PCI outperforms transfemoral for acute coronary syndromes

MATRIX results may boost U.S. transradial access
Article Type
Changed
Display Headline
Transradial PCI outperforms transfemoral for acute coronary syndromes

SAN DIEGO – The unshakable grip that transfemoral access has held on coronary artery catheterization for the U.S. practice of interventional cardiology may finally loosen with results from an 8,000-patient, multinational controlled trial.

MATRIX showed that transradial access for coronary catheterization of patients with acute coronary syndrome (ACS) produced significantly fewer access-site bleeding events and significantly improved patient survival, compared with transfemoral access.

“Our results, in conjunction with the updated meta-analysis, suggest that the radial approach should become the default access for patients with acute coronary syndrome undergoing invasive management,” Dr. Marco Valgimigli said at the annual meeting of the American College of Cardiology. “Access site does matter, and a reduction in access-site bleeding complications seems to translate into a mortality benefit,” said Dr. Valgimigli, an interventional cardiologist at Erasmus University Medical Center in Rotterdam, the Netherlands.

Dr. Marco Valgimigli

The MATRIX (Minimizing Adverse Haemorrhagic Events by Transradial Access Site and Systemic Implementation of Angiox) trial enrolled 8,404 ACS patients at 78 sites in four European countries: Italy, Spain, Sweden, and the Netherlands. The study randomized patients undergoing percutaneous coronary intervention (PCI) to catheterization via either the patient’s radial or femoral artery. After 30 days, the combined rate of death, myocardial infarction, stroke, and major bleeding was reduced by an absolute rate of 1.9% among the patients treated with transradial access, a 17% relative risk reduction that was statistically significant for one of the study’s two primary endpoints.
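The absolute and relative reductions reported above are related by simple arithmetic: the relative risk reduction is the absolute reduction divided by the comparator event rate. The sketch below back-calculates an implied comparator rate from the two reported figures; the ~11.2% rate is an inference made for illustration, not a number stated in the article:

```python
# Relative risk reduction (RRR) from the event rates in each arm.
def rrr(control_rate, treated_rate):
    return (control_rate - treated_rate) / control_rate

# The article reports a 1.9-percentage-point absolute reduction and a
# 17% relative reduction for the combined endpoint. Together, those two
# figures imply a comparator (transfemoral) event rate of roughly
# 1.9 / 0.17 -- an inference for illustration, not a number in the text.
implied_control = 1.9 / 0.17          # ~11.2%
implied_treated = implied_control - 1.9

print(round(implied_control, 1))                        # 11.2
print(round(rrr(implied_control, implied_treated), 2))  # 0.17
```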

This outcome difference appeared to be driven primarily by significant reductions in major bleeds, specifically major access-site bleeds, which led to a statistically significant 0.6% absolute reduction in all-cause death, a 28% relative risk reduction in 30-day mortality tied to transradial access, Dr. Valgimigli said.

“I think this will be the study that helps change guidelines, to make radial artery access the default approach,” commented Dr. Sanjit S. Jolly, an interventional cardiologist at McMaster University in Hamilton, Ont.

“The United States is very behind in the use of transradial access; it’s used in about 20% of coronary PCIs,” noted Dr. Cindy L. Grines, an interventional cardiologist at the Detroit Medical Center. “We need to make a concerted effort in the United States to retrain practitioners to do transradial procedures. This approach is initially more time consuming, involves more radiation exposure, and can be frustrating, so we probably need to incentivize physicians by increasing their reimbursement for transradial PCIs and by making it part of quality assurance programs. Unless we do something like that, transradial use may not change,” Dr. Grines said in an interview.

Dr. Cindy L. Grines

The significant superiority of transradial over transfemoral access for both patient survival and for one of the study’s primary endpoints contrasted with the neutral result seen in an earlier major study that compared the two access approaches, RIVAL (Radial Versus Femoral Access for Coronary Angiography and Intervention in Patients With Acute Coronary Syndromes; Lancet 2011;377:1409-20).

Dr. Valgimigli also reported results from a meta-analysis that combined the MATRIX and RIVAL results with data from several much smaller trials. This combined analysis, which involved more than 19,000 ACS patients randomized to PCI via one of the two access sites, further confirmed that transradial catheterization was linked with statistically significant reductions in death, in major bleeds not associated with coronary artery surgery, and in the combined endpoint of death, myocardial infarction, and stroke, he said. Concurrent with his report at the meeting, the MATRIX results and the updated meta-analysis appeared in an article published online (Lancet 2015 [doi:10.1016/S0140-6736(15)60292-6]).

The MATRIX study used only highly experienced interventionalists who had extensive familiarity with performing PCI using both types of access. They successfully used transradial access in 94% of patients randomized to that approach, but in the other 6% technical difficulties resulted in a crossover to the transfemoral route. Among patients randomized to transfemoral access, 2% required crossover to a transradial procedure.

MATRIX was an investigator-initiated study that received grant support from Terumo and the Medicines Co. Dr. Valgimigli had no relevant financial disclosures. Dr. Jolly has been a consultant to AstraZeneca, has been a speaker on behalf of St. Jude, and has received research grants from Medtronic. Dr. Grines has been a consultant to and received honoraria from Abbott Vascular, the Medicines Co., Merck, and the Volcano Group.

[email protected]

On Twitter @mitchelzoler


U.S. interventionalists have lagged in adopting transradial access for percutaneous coronary interventions. I reviewed the status within the past year and found that about 17% of all U.S. percutaneous coronary interventions were done by transradial access, and this level jibes with recent results from a survey of U.S. interventionalists.

Dr. David E. Kandzari

American use of transradial access grew markedly over the last decade. Ten years ago, the rate stood at about 3%. Even so, U.S. use remains well behind most other countries: results reported at the ACC meeting from another large international study of coronary interventions in ST-elevation myocardial infarction patients showed a 68% worldwide rate of transradial access (N. Engl. J. Med. 2015 [doi:10.1056/NEJMoa1415098]).

I believe that the MATRIX results will help further fuel change in U.S. practice. Soon, quality assurance programs at many U.S. hospitals may incorporate transradial access as a performance measure.

The accumulated evidence, now including the MATRIX results, supports transradial access as the default approach for vascular access during coronary procedures. However, in some patients transradial access is impossible, especially in some women, in the elderly, and in patients with a high body mass index.

Dr. David E. Kandzari, director of interventional cardiology at the Piedmont Heart Institute in Atlanta, made these comments in an interview. He has been a consultant to Medtronic and Boston Scientific and has received research support from Abbott Vascular, Biotronik, Boston Scientific, and Medtronic.


Article Source

AT ACC 15

Vitals

Key clinical point: ACS patients randomized to transradial access for PCI had significantly better survival at 30 days, compared with those treated via the transfemoral route.

Major finding: Thirty-day all-cause death occurred in 0.6% fewer patients using transradial access, a 28% relative risk reduction.

Data source: MATRIX, a multicenter, randomized controlled trial with 8,404 patients.

Disclosures: MATRIX was an investigator-initiated study that received grant support from Terumo and the Medicines Co. Dr. Valgimigli had no relevant financial disclosures. Dr. Jolly has been a consultant to AstraZeneca, has been a speaker on behalf of St. Jude, and has received research grants from Medtronic. Dr. Grines has been a consultant to and received honoraria from Abbott Vascular, the Medicines Co., Merck, and the Volcano Group.

Time to listen

Article Type
Changed
Display Headline
Time to listen

Doctors talk more than they listen, especially when discussing end-of-life care with families, a recent study in Pediatrics suggests. Researchers from the University of Amsterdam followed 27 physicians and 37 parents as they navigated the difficult waters of end-of-life decision making for their children. By analyzing recorded conversations, they found that physicians spoke 67% of the time, while parents spoke only 30% of the time and nurses 3%. Additionally, they found that physicians “focused primarily on providing medical information, explaining the preferred course of action, and informing parents about the decision being reached by the team” (Pediatrics 2015;135:e465-76). Although parents were present during discussions, they were not routinely part of the decision-making process.

While this study was performed in Amsterdam and may not perfectly reflect the cultural norms of the United States, the results should still give us pause and raise important questions. Do we spend too much time talking, and too little listening? What role do parents have in decision making in our own country? Although we all participate in family-centered rounds, how often are the parents present but not involved? Do we pause often enough to explain in plain English what we have just rattled off in medicalese?

Dr. Bryan Sisk

The challenge of listening is that it takes time and energy. With many other patients to care for and a long list of notes and orders to be entered, spending more time listening to families can seem exhausting and less important. However, this is the crux of the physician-patient relationship, and this is what makes the role of physician so important. When sick children and their families are at the most vulnerable point in their lives, it is our presence as empathizing, listening humans that matters most. Treating the disease with the correct medications is important but insufficient. And the sicker the patient, the higher the stakes.

As medical trainees, we often can feel powerless in these high-intensity situations. Yet, we can play a key role by advocating on behalf of our patients and their families, by giving them a voice. We can do this only by taking time to ask questions and to listen. After spending a few more minutes with these families in need, we can better understand their hopes and values, and we can identify the ways in which our goals align. As our medical teams are zipping through family-centered rounds, we can advocate for families by raising their questions and concerns, ensuring that their voices are heard. By taking time to listen, we can provide that pivotal bridge of understanding between the medical team and the family.

Dr. Sisk is a pediatrics resident at St. Louis Children’s Hospital. E-mail him at [email protected].

References

Author and Disclosure Information

Publications
Topics
Legacy Keywords
listen, end of life decisions, family-centered rounds
Sections
Author and Disclosure Information

Author and Disclosure Information

Doctors talk more than they listen, especially when discussing end-of-life care with families, a recent study in Pediatrics suggests. Researchers from the University of Amsterdam followed 27 physicians and 37 parents as they navigated the difficult waters of end-of-life decision making for their children. By analyzing recorded conversations, they found that physicians spoke 67% of the time, while parents spoke only 30% of the time and nurses 3%. Additionally, they found that physicians “focused primarily on providing medical information, explaining the preferred course of action, and informing parents about the decision being reached by the team”(Pediatrics 2015;135:e465-76). Although parents were present during discussions, they were not routinely part of the decision-making process.

While this study was performed in Amsterdam and may not perfectly reflect the cultural norms of the United States, the results still should give us pause and raise important questions. Do we spend too much time talking, and too little listening? What role do parents have in decision making in our own country? Although we all participate in family-centered rounds, how often are the parents present but not involved? Do we pause often enough to explain in plain English what we had just rattled off in medicalese?

Dr. Bryan Sisk

The challenge of listening is that it takes time and energy. With many other patients to care for and a long list of notes and orders to be entered, spending more time listening to families can seem exhausting and less important. However, this is the crux of the physician-patient relationship, and this is what makes the role of physician so important. When sick children and their families are at the most vulnerable point in their lives, it is our presence as empathizing, listening humans that matters most. Treating the disease with the correct medications is important but insufficient. And the sicker the patient, the higher the stakes.

As medical trainees, we often can feel powerless in these high-intensity situations. Yet, we can play a key role by advocating on behalf of our patients and their families, by giving them a voice. We can do this only by taking time to ask questions and to listen. After spending a few more minutes with these families in need, we can better understand their hopes and values, and we can identify the ways in which our goals align. As our medical teams are zipping through family-centered-rounds, we can advocate for families by raising their questions and concerns, ensuring that their voices are heard. By taking time to listen, we can provide that pivotal bridge of understanding between the medical team and the family.

Dr. Sisk is a pediatrics resident at St. Louis Children’s Hospital. E-mail him at [email protected].

Doctors talk more than they listen, especially when discussing end-of-life care with families, a recent study in Pediatrics suggests. Researchers from the University of Amsterdam followed 27 physicians and 37 parents as they navigated the difficult waters of end-of-life decision making for their children. By analyzing recorded conversations, they found that physicians spoke 67% of the time, while parents spoke only 30% of the time and nurses 3%. Additionally, they found that physicians “focused primarily on providing medical information, explaining the preferred course of action, and informing parents about the decision being reached by the team”(Pediatrics 2015;135:e465-76). Although parents were present during discussions, they were not routinely part of the decision-making process.

While this study was performed in Amsterdam and may not perfectly reflect the cultural norms of the United States, the results still should give us pause and raise important questions. Do we spend too much time talking, and too little listening? What role do parents have in decision making in our own country? Although we all participate in family-centered rounds, how often are the parents present but not involved? Do we pause often enough to explain in plain English what we had just rattled off in medicalese?

Dr. Bryan Sisk

The challenge of listening is that it takes time and energy. With many other patients to care for and a long list of notes and orders to be entered, spending more time listening to families can seem exhausting and less important. However, this is the crux of the physician-patient relationship, and this is what makes the role of physician so important. When sick children and their families are at the most vulnerable point in their lives, it is our presence as empathizing, listening humans that matters most. Treating the disease with the correct medications is important but insufficient. And the sicker the patient, the higher the stakes.

As medical trainees, we often can feel powerless in these high-intensity situations. Yet, we can play a key role by advocating on behalf of our patients and their families, by giving them a voice. We can do this only by taking time to ask questions and to listen. After spending a few more minutes with these families in need, we can better understand their hopes and values, and we can identify the ways in which our goals align. As our medical teams are zipping through family-centered-rounds, we can advocate for families by raising their questions and concerns, ensuring that their voices are heard. By taking time to listen, we can provide that pivotal bridge of understanding between the medical team and the family.

Dr. Sisk is a pediatrics resident at St. Louis Children’s Hospital. E-mail him at [email protected].

Procedure type found to drive readmissions


The type of surgical procedure performed is a significant predictor of hospital readmission, reported Dr. Kevin R. Kasten and his associates at East Carolina University, Greenville, N.C.

In an analysis of 217,389 surgery patients, postoperative adverse events were significantly associated with an increased readmission risk, specifically unplanned return to the operating room (odds ratio, 8.5; CI, 8.0-9.0), pulmonary embolism (OR, 8.2; CI, 7.1-9.6), deep incisional infection (OR, 7.5; CI, 6.7-8.5), and organ space infection (OR, 5.8; CI, 5.3-6.3). In addition, specific procedures associated with a higher risk for readmission included cystectomy, proctectomy, pancreatectomy, and lower-extremity vascular interventions.

The findings of this study suggest that “adverse events are a better predictor of 30[-day] readmission than patient comorbidity,” the authors said in the report. “As such, efforts to prevent adverse events such as return to the operating room, pulmonary embolism, surgical site infections, and myocardial infarction are crucial to prevention of readmission.”

Read the full article in the Journal of Surgical Research.


Some VTE patients may receive anticoagulants for too long


Even after excluding patients with atrial fibrillation, the duration of anticoagulant treatment for patients with unprovoked venous thromboembolism (VTE) and patients with transient risk factors often exceeded 1 year, according to a study published in Thrombosis Research.

In a large prospective cohort study, lead author Dr. Walter Ageno of the University of Insubria, Varese, Italy, and his associates examined 6,944 patients with a first episode of VTE. In this sample, 55% of patients with unprovoked events, 42% of patients with a transient risk factor, and 43% of patients with cancer received anticoagulant treatment for more than 12 months. The American College of Chest Physicians guideline recommends a 3-month treatment duration for patients with VTE secondary to transient risk factors. Pulmonary embolism at presentation, VTE recurrence while on treatment, chronic heart failure, and advanced age were independently associated with treatment for more than 12 months. Patients who died during the first year of treatment were excluded from the results.

Although clinicians tend to base their VTE treatment decisions on individual risk stratification, “this approach may expose a substantial proportion of patients, in particular those with VTE secondary to transient risk factors, to a possibly unnecessary risk of bleeding,” the investigators concluded.

Read the full article in Thrombosis Research (2015).
