METHODS: We used 5 data collection tools to evaluate the implementation of the intervention, and a combination of descriptive, quantitative, and qualitative analyses. Triangulation was used to attain a complete understanding of the quality of implementation. Twenty-two intervention practices with a total of 54 physicians participated in a randomized controlled trial that took place in Southwestern Ontario, Canada. The key measures of process were the frequency and time involved to deliver intervention components, the scope of the delivery and the utility of the components, and physician satisfaction with the intervention.
RESULTS: Of the 7 components in the intervention model, prevention facilitators (PFs) visited the practices most often to deliver the audit and feedback, consensus building, and reminder system components. All of the study practices received preventive performance audit and feedback, achieved consensus on a plan for improvement, and implemented a reminder system. Ninety percent of the practices implemented a customized flow sheet, and 10% used a computerized reminder system. Ninety-five percent of the intervention practices wanted critically appraised evidence for prevention, 82% participated in a workshop, and 100% received patient education materials in a binder. Content analysis of the physician interviews and bivariate analysis of physician self-reported changes between intervention and control group physicians revealed that the audit and feedback, consensus building, and development of reminder systems were the key intervention components. Ninety-five percent of the physicians were either satisfied or very satisfied with the intervention, and 90% would have been willing to have the PF continue working with their practice.
CONCLUSIONS: Primary care practices in Ontario can implement significant changes in their practice environments that will improve preventive care activity with the assistance of a facilitator. The main components for creating change are audit and feedback of preventive performance, achieving consensus on a plan for improvement, and implementing a reminder system.
A randomized controlled field trial of a multifaceted intervention to improve preventive care, tailored to the needs of participating family practices and delivered by nurses trained in the facilitation of prevention, was conducted in Southwestern Ontario. We focus here on the process evaluation, complementing the outcome evaluation1 by describing how the program was implemented in the intervention practices.
Improving preventive performance is both important and necessary. There is substantial room to improve the rates of appropriate preventive practice.2 The Canadian Task Force on the Periodic Health Examination3,4 has established guidelines for the delivery of preventive care that are supported by clinical evidence as effective in decreasing the impact of disease. However, evidence-based guidelines are not self-implementing.5-7 Changing physicians’ long-held patterns of behavior and the environments in which they work is complex and difficult. Unless the barriers to change can be overcome and actions taken to put preventive care guidelines into practice, evidence-based guideline development efforts will be wasted, and the quality of preventive care will not improve.8
Several reviews have focused on the effectiveness of different interventions for implementing guidelines and improving care.6,7,9-13 Multifaceted interventions employing trained individuals who meet with providers in their practice settings to provide information and assist the practice in implementing evidence-based guidelines have been shown to be more effective than single interventions.11-14 Tailoring interventions to the requirements of the practice has also been proposed as important in supporting practice changes and in attaining more successful outcomes in preventive care performance compared with interventions that are fixed and lack this flexibility.15-17
As important as knowing what interventions work to improve preventive care performance is understanding why they work. The techniques of process evaluation allow the investigator to determine the extent to which the intervention designed to change practice patterns was actually implemented as planned. Adequate documentation of process facilitates replication and fine-tuning of the intervention.
Intervention Description
Our study built on the work of Fullard and colleagues18 and used a tailored multifaceted approach to getting evidence into action by focusing on the educational, attitudinal, and organizational barriers to change and tailoring interventions to the needs of the practice.17-24 The intervention employed 3 prevention facilitators (PFs), each holding a master’s degree in community nursing and having skills and previous experience in facilitation. Each PF had primary responsibility for up to 8 primary care practices with up to 6 physicians per practice.
The PFs underwent 30 weeks of intensive training before being assigned to randomly selected intervention practices. The training covered an orientation session, medical office computer systems, medical practice management, prevention in primary care, evidence-based medicine, and facilitation and audit skills development. Approximately 28 hours per week were spent in training and 7 hours per week in preparation and planning. Six of the 30 weeks of training were spent applying skills in a primary care office setting. Once in the field, they were instructed to offer 7 intervention strategies designed to change practice patterns and improve preventive care performance. The strategies were identified from reviews of the literature and constituted the multifaceted component of the intervention.10,11 The PFs were permitted to tailor these strategies to the needs and unique circumstances of the practice. The strategies were: (1) audit and ongoing feedback, (2) consensus building, (3) opinion leaders and networking, (4) academic detailing and education materials, (5) reminder systems, (6) patient-mediated activities, and (7) patient education materials.
The PFs worked with all physicians and allied health staff in the practice. They provided management support to practices and followed a quality improvement framework similar to that proposed by Leininger and coworkers.25 For each practice the PFs were to: (1) present baseline preventive performance rates, (2) facilitate the development of a practice policy for preventive care, (3) assist in setting goals and desirable levels of performance, (4) assist in the development of a written plan for implementing preventive care, (5) assist in the development and adaptation of tools and the strategies to implement the prevention plan, (6) facilitate meetings to assess progress and modify the plan if necessary, and (7) conduct chart audits to measure the impact of the changes made. The intervention period lasted 18 months and ended in December 1998.
The Figure shows the program logic model describing each of the 7 intervention component strategies and the work activities, outputs, and short-term and long-term objectives associated with each component. It served as a framework for the evaluation of the intervention.26-28 The logic model allowed us to look inside the black box of the intervention29,30 by linking implementation activity to outcomes, and provided a framework to explore which elements worked and why.
Intervention Outcomes
The goal of the intervention was to increase the performance of 8 preventive maneuvers supported by evidence as appropriate and decrease the performance of 5 preventive maneuvers supported by evidence as inappropriate.1 An absolute change over time of 11.51% in preventive care performance in favor of intervention practices was found (F=19.29 [df=1,43]; P<.0001). In other words, the intervention practices improved preventive performance by 36% in relative terms, going from 31% of eligible patients having received preventive care to 43%, while the control practices remained at 32%.1
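The relationship between the absolute and relative changes reported above can be verified with simple arithmetic (a sketch using the rounded percentages from the text; the published 11.51% absolute and 36% relative figures come from the unrounded trial data, so the check agrees only approximately):

```python
# Reported performance rates (proportions of eligible patients receiving care)
baseline_intervention = 0.31   # intervention practices at baseline
followup_intervention = 0.43   # intervention practices at follow-up
baseline_control = 0.32        # control practices (essentially unchanged)

# Absolute change: difference in percentage points
absolute_change = followup_intervention - baseline_intervention  # ~0.12 with rounded inputs

# Relative improvement: absolute change as a fraction of the baseline rate
relative_change = absolute_change / baseline_intervention  # ~0.39 with rounded inputs

print(f"absolute: {absolute_change:.2%}, relative: {relative_change:.0%}")
```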
Methods
Research Questions
There were 2 objectives to our process evaluation: to document the extent to which the intervention was implemented with fidelity and to gain insight into how facilitation worked to improve preventive performance. The process evaluation was designed to answer questions concerning: (1) the time involved to deliver intervention components, (2) the quality of the delivery of intervention components, and (3) physician satisfaction with the intervention components. Quality was assessed by examining the scope or range of delivery of the intervention components and by analyzing the feedback received from practices on the usefulness of the intervention components.
Setting
The intervention arm of the trial included 22 practices with 54 physicians (Table 1). All health service organizations (HSOs) in Southwestern Ontario were approached to participate in the study. HSOs are primary care practices reimbursed primarily through capitation rather than fee for service. A total of 46 of the 100 primary care practices were recruited (response rate=46%). At follow-up only one intervention practice was lost, because the entire practice had moved. Intervention and control group practices did not differ significantly on any of the measured demographic characteristics (Table 2). Complete details on practice recruitment and attrition rates are published elsewhere.1
The practices covered a geographic area where the greatest distance between any 2 practices was more than 600 kilometers. PFs were assigned practices within a specific region of this geographic area. They arranged times to visit and work with intervention practices and traveled by car between visits to practices. PFs worked independently from their residences, corresponded with the project team regularly by electronic mail, and met with the team quarterly at scheduled meetings.
Data Collection Tools
Each intervention practice was visited regularly by the same nurse facilitator, who documented her activities and progress on 2 structured forms known as the weekly activity sheet and the monthly narrative report. Weekly activity sheets noted the number of hours spent on both on-site and off-site activities. Monthly narrative reports provided detailed information on the number of visits to a practice, the activities within each practice, the outcomes of those activities, the number of participants in meetings, and the plan for the following month. The activities in the narrative reports were summarized by intervention component to provide a cumulative overview of all intervention activity within a practice.
Also during the intervention, semistructured telephone interviews of participating physicians were conducted by 2 physician members of the project team at 6 months and 17 months. Participating physicians were asked what they had been happy and unhappy with and their ideas for improvement. Closed-ended questions measured overall satisfaction with the intervention. The interview at 17 months also asked physicians if they would agree to have a nurse facilitator continue to visit their practice if funding were found.
At the end of the intervention, the PFs conducted interviews with each of the physicians identified as the primary contact in the intervention practices to solicit feedback on their experience. Physicians in both the intervention and control arm were sent a questionnaire by mail to report any changes that had taken place in their practice over the preceding 18 months.
Analysis
Data were analyzed to address the 3 research questions for the process evaluation utilizing the Logic Model as the conceptual framework (Figure). To determine how often various intervention components were delivered, the total hours spent at each practice and the total number of contacts with each practice by intervention component were calculated from the PF activity sheets.
To determine intervention quality, triangulation31 was used to attain a complete understanding of the quality of implementation. Multiple data sources and analysis methods were used to reveal the underlying dimensions of quality. All data sources were reviewed and analyses were conducted independently by 2 members of the investigation team. The members of the team held a debriefing session to discuss their findings and seek consensus. First, the monthly narrative reports across intervention sites were summarized to qualitatively describe the type, breadth, and scope of activity for each intervention component. Second, the activity descriptions and open-ended interview responses were content analyzed32 and coded, and frequencies were generated. The goal of this analysis was to identify significant descriptions of which intervention elements worked well and which did not. Finally, intervention and control practices were compared with contingency tables, and a chi-square statistic was used to determine differences on questionnaire responses concerning practice changes over the period of the intervention.
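The contingency-table comparison described above can be sketched for a single questionnaire item. The counts below are hypothetical (chosen only to roughly match the 71% vs 28% audit result reported later); the Pearson chi-square statistic for a 2x2 table has a convenient closed form:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)
    for a 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    # Closed-form expression for a 2x2 table
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: physicians reporting a practice audit (yes/no) by arm
#                 yes  no
# intervention     33  13   (~71% of 46)
# control          13  34   (~28% of 47)
stat = chi_square_2x2(33, 13, 13, 34)
print(f"chi-square = {stat:.1f}")  # values above ~10.8 correspond to P < .001 at 1 df
```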
To determine physician satisfaction, open-ended satisfaction survey responses were coded and frequencies generated for ratings of overall satisfaction with the performance of the PF and the intervention.
Results
PF Program Implementation
Table 3 shows the number of hours spent on project activities during the period of the intervention. The PFs spent the largest proportion of their time (28%) on administrative duties, such as team meetings, telephone calls, internal reporting, preparing the project newsletter, coordinating networking conferences for intervention practices, photocopying, and filing. Sixteen percent of the PFs’ time was spent on-site facilitating changes to improve preventive care in the practice. Travel accounted for an average of 12% of the PFs’ time, although this varied depending on the distance to the practices.
Table 4 provides information on the number of contacts and hours spent on-site for each component of the intervention. On average, each intervention practice was contacted 33 times by a PF, with each visit lasting an average of 1 hour and 45 minutes. The most frequent forms of contact concerned developing reminder systems, conducting chart audits and providing feedback on preventive care performance, and working to achieve consensus on the adoption of preventive care strategies. Both academic detailing to physicians and supplying critically appraised patient education materials averaged approximately 20 minutes but involved a great deal of preparation time. Few practices were interested in posters in the waiting room or a patient newsletter on prevention, so fewer contacts were made for those components.
Quality of Implementation
To assess quality, the frequency of each component of the intervention was tallied, physician feedback on the usefulness of intervention components was summarized, and self-reported practice changes of intervention and control physicians were compared.
Intervention Scope
Audit and Feedback. All 22 intervention practices received a presentation by the PF on the initial audit results to raise awareness of preventive care practice patterns. This was usually done in a kick-off meeting involving both physicians and nurses and often required more than one presentation to cover the various staff in the practice. Twenty practices requested subsequent analyses of data to follow their rates of performance. In addition, 18 practices requested audits of their charts for specific maneuvers, such as influenza vaccination and mammography.
Consensus Building. All practices were involved in meetings with the PF to identify opportunities for improvement, assess needs, and select priority areas and strategies for improving preventive care performance. Interviews were conducted with nurses and other staff to promote their role in preventive care delivery.
Academic Detailing. Twenty-one out of 22 sites agreed to receive and discuss critically appraised evidence for the preventive maneuvers under study, and some requested similar information on other preventive areas, such as cholesterol and osteoporosis.
Reminder Systems. All of the intervention sites implemented some form of reminder system. Eighteen sites implemented a preventive care flow sheet; 2 sites used a chart stamp; and 2 sites implemented a computerized reminder system. Nineteen sites developed recall initiatives for flu vaccine, mammography, and Papanicolaou tests. Seventeen sites implemented chart stickers for smoking counseling or mammography.
Opinion Leaders. All sites received copies of the PF project newsletter that contained articles by influential individuals describing the importance of preventive care and descriptions of colleagues’ preventive care implementation efforts. Most practices attended a workshop that included an influential keynote speaker, and 27% of the participating physicians shared their knowledge about preventive care through publishing in the newsletter and/or public speaking.
Patient Education. All sites were provided with patient education materials from credible sources on request, and all received a binder of patient education materials constructed specifically to contain materials on the appropriate preventive maneuvers under study. The binders were regularly updated.
Patient Mediated. Posters designed to prompt patients to ask about folic acid, flu vaccine, and mammography were offered to all sites. Thirteen sites implemented a patient consent form for prostate-specific antigen (PSA) testing. Eight sites received preventive care diaries for patients. Five sites had a prevention newsletter for patients. Four sites agreed to pilot a health risk appraisal software program.
Physician Feedback
At the end of the intervention the facilitator asked physicians about their experience, including what was most and least useful to them. Table 5 provides a summary of the content analysis of physician responses.
Audit and feedback, both initially and subsequently, comprised the component most frequently considered to be important in creating change. Almost as often, the preventive care flow sheet was identified as useful. The facilitator sessions designed to seek consensus on preventive care guidelines and strategies for implementation were also appreciated.
Several physicians did not agree with the evidence on PSA testing. Others did not feel that counseling for folic acid was a priority. Some found the patient education binder cumbersome, and others found the sticker system for tobacco counseling unwieldy. Thus, both were underused. Two physicians noted that the preventive care wall chart was not helpful.
Physician Self-Reported Practice Changes
Eighty-six percent (93/108) of the intervention and control physicians responded to a questionnaire at the end of the study. Due to sample size, statistical power was limited to detecting an absolute difference of approximately 0.30 between groups, assuming an alpha of 0.05 and 80% power.33 Table 6 shows that 71% of intervention physicians compared with 28% of control physicians reported an audit of their practice for preventive services (P<.001). By the end of the study, 65% of the intervention physicians versus 48% of the control physicians indicated that they had a prevention policy or screening protocol in place, and 70% of intervention physicians compared with 58% of control physicians had created reminder systems for disease prevention.
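The stated power limit can be reproduced with the standard normal-approximation formula for comparing two proportions (a sketch assuming a worst-case proportion of 0.5 in both groups and the 93 respondents split roughly evenly, about 46 per arm):

```python
import math

def detectable_difference(n_per_group, p=0.5,
                          z_alpha=1.96,    # normal quantile for two-sided alpha = .05
                          z_beta=0.8416):  # normal quantile for 80% power
    """Smallest absolute difference in proportions detectable with
    n subjects per group, using the normal-approximation formula and
    a worst-case variance at p = 0.5 for both groups."""
    variance = 2 * p * (1 - p)  # p1(1 - p1) + p2(1 - p2), maximized at p = 0.5
    return (z_alpha + z_beta) * math.sqrt(variance / n_per_group)

print(f"{detectable_difference(46):.2f}")  # ~0.29, i.e., roughly the 0.30 stated
```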
Satisfaction with PF Intervention
At the telephone interview 6 months into the intervention, the mean satisfaction rating of intervention physicians was 4.08 on a scale of 1 (very dissatisfied) to 5 (very satisfied) with 80% satisfied with the intervention. At 17 months the mean satisfaction rating had risen to 4.5 with a 95% satisfaction rate.
At 6 months, 85% of the practices were satisfied with the frequency of visits of their assigned facilitator. At 17 months there was a 64% satisfaction rate, with the remaining 36% wanting more visits from the facilitator. The physicians commented on how the intervention had focused them on prevention in their practice. When the physicians were asked if they would agree to have a facilitator visit their practice in the future if given the opportunity, 90% agreed.
Concerns included not being able to continue the recall of patients after the end of the intervention and the view that the inappropriate maneuvers were too controversial. A physician from a large practice with 6 physicians commented that the facilitator could not easily work in the complex practice environment.
Discussion
Our study demonstrates that PFs can significantly improve the delivery of preventive services and in the process make quality contributions to a practice environment with high satisfaction rates from participating physicians.
Our intervention had a higher frequency and intensity of visits than other studies of this genre. The PFs made an average of almost 2 visits per month lasting approximately 105 minutes per visit. Dietrich and colleagues21 reported only 4 visits over a 3-month period lasting an average of 120 minutes, and Hulscher and coworkers22 reported approximately 25 visits with an average duration of 73 minutes. Other interventions involved even fewer visits,20 and in some studies visit frequency was not reported.34-36
The critical intervention components as evidenced by physician feedback, changes between control and intervention practices, and the amount of facilitator time spent on each component were: (1) audit and feedback, (2) sharing and discussing information to build consensus on an action plan, and (3) a reminder system. Similarly, the Cancer Prevention in Community Practice Project achieved 100% success in implementing change using customized preventive care flowsheets.37 Of the 7 intervention components, patient education materials and patient-mediated interventions such as posters in the waiting room were considered to be the least useful.
Overall, physicians and nurses working within the practices were very satisfied with the intervention, and 90% were willing to have the nurse facilitator continue working with their practice.
Lessons learned from the process evaluation for improving the delivery of the outreach facilitation intervention include:
Focusing on the 3 key intervention components (audit and feedback, seeking consensus on an action plan, and implementing a reminder system) and tailoring these to the needs of the practice
Preparing patient education and patient-mediated materials only if the practice requests such materials
Developing simpler strategies to encourage physicians to counsel their patients who smoke to quit
Providing the facilitators an administrative assistant to reduce the amount of their time spent on administrative duties for the practices and increase time on-site
Strengths
The strengths of the study include the completeness of the data set, the theoretical framework for data collection, the use of multiple data sources and data collection methods, and the prospective data collection methodology.
Limitations
There are several limitations to the process evaluation methods. Much of the data was provided by the facilitators themselves, and therefore the possibility of bias exists. The study population consisted of HSOs, and therefore the results may not be generalizable. There is a possibility of social desirability bias in the satisfaction rates. Finally, our analyses of the process data were descriptive and exploratory.
Conclusions
Process evaluation often identifies future areas of research. Follow-up of the few practices that were dissatisfied with facilitation should be carried out to understand why they were dissatisfied. Sustainability needs to be addressed. For example, Dietrich and colleagues38 found that 5-year durability of a preventive services office system depended on the physician’s preventive care philosophy. McCowan and coworkers39 found that the effect of a facilitator was not sustained for 2 years. Finally, to maximize cost-effectiveness, more research is required to determine how much of a dose of facilitation is required and how frequently facilitators should visit to achieve a positive outcome.
Acknowledgments
We wish to acknowledge the financial support of the Ontario Ministry of Health, as well as the substantial contributions of the 3 nurse facilitators (Ingrid LeClaire, Ann MacLeod, and Ruth Blochlinger). We also wish to thank the many physicians and nurses who participated in the study.
1. Lemelin J, Hogg W, Baskerville B. Evidence to action: a tailored multi-faceted approach to changing family physician practice patterns and improving preventive care. CMAJ. In press.
2. Hutchison B, Woodward CA, Norman GR, Abelson J, Brown JA. Provision of preventive care to unannounced standardized patients. CMAJ 1998;158:185-93.
3. Spitzer WO. The Canadian Task Force on the Periodic Health Examination: the periodic examination. CMAJ 1979;121:1193-254.
4. Canadian Task Force on the Periodic Health Examination: the Canadian guide to clinical preventive health care. Ottawa, Canada: Health Canada; 1994.
5. Tamblyn R, Battista RN. Changing clinical practice: what interventions work? J Cont Edu Health Prof 1993;13:273-88.
6. Davis DA, Thompson MA, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA 1995;274:700-05.
7. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. The Cochrane Effective Practice and Organization of Care Review Group. BMJ 1998;317:465-68.
8. The University of York. Effective health care: getting evidence into practice. NHS Centre for Reviews and Dissemination 1999;5:1-16.
9. Lomas J, Haynes RB. A taxonomy and critical review of tested strategies for the application of clinical practice recommendations: from official to individual clinical policy. Am J Prev Med 1988;4(suppl):77-94.
10. Oxman AD, Thomson MA, Davis DA, Haynes B. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ 1995;153:1423-52.
11. Wensing M, Grol R. Single and combined strategies for implementing changes in primary care: a literature review. Int J Qual Health Care 1994;6:115-32.
12. Wensing M, van der Weijden T, Grol R. Implementing guidelines and innovations in general practice: which interventions are effective? Br J Gen Pract 1998;48:991-97.
13. Hulscher MEJL, Wensing M, Grol R, Weijden T, van Weel C. Interventions to improve the delivery of preventive services in primary care. Am J Public Health 1999;89:737-46.
14. Thomson MA, Oxman AD, Davis DA, Haynes RB, Freemantle N, Harvey E. Educational outreach visits: effects on professional practice and health care outcomes (Cochrane review). In: The Cochrane library. Oxford, England: Update Software; 1999.
15. Cohen SJ, Halvorson HW, Gosselink CA. Changing physician behavior to improve disease prevention. Prev Med 1994;23:284-91.
16. Main DS, Cohen SJ, DiClemente CC. Measuring physician readiness to change cancer screening: preliminary results. Am J Prev Med 1995;11:54-58.
17. Hulscher MEJL, Van Drenth BB, Mokkink HGA, et al. Tailored outreach visits as a method for implementing guidelines and improving preventive care. Int J Qual Health Care 1998;10:105-12.
18. Fullard E, Fowler G, Gray J. Facilitating prevention in primary care. BMJ 1984;289:1585-87.
19. Fullard E, Fowler G, Gray J. Facilitating prevention in primary care: a controlled trial of a low technology, low cost approach. BMJ 1987;294:1080-82.
20. Kottke TE, Solberg LI, Brekke ML. A controlled trial to integrate smoking cessation advice into primary care: doctors helping smokers, round III. J Fam Pract 1992;34:701-08.
21. Dietrich AJ, O’Connor GT, Keller A, Karney PA, Levy D, Whaley F. Cancer: improving early detection and prevention: a community practice randomised trial. BMJ 1992;304:687-91.
22. Hulscher M, Van Drenth B, van de Wouden J, Mokkink H, van Weel C, Grol R. Changing preventive practice: a controlled trial on the effects of outreach visits to organise prevention of cardiovascular disease. Qual Health Care 1997;6:19-24.
23. Dietrich AJ, Tobin JN, Sox CH, et al. Cancer early-detection services in community health centers for the underserved: a randomized controlled trial. Arch Fam Med 1998;7:320-27.
24. Dietrich AJ, Sox CH, Tosteson TD, Woodruff CB. Durability of improved physician early detection of cancer after conclusion of intervention support. Cancer Epidemiol Biomarkers Prev 1994;3:335-40.
25. Leininger LS, Leonard F, Larry D, et al. An office system for organizing preventive services: a report by the American Cancer Society Advisory Group on Preventive Health Care Reminder Systems. Arch Fam Med 1996;5:108-15.
26. Rush B, Ogborne A. Program logic models: expanding their role and structure for program planning and evaluation. Can J Prog Eval 1991;6:96-106.
27. Wong-Reiger D, David L. Using program logic models to plan and evaluate education and prevention programs. Arnold Love, ed. Canadian Evaluation Society; 1995.
28. Kanouse D, Kallich J, Kahan J. Dissemination of effectiveness and outcomes research. Health Policy 1995;34:167-92.
29. Chen HT, Rossi PH. Evaluating with sense: the theory-driven approach. Eval Rev 1983;7:283.
30. Stange KC, Zyzanski SJ, Jáen CR, et al. Illuminating the “black box”: a description of 4454 patient visits to 138 family physicians. J Fam Pract 1998;46:377-89.
31. Fielding N, Fielding J. Linking data. Beverly Hills, Calif: Sage Publications Inc; 1986.
32. Weber RP. Basic content analysis. 7-049 ed. Newbury Park, Calif: Sage Publications Inc; 1985.
33. Fleiss JL. Statistical methods for rates and proportions. 2nd ed. New York, NY: John Wiley & Sons; 1981.
34. Cockburn J, Ruth D, Silagy C, et al. Randomized trial of three approaches for marketing smoking cessation programmes to Australian general practitioners. BMJ 1992;304:691-94.
35. Manfredi C, Czaja R, Freels S, Trubitt M, Warnecke R, Lacey L. Prescribe for health: improving cancer screening in physician practices serving low-income and minority populations. Arch Fam Med 1998;7:329-37.
36. Kinsinger LS, Harris R, Qaqish B, Strecher V, Kaluzny A. Using an office system intervention to increase breast cancer screening. J Gen Intern Med 1998;13:507-14.
37. Carney PA, Dietrich AJ, Keller A, Landgraf J, O’Connor GT. Tools, teamwork and tenacity: an office system for cancer prevention. J Fam Pract 1992;35:388-94.
38. Rebelsky M, Sox CH, Dietrich AJ, Schwab BR, Labaree CE, Brown-Mckinney N. Physician preventive care philosophy and the five year durability of a preventive services office system. Soc Sci Med 1996;43:1073-81.
39. McCowan C, Neville RG, Crombie IK, Clark RA, Warner FC. The facilitator effect: results from a four-year follow-up of children with asthma. Br J Gen Pract 1997;47:156-60.
METHODS: We used 5 data collection tools to evaluate the implementation of the intervention, and a combination of descriptive, quantitative, and qualitative analyses. Triangulation was used to attain a complete understanding of the quality of implementation. Twenty-two intervention practices with a total of 54 physicians participated in a randomized controlled trial that took place in Southwestern Ontario, Canada. The key measures of process were the frequency and time involved to deliver intervention components, the scope of the delivery and the utility of the components, and physician satisfaction with the intervention.
RESULTS: Of the 7 components in the intervention model, prevention facilitators (PFs) visited the practices most often to deliver the audit and feedback, consensus building, and reminder system components. All of the study practices received preventive performance audit and feedback, achieved consensus on a plan for improvement, and implemented a reminder system. Ninety percent of the practices implemented a customized flow sheet, and 10% used a computerized reminder system. Ninety-five percent of the intervention practices wanted critically appraised evidence for prevention, 82% participated in a workshop, and 100% received patient education materials in a binder. Content analysis of the physician interviews and bivariate analysis of physician self-reported changes between intervention and control group physicians revealed that the audit and feedback, consensus building, and development of reminder systems were the key intervention components. Ninety-five percent of the physicians were either satisfied or very satisfied with the intervention, and 90% would have been willing to have the PF continue working with their practice.
CONCLUSIONS: Primary care practices in Ontario can implement significant changes in their practice environments that will improve preventive care activity with the assistance of a facilitator. The main components for creating change are audit and feedback of preventive performance, achieving consensus on a plan for improvement, and implementing a reminder system.
A randomized controlled field trial of a multifaceted intervention to improve preventive care tailored to the needs of participating family practices was conducted in Southwestern Ontario and delivered by nurses trained in the facilitation of prevention. We focus on the process evaluation and complement the outcome evaluation1 by describing how the program was implemented in the intervention practices.
Improving preventive performance is both important and necessary. There is substantial room to improve the rates of appropriate preventive practice.2 The Canadian Task Force on the Periodic Health Examination3,4 has established guidelines for the delivery of preventive care that are supported by clinical evidence as effective in decreasing the impact of disease. However, evidence-based guidelines are not self-implementing.5-7 Changing physicians’ long-held patterns of behavior and the environments in which they work is complex and difficult. Unless the barriers to change can be overcome and actions taken to put preventive care guidelines into practice, evidence-based guideline development efforts will be wasted, and the quality of preventive care will not improve.8
Several reviews have focused on the effectiveness of different interventions for implementing guidelines and improving care.6,7,9-13 Multifaceted interventions employing trained individuals who meet with providers in their practice settings to provide information and assist the practice in implementing evidence-based guidelines have been shown to be more effective than single interventions.11-14 Tailoring interventions to the requirements of the practice has also been proposed as important in supporting practice changes and in attaining more successful outcomes in preventive care performance compared with interventions that are fixed and lack this flexibility.15-17
As important as knowing what interventions work to improve preventive care performance is understanding why they work. The techniques of process evaluation allow the investigator to determine the extent to which the intervention designed to change practice patterns was actually implemented as planned. Adequate documentation of process facilitates replication and fine-tuning of the intervention.
Intervention Description
Our study built on the work of Fullard and colleagues18 and used a tailored multifaceted approach to getting evidence into action by focusing on the educational, attitudinal, and organizational barriers to change and tailoring interventions to the needs of the practice.17-24 The intervention employed 3 prevention facilitators (PFs), each with a master’s degree in community nursing and previous skills and experience in facilitation. Each PF had primary responsibility for up to 8 primary care practices with up to 6 physicians per practice.
The PFs underwent 30 weeks of intensive training before being assigned to randomly selected intervention practices. The training covered an orientation session, medical office computer systems, medical practice management, prevention in primary care, evidence-based medicine, and facilitation and audit skills development. Approximately 28 hours per week were spent in training and 7 hours per week in preparation and planning. Six of the 30 weeks of training were spent applying skills in a primary care office setting. Once in the field, they were instructed to offer 7 intervention strategies designed to change practice patterns and improve preventive care performance. The strategies were identified from reviews of the literature and constituted the multifaceted component of the intervention.10,11 The PFs were permitted to tailor these strategies to the needs and unique circumstances of the practice. The strategies were: (1) audit and ongoing feedback, (2) consensus building, (3) opinion leaders and networking, (4) academic detailing and education materials, (5) reminder systems, (6) patient-mediated activities, and (7) patient education materials.
The PFs worked with all physicians and allied health staff in the practice. They provided management support to practices and followed a quality improvement framework similar to that proposed by Leininger and coworkers.25 For each practice the PFs were to: (1) present baseline preventive performance rates, (2) facilitate the development of a practice policy for preventive care, (3) assist in setting goals and desirable levels of performance, (4) assist in the development of a written plan for implementing preventive care, (5) assist in the development and adaptation of tools and the strategies to implement the prevention plan, (6) facilitate meetings to assess progress and modify the plan if necessary, and (7) conduct chart audits to measure the impact of the changes made. The intervention period lasted 18 months and ended in December 1998.
The Figure is the program logic model describing each of the 7 intervention component strategies and the associated work activities, outputs, and short-term and long-term objectives associated with each component. It served as a framework for the evaluation of the intervention.26-28 The logic model allowed us to look inside the black box of the intervention29,30 by linking implementation activity to outcomes, and provided a framework to explore which elements worked and why.
Intervention Outcomes
The goal of the intervention was to increase the performance of 8 preventive maneuvers supported by evidence as appropriate and decrease the performance of 5 preventive maneuvers supported by evidence as inappropriate.1 An absolute change over time of 11.51% in preventive care performance in favor of intervention practices was found (F=19.29 [df=1,43]; P<.0001). In other words, the intervention practices improved preventive performance by 36%, going from 31% of eligible patients having received preventive care to 43%, while the control practices remained at 32%.1
Methods
Research Questions
There were 2 objectives to our process evaluation: to document the extent to which the intervention was implemented with fidelity and to gain insight into how facilitation worked to improve preventive performance. The process evaluation was designed to answer questions concerning: (1) the time involved to deliver intervention components, (2) the quality of the delivery of intervention components, and (3) physician satisfaction with the intervention components. Quality was assessed by examining the scope or range of delivery of the intervention components and by analyzing the feedback received from practices on the usefulness of the intervention components.
Setting
The intervention arm of the trial included 22 practices with 54 physicians (Table 1). All health service organizations (HSOs) in Southwestern Ontario were approached to participate in the study. HSOs are primary care practices reimbursed primarily through capitation rather than fee for service. A total of 46 of the 100 primary care practices were recruited (response rate=46%). At follow-up only one intervention practice was lost, because the entire practice had moved. Intervention and control group practices did not differ significantly on any of the measured demographic characteristics (Table 2). Complete details on practice recruitment and attrition rates are published elsewhere.1
The practices covered a geographic area where the greatest distance between any 2 practices was more than 600 kilometers. PFs were assigned practices within a specific region of this geographic area. They arranged times to visit and work with intervention practices and traveled by car between practices. PFs worked independently from their residences and corresponded with the project team regularly through electronic mail and quarterly at scheduled meetings.
Data Collection Tools
Each intervention practice was visited regularly by the same nurse facilitator, who documented her activities and progress on 2 structured forms known as the weekly activity sheet and the monthly narrative report. Weekly activity sheets noted the number of hours spent on both on-site and off-site activities. Monthly narrative reports provided detailed information on the number of visits to a practice, the activities within each practice, the outcomes of those activities, the number of participants in meetings, and the plan for the following month. The activities in the narrative reports were summarized by intervention component to provide a cumulative overview of all intervention activity within a practice.
Also during the intervention, semistructured telephone interviews of participating physicians were conducted by 2 physician members of the project team at 6 months and 17 months. Participating physicians were asked what they had been happy and unhappy with and for their ideas for improvement. Closed-ended questions measured overall satisfaction with the intervention. The interview at 17 months also asked physicians if they would agree to have a nurse facilitator continue to visit their practice if funding were found.
At the end of the intervention, the PFs conducted interviews with each of the physicians identified as the primary contact in the intervention practices to solicit feedback on their experience. Physicians in both the intervention and control arm were sent a questionnaire by mail to report any changes that had taken place in their practice over the preceding 18 months.
Analysis
Data were analyzed to address the 3 research questions for the process evaluation utilizing the Logic Model as the conceptual framework (Figure). To determine how often various intervention components were delivered, the total hours spent at each practice and the total number of contacts with each practice by intervention component were calculated from the PF activity sheets.
To determine intervention quality, triangulation31 was used to attain a complete understanding of the quality of implementation. Multiple data sources and analysis methods were used to reveal the underlying dimensions of quality. All data sources were reviewed and analyses were conducted independently by 2 members of the investigation team. The members of the team held a debriefing session to discuss their findings and seek consensus. First, the monthly narrative reports across intervention sites were summarized to qualitatively describe the type, breadth, and scope of activity for each intervention component. Second, the activity descriptions and open-ended interview responses were content analyzed32 and coded, and frequencies were generated. The goal of this analysis was to identify significant descriptions of which intervention elements worked well and which did not. Finally, intervention and control practices were compared with contingency tables, and a chi-square statistic was used to determine differences on questionnaire responses concerning practice changes over the period of the intervention.
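The contingency-table comparison described above can be sketched in a few lines. The cell counts below are hypothetical, chosen only to be consistent with the reported 71% versus 28% audit rates, since the paper reports percentages rather than raw cells; the function is a standard 2x2 chi-square with Yates continuity correction, not the authors' actual analysis code.

```python
import math

def chi2_2x2(a, b, c, d, yates=True):
    """Chi-square test for a 2x2 contingency table.
    Rows: intervention (a=yes, b=no), control (c=yes, d=no)."""
    n = a + b + c + d
    diff = abs(a * d - b * c)
    if yates:
        diff = max(diff - n / 2, 0)  # continuity correction
    chi2 = n * diff ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # With 1 degree of freedom, chi2 = Z^2, so the two-sided
    # p-value is P(|Z| > sqrt(chi2)) = erfc(sqrt(chi2 / 2)).
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Hypothetical cells consistent with 71% vs 28% reporting an audit:
chi2, p = chi2_2x2(33, 13, 13, 34)
print(f"chi2 = {chi2:.2f}, p = {p:.5f}")  # p < .001
```

With counts of this magnitude the test comfortably clears the df=1 critical value of 10.83 for P<.001, matching the level of significance reported for the audit comparison.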
To determine physician satisfaction, open-ended satisfaction survey responses were coded and frequencies generated for ratings of overall satisfaction with the performance of the PF and the intervention.
Results
PF Program Implementation
Table 3 shows the number of hours spent on project activities during the period of the intervention. The PFs spent the largest proportion of their time (28%) on administrative duties, such as team meetings, telephone calls, internal reporting, preparing the project newsletter, coordinating networking conferences for intervention practices, photocopying, and filing. Sixteen percent of the PFs’ time was spent on-site facilitating changes to improve preventive care in the practice. Travel accounted for an average of 12% of the PFs’ time, although this varied depending on the distance to the practices.
Table 4 provides information on the number of contacts and hours spent on-site for each component of the intervention. On average, each intervention practice was contacted 33 times by a PF, with each visit lasting an average of 1 hour and 45 minutes. The most frequent forms of contact concerned developing reminder systems, conducting chart audits and providing feedback on preventive care performance, and working to achieve consensus on the adoption of preventive care strategies. Both academic detailing to physicians and supplying critically appraised patient education materials averaged approximately 20 minutes but involved a great deal of preparation time. Few practices were interested in posters in the waiting room or a patient newsletter on prevention, so fewer contacts were made for those components.
Quality of Implementation
To assess quality, the frequency of each component of the intervention was tallied, physician feedback on the usefulness of intervention components was summarized, and self-reported practice changes were compared between intervention and control physicians.
Intervention Scope
Audit and Feedback. All 22 intervention practices received a presentation by the PF on the initial audit results to raise awareness of preventive care practice patterns. This was usually done in a kick-off meeting involving both physicians and nurses and often required more than one presentation to cover the various staff in the practice. Twenty practices requested subsequent analyses of data to follow their rates of performance. In addition, 18 practices requested audits of their charts for specific maneuvers, such as influenza vaccination and mammography.
Consensus Building. All practices were involved in meetings with the PF to identify opportunities for improvement, assess needs, and select priority areas and strategies for improving preventive care performance. Interviews were conducted with nurses and other staff to promote their role in preventive care delivery.
Academic Detailing. Twenty-one out of 22 sites agreed to receive and discuss critically appraised evidence for the preventive maneuvers under study, and some requested similar information on other preventive areas, such as cholesterol and osteoporosis.
Reminder Systems. All of the intervention sites implemented some form of reminder system. Eighteen sites implemented a preventive care flow sheet; 2 sites used a chart stamp; and 2 sites implemented a computerized reminder system. Nineteen sites developed recall initiatives for flu vaccine, mammography, and Papanicolaou tests. Seventeen sites implemented chart stickers for smoking counseling or mammography.
Opinion Leaders. All sites received copies of the PF project newsletter that contained articles by influential individuals describing the importance of preventive care and descriptions of colleagues’ preventive care implementation efforts. Most practices attended a workshop that included an influential keynote speaker, and 27% of the participating physicians shared their knowledge about preventive care through publishing in the newsletter and/or public speaking.
Patient Education. All sites were provided with patient education materials from credible sources on request, and all received a binder of patient education materials constructed specifically to contain materials on the appropriate preventive maneuvers under study. The binders were regularly updated.
Patient Mediated. Posters designed to prompt patients to ask about folic acid, flu vaccine, and mammography were offered to all sites. Thirteen sites implemented a patient consent form for prostate-specific antigen (PSA) testing. Eight sites received preventive care diaries for patients. Five sites had a prevention newsletter for patients. Four sites agreed to pilot a health risk appraisal software program.
Physician Feedback
At the end of the intervention the facilitator asked physicians about their experience, including what was most and least useful to them. Table 5 provides a summary of the content analysis of physician responses.
Audit and feedback, both initially and subsequently, comprised the component most frequently considered to be important in creating change. Almost as often, the preventive care flow sheet was identified as useful. The facilitator sessions designed to seek consensus on preventive care guidelines and strategies for implementation were also appreciated.
Several physicians did not agree with the evidence on PSA testing. Others did not feel that counseling for folic acid was a priority. Some found the patient education binder cumbersome, and others found the sticker system for tobacco counseling unwieldy. Thus, both were underused. Two physicians noted that the preventive care wall chart was not helpful.
Physician Self-Reported Practice Changes
Eighty-six percent (93/108) of the intervention and control physicians responded to a questionnaire at the end of the study. Due to sample size, statistical power was limited to detecting an absolute difference of approximately 0.30 between groups, assuming an alpha of 0.05 and 80% power.33 Table 6 shows that 71% of intervention physicians compared with 28% of control physicians reported an audit of their practice for preventive services (P<.001). By the end of the study, 65% of the intervention physicians versus 48% of the control physicians indicated that they had a prevention policy or screening protocol in place, and 70% of intervention physicians compared with 58% of control physicians had created reminder systems for disease prevention.
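The power statement above can be reproduced with the standard normal-approximation formula for detecting a difference between two proportions. The split of the 93 respondents into groups of roughly 46 each and the worst-case pooled proportion of 0.5 are assumptions for illustration, not figures from the paper.

```python
import math

Z_ALPHA = 1.96   # critical value for two-sided alpha = .05
Z_BETA = 0.8416  # critical value for 80% power

def detectable_difference(n_per_group, p_bar=0.5):
    """Smallest absolute difference between two proportions detectable
    with 80% power, using the normal approximation and a worst-case
    pooled proportion p_bar = 0.5."""
    se = math.sqrt(2 * p_bar * (1 - p_bar) / n_per_group)
    return (Z_ALPHA + Z_BETA) * se

# ~93 responding physicians split into two groups of roughly 46:
print(round(detectable_difference(46), 2))  # ≈ 0.29, i.e. about 0.30
```

Under these assumptions the minimum detectable absolute difference works out to roughly 0.29, consistent with the "approximately 0.30" the authors report; only the audit comparison (71% vs 28%) exceeds that threshold.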
Satisfaction with PF Intervention
At the telephone interview 6 months into the intervention, the mean satisfaction rating of intervention physicians was 4.08 on a scale of 1 (very dissatisfied) to 5 (very satisfied) with 80% satisfied with the intervention. At 17 months the mean satisfaction rating had risen to 4.5 with a 95% satisfaction rate.
At 6 months, 85% of the practices were satisfied with the frequency of visits of their assigned facilitator. At 17 months there was a 64% satisfaction rate, with the remaining 36% wanting more visits from the facilitator. The physicians commented on how the intervention had focused them on prevention in their practice. When the physicians were asked if they would agree to have a facilitator visit their practice in the future if given the opportunity, 90% agreed.
Concerns included not being able to continue patient recall after the end of the intervention and the view that the inappropriate maneuvers were too controversial. A physician from a large practice with 6 physicians commented that the facilitator could not easily work in the complex practice environment.
Discussion
Our study demonstrates that PFs can significantly improve the delivery of preventive services and, in the process, make valued contributions to the practice environment, with high satisfaction rates among participating physicians.
Our intervention involved more frequent and intensive visits than comparable studies. The PFs made an average of almost 2 visits per month, each lasting approximately 105 minutes. Dietrich and colleagues21 reported only 4 visits over a 3-month period lasting an average of 120 minutes, and Hulscher and coworkers22 reported approximately 25 visits with an average duration of 73 minutes. Other interventions involved even less frequent contact,20 and some studies did not report visit frequency.34-36
The critical intervention components as evidenced by physician feedback, changes between control and intervention practices, and the amount of facilitator time spent on each component were: (1) audit and feedback, (2) sharing and discussing information to build consensus on an action plan, and (3) a reminder system. Similarly, the Cancer Prevention in Community Practice Project achieved 100% success in implementing change using customized preventive care flowsheets.37 Of the 7 intervention components, patient education materials and patient-mediated interventions such as posters in the waiting room were considered to be the least useful.
Overall, physicians and nurses working within the practices were very satisfied with the intervention, and 90% were willing to have the nurse facilitator continue working with their practice.
Lessons learned from the process evaluation for improving the delivery of the outreach facilitation intervention include:
Focusing on the 3 key intervention components (audit and feedback, seeking consensus on an action plan, and implementing a reminder system) and tailoring these to the needs of the practice
Preparing patient education and patient-mediated materials only if the practice requests such materials
Developing simpler strategies to encourage physicians to counsel their patients who smoke to quit
Providing the facilitators with an administrative assistant to reduce the time they spend on administrative duties for the practices and increase their time on-site
Strengths
The strengths of the study include the completeness of the data set, the theoretical framework for data collection, the use of multiple data sources and data collection methods, and the prospective data collection methodology.
Limitations
There are several limitations to the process evaluation methods. Much of the data were provided by the facilitators themselves, and therefore the possibility of bias exists. The study population consisted of HSOs, and therefore the results may not be generalizable to practices with other reimbursement models. There is a possibility of social desirability bias in the satisfaction rates. Finally, our analyses of the process data were descriptive and exploratory.
Conclusions
Process evaluation often identifies future areas of research. Follow-up of the few practices that were dissatisfied with facilitation should be carried out to understand why they were dissatisfied. Sustainability needs to be addressed. For example, Dietrich and colleagues38 found that 5-year durability of a preventive services office system depended on the physician’s preventive care philosophy. McCowan and coworkers39 found that the effect of a facilitator was not sustained for 2 years. Finally, to maximize cost-effectiveness, more research is required to determine how much of a dose of facilitation is required and how frequently facilitators should visit to achieve a positive outcome.
Acknowledgments
We wish to acknowledge the financial support of the Ontario Ministry of Health, as well as the substantial contributions of the 3 nurse facilitators (Ingrid LeClaire, Ann MacLeod, and Ruth Blochlinger). We also wish to thank the many physicians and nurses who participated in the study.
METHODS: We used 5 data collection tools to evaluate the implementation of the intervention, and a combination of descriptive, quantitative, and qualitative analyses. Triangulation was used to attain a complete understanding of the quality of implementation. Twenty-two intervention practices with a total of 54 physicians participated in a randomized controlled trial that took place in Southwestern Ontario, Canada. The key measures of process were the frequency and time involved to deliver intervention components, the scope of the delivery and the utility of the components, and physician satisfaction with the intervention.
RESULTS: Of the 7 components in the intervention model, prevention facilitators (PFs) visited the practices most often to deliver the audit and feedback, consensus building, and reminder system components. All of the study practices received preventive performance audit and feedback, achieved consensus on a plan for improvement, and implemented a reminder system. Ninety percent of the practices implemented a customized flow sheet, and 10% used a computerized reminder system. Ninety-five percent of the intervention practices wanted critically appraised evidence for prevention, 82% participated in a workshop, and 100% received patient education materials in a binder. Content analysis of the physician interviews and bivariate analysis of physician self-reported changes between intervention and control group physicians revealed that the audit and feedback, consensus building, and development of reminder systems were the key intervention components. Ninety-five percent of the physicians were either satisfied or very satisfied with the intervention, and 90% would have been willing to have the PF continue working with their practice.
CONCLUSIONS: Primary care practices in Ontario can implement significant changes in their practice environments that will improve preventive care activity with the assistance of a facilitator. The main components for creating change are audit and feedback of preventive performance, achieving consensus on a plan for improvement, and implementing a reminder system.
A randomized controlled field trial of a multifaceted intervention to improve preventive care tailored to the needs of participating family practices was conducted in Southern Ontario and delivered by nurses trained in the facilitation of prevention. We focus on the process evaluation and complement the outcome evaluation1 by describing how the program was implemented in the intervention practices.
Improving preventive performance is both important and necessary. There is substantial room to improve the rates of appropriate preventive practice.2 The Canadian Task Force on the Periodic Health xamination3,4 has established guidelines for the delivery of preventive care that are supported by clinical evidence as effective in decreasing the impact of disease. However, evidence-based guidelines are not self-implementing.5-7 Changing physicians’ long-held patterns of behavior and the environments in which they work is complex and difficult. Unless the barriers to change can be overcome and actions taken to put preventive care guidelines into practice, evidence-based guideline development efforts will be wasted, and the quality of preventive care will not improve.8
Several reviews have focussed on the effectiveness of different interventions for implementing guidelines and improving care.6,7,9-13 Multifaceted interventions employing trained individuals who meet with providers in their practice settings to provide information and assist the practice in implementing evidence-based guidelines have been shown to be more effective than single interventions.11-14 Tailoring interventions to the requirements of the practice has also been proposed as important in supporting practice changes and in attaining more successful outcomes in preventive care performance compared with interventions that are fixed and lack this flexibility.15-17
As important as knowing what interventions work to improve preventive care performance is understanding why they work. The techniques of process evaluation allow the investigator to determine the extent to which the intervention designed to change practice patterns was actually implemented as planned. Adequate documentation of process facilitates replication and fine-tuning of the intervention.
Intervention Description
Our study built on the work of Fullard and colleagues18 and used a tailored multifaceted approach to getting evidence into action by focusing on the educational, attitudinal, and organizational barriers to change and tailoring interventions to the needs of the practice.17-24 The intervention employed 3 prevention facilitators (PFs) with both master’s degrees in community nursing and skills and previous experience in facilitation. Each PF had primary responsibility for up to 8 primary care practices with up to 6 physicians per practice.
The PFs underwent 30 weeks of intensive training before being assigned to randomly selected intervention practices. The training covered an orientation session, medical office computer systems, medical practice management, prevention in primary care, evidence-based medicine, and facilitation and audit skills development. Approximately 28 hours per week were spent in training and 7 hours per week in preparation and planning. Six of the 30 weeks of training were spent applying skills in a primary care office setting. Once in the field, they were instructed to offer 7 intervention strategies designed to change practice patterns and improve preventive care performance. The strategies were identified from reviews of the literature and constituted the multifaceted component of the intervention.10,11 The PFs were permitted to tailor these strategies to the needs and unique circumstances of the practice. The strategies were: (1) audit and ongoing feedback, (2) consensus building, (3) opinion leaders and networking, (4) academic detailing and education materials, (5) reminder systems, (6) patient-mediated activities, and (7) patient education materials.
The PFs worked with all physicians and allied health staff in the practice. They provided management support to practices and followed a quality improvement framework similar to that proposed by Leininger and coworkers.25 For each practice the PFs were to: (1) present baseline preventive performance rates, (2) facilitate the development of a practice policy for preventive care, (3) assist in setting goals and desirable levels of performance, (4) assist in the development of a written plan for implementing preventive care, (5) assist in the development and adaptation of tools and the strategies to implement the prevention plan, (6) facilitate meetings to assess progress and modify the plan if necessary, and (7) conduct chart audits to measure the impact of the changes made. The intervention period lasted 18 months and ended in December 1998.
The Figure is the program logic model describing each of the 7 intervention component strategies and the associated work activities, outputs, and short-term and long-term objectives associated with each component. It served as a framework for the evaluation of the intervention.26-28 The logic model allowed us to look inside the black box of the intervention29,30 by linking implementation activity to outcomes, and provided a framework to explore which elements worked and why.
Intervention Outcomes
The goal of the intervention was to increase the performance of 8 preventive maneuvers supported by evidence as appropriate and decrease the performance of 5 preventive maneuvers supported by evidence as inappropriate.1 An absolute change over time of 11.51% in preventive care performance in favor of intervention practices was found (F=19.29 [df=1,43], P<.0001. In other words, the intervention practices improved preventive performance by 36% going from 31% of eligible patients having received preventive care to 43% while the control practices remained at 32%.1
Methods
Research Questions
There were 2 objectives to our process evaluation: to document the extent to which the intervention was implemented with fidelity and to gain insight into how facilitation worked to improve preventive performance. The process evaluation was designed to answer questions concerning: (1) the time involved to deliver intervention components, (2) the quality of the delivery of intervention components, and (3) physician satisfaction with the intervention components. Quality was assessed by examining the scope or range of delivery of the intervention components and by analyzing the feedback received from practices on the usefulness of the intervention components.
Setting
The intervention arm of the trial included 22 practices with 54 physicians (Table 1). All health service organizations (HSOs) in Southwestern Ontario were approached to participate in the study. HSOs are primary care practices reimbursed primarily through capitation rather than fee for service. A total of 46 of the 100 primary care practices were recruited (response rate=46%). At follow-up only one intervention practice was lost, because the entire practice had moved. Intervention and control group practices did not differ significantly on any of the measured demographic characteristics (Table 2). Complete details on practice recruitment and attrition rates are published elsewhere.1
The practices covered a geographic area where the greatest distance between any 2 practices was more than 600 kilometers. PFs were assigned practices within a specific region of this geographic area. They arranged times to visit and work with intervention practices and traveled by car between visits. PFs worked independently from their residences, corresponded with the project team regularly through electronic mail, and met with the team quarterly at scheduled meetings.
Data Collection Tools
Each intervention practice was visited regularly by the same nurse facilitator, who documented her activities and progress on 2 structured forms known as the weekly activity sheet and the monthly narrative report. Weekly activity sheets noted the number of hours spent on both on-site and off-site activities. Monthly narrative reports provided detailed information on the number of visits to a practice, the activities within each practice, the outcomes of those activities, the number of participants in meetings, and the plan for the following month. The activities in the narrative reports were summarized by intervention component to provide a cumulative overview of all intervention activity within a practice.
Also during the intervention, semistructured telephone interviews of participating physicians were conducted by 2 physician members of the project team at 6 months and 17 months. Participating physicians were asked what they had been happy and unhappy with and their ideas for improvement. Closed-ended questions measured overall satisfaction with the intervention. The interview at 17 months also asked physicians whether they would agree to have a nurse facilitator continue to visit their practice if funding were found.
At the end of the intervention, the PFs conducted interviews with each of the physicians identified as the primary contact in the intervention practices to solicit feedback on their experience. Physicians in both the intervention and control arm were sent a questionnaire by mail to report any changes that had taken place in their practice over the preceding 18 months.
Analysis
Data were analyzed to address the 3 research questions for the process evaluation utilizing the Logic Model as the conceptual framework (Figure). To determine how often various intervention components were delivered, the total hours spent at each practice and the total number of contacts with each practice by intervention component were calculated from the PF activity sheets.
To determine intervention quality, triangulation31 was used to attain a complete understanding of the quality of implementation. Multiple data sources and analysis methods were used to reveal the underlying dimensions of quality. All data sources were reviewed and analyses were conducted independently by 2 members of the investigation team. The members of the team held a debriefing session to discuss their findings and seek consensus. First, the monthly narrative reports across intervention sites were summarized to qualitatively describe the type, breadth, and scope of activity for each intervention component. Second, the activity descriptions and open-ended interview responses were content analyzed32 and coded, and frequencies were generated. The goal of this analysis was to identify significant descriptions of which intervention elements worked well and which did not. Finally, intervention and control practices were compared with contingency tables, and a chi-square statistic was used to determine differences on questionnaire responses concerning practice changes over the period of the intervention.
To determine physician satisfaction, open-ended satisfaction survey responses were coded and frequencies generated for ratings of overall satisfaction with the performance of the PF and the intervention.
Results
PF Program Implementation
Table 3 shows the number of hours spent on project activities during the period of the intervention. The PFs spent the largest proportion of their time (28%) on administrative duties, such as team meetings, telephone calls, internal reporting, preparing the project newsletter, coordinating networking conferences for intervention practices, photocopying, and filing. Sixteen percent of the PFs’ time was spent on-site facilitating changes to improve preventive care in the practice. Travel accounted for an average of 12% of the PFs’ time, although this varied depending on the distance to the practices.
Table 4 provides information on the number of contacts and hours spent on-site for each component of the intervention. On average, each intervention practice was contacted 33 times by a PF, with each visit lasting an average of 1 hour and 45 minutes. The most frequent forms of contact concerned developing reminder systems, conducting chart audits and providing feedback on preventive care performance, and working to achieve consensus on the adoption of preventive care strategies. Both academic detailing to physicians and supplying critically appraised patient education materials averaged approximately 20 minutes but involved a great deal of preparation time. Few practices were interested in posters in the waiting room or a patient newsletter on prevention, so fewer contacts were made for those components.
Quality of Implementation
To assess quality, the frequency of each component of the intervention was tallied, physician feedback on the usefulness of intervention components was summarized, and self-reported practice changes were compared between intervention and control physicians.
Intervention Scope
Audit and Feedback. All 22 intervention practices received a presentation by the PF on the initial audit results to raise awareness of preventive care practice patterns. This was usually done in a kick-off meeting involving both physicians and nurses and often required more than one presentation to cover the various staff in the practice. Twenty practices requested subsequent analyses of data to follow their rates of performance. In addition, 18 practices requested audits of their charts for specific maneuvers, such as influenza vaccination and mammography.
Consensus Building. All practices were involved in meetings with the PF to identify opportunities for improvement, assess needs, and select priority areas and strategies for improving preventive care performance. Interviews were conducted with nurses and other staff to promote their role in preventive care delivery.
Academic Detailing. Twenty-one out of 22 sites agreed to receive and discuss critically appraised evidence for the preventive maneuvers under study, and some requested similar information on other preventive areas, such as cholesterol and osteoporosis.
Reminder Systems. All of the intervention sites implemented some form of reminder system. Eighteen sites implemented a preventive care flow sheet; 2 sites used a chart stamp; and 2 sites implemented a computerized reminder system. Nineteen sites developed recall initiatives for flu vaccine, mammography, and Papanicolaou tests. Seventeen sites implemented chart stickers for smoking counseling or mammography.
Opinion Leaders. All sites received copies of the PF project newsletter that contained articles by influential individuals describing the importance of preventive care and descriptions of colleagues’ preventive care implementation efforts. Most practices attended a workshop that included an influential keynote speaker, and 27% of the participating physicians shared their knowledge about preventive care through publishing in the newsletter and/or public speaking.
Patient Education. All sites were provided with patient education materials from credible sources on request, and all received a binder of patient education materials constructed specifically to contain materials on the appropriate preventive maneuvers under study. The binders were regularly updated.
Patient Mediated. Posters designed to prompt patients to ask about folic acid, flu vaccine, and mammography were offered to all sites. Thirteen sites implemented a patient consent form for prostate-specific antigen (PSA) testing. Eight sites received preventive care diaries for patients. Five sites had a prevention newsletter for patients. Four sites agreed to pilot a health risk appraisal software program.
Physician Feedback
At the end of the intervention the facilitator asked physicians about their experience, including what was most and least useful to them. Table 5 provides a summary of the content analysis of physician responses.
Audit and feedback, both initially and subsequently, comprised the component most frequently considered to be important in creating change. Almost as often, the preventive care flow sheet was identified as useful. The facilitator sessions designed to seek consensus on preventive care guidelines and strategies for implementation were also appreciated.
Several physicians did not agree with the evidence on PSA testing. Others did not feel that counseling for folic acid was a priority. Some found the patient education binder cumbersome, and others found the sticker system for tobacco counseling unwieldy. Thus, both were underused. Two physicians noted that the preventive care wall chart was not helpful.
Physician Self-Reported Practice Changes
Eighty-six percent (93/108) of the intervention and control physicians responded to a questionnaire at the end of the study. Due to sample size, statistical power was limited to detecting an absolute difference of approximately 0.30 between groups, assuming an alpha of 0.05 and 80% power.33 Table 6 shows that 71% of intervention physicians compared with 28% of control physicians reported an audit of their practice for preventive services (P<.001). By the end of the study, 65% of the intervention physicians versus 48% of the control physicians indicated that they had a prevention policy or screening protocol in place, and 70% of intervention physicians compared with 58% of control physicians had created reminder systems for disease prevention.
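The chi-square comparison behind Table 6 can be sketched for the audit question. The counts below are hypothetical (the text reports only the percentages, 71% vs. 28%, and the overall 93 responders, not the per-group denominators); the Pearson statistic for a 2x2 table is computed directly.

```python
# Pearson chi-square for a 2x2 contingency table, without continuity correction.
# The counts are hypothetical: the paper reports only percentages (71% vs. 28%)
# and 93 total responders, not the per-group denominators.
def chi_square_2x2(a, b, c, d):
    """Statistic for table [[a, b], [c, d]]: n*(ad-bc)^2 / product of margins."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Rows: intervention, control; columns: reported an audit, did not.
intervention = (34, 14)  # ~71% of a hypothetical 48 intervention responders
control = (13, 32)       # ~29% of a hypothetical 45 control responders

stat = chi_square_2x2(*intervention, *control)
print(f"chi-square = {stat:.1f}")  # exceeds 10.83, the df=1 critical value at P=.001
```

With counts of roughly this size, the statistic comfortably clears the P<.001 threshold, consistent with the reported significance.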
Satisfaction with PF Intervention
At the telephone interview 6 months into the intervention, the mean satisfaction rating of intervention physicians was 4.08 on a scale of 1 (very dissatisfied) to 5 (very satisfied) with 80% satisfied with the intervention. At 17 months the mean satisfaction rating had risen to 4.5 with a 95% satisfaction rate.
At 6 months, 85% of the practices were satisfied with the frequency of visits of their assigned facilitator. At 17 months there was a 64% satisfaction rate, with the remaining 36% wanting more visits from the facilitator. The physicians commented on how the intervention had focused them on prevention in their practice. When the physicians were asked if they would agree to have a facilitator visit their practice in the future if given the opportunity, 90% agreed.
Concerns included not being able to continue patient recall after the end of the intervention and the view that the inappropriate maneuvers were too controversial. A physician from a large practice with 6 physicians commented that the facilitator could not easily work in the complex practice environment.
Discussion
Our study demonstrates that PFs can significantly improve the delivery of preventive services and in the process make quality contributions to a practice environment with high satisfaction rates from participating physicians.
Our intervention had a higher frequency and intensity of visits than other studies of this genre. The PFs made an average of almost 2 visits per month, lasting approximately 105 minutes per visit. Dietrich and colleagues21 reported only 4 visits over a 3-month period lasting an average of 120 minutes, and Hulscher and coworkers22 reported approximately 25 visits with an average duration of 73 minutes. Other interventions involved even less frequent contact,20 and in several studies visit frequency was not reported.34-36
The critical intervention components as evidenced by physician feedback, changes between control and intervention practices, and the amount of facilitator time spent on each component were: (1) audit and feedback, (2) sharing and discussing information to build consensus on an action plan, and (3) a reminder system. Similarly, the Cancer Prevention in Community Practice Project achieved 100% success in implementing change using customized preventive care flowsheets.37 Of the 7 intervention components, patient education materials and patient-mediated interventions such as posters in the waiting room were considered to be the least useful.
Overall, physicians and nurses working within the practices were very satisfied with the intervention, and 90% were willing to have the nurse facilitator continue working with their practice.
Lessons learned from the process evaluation for improving the delivery of the outreach facilitation intervention include:
Focusing on the 3 key intervention components (audit and feedback, seeking consensus on an action plan, and implementing a reminder system) and tailoring these to the needs of the practice
Preparing patient education and patient-mediated materials only if the practice requests such materials
Developing simpler strategies to encourage physicians to counsel patients who smoke to quit
Providing the facilitators with an administrative assistant to reduce the amount of time spent on administrative duties for the practices and increase time on-site
Strengths
The strengths of the study include the completeness of the data set, the theoretical framework for data collection, the use of multiple data sources and data collection methods, and the prospective data collection methodology.
Limitations
There are several limitations to the process evaluation methods. Much of the data were provided by the facilitators themselves, and therefore the possibility of bias exists. The study population consisted of HSOs, and therefore the results may not be generalizable. There is a possibility of social desirability bias in the satisfaction rates. Finally, our analyses of the process data were descriptive and exploratory.
Conclusions
Process evaluation often identifies future areas of research. Follow-up of the few practices that were dissatisfied with facilitation should be carried out to understand why they were dissatisfied. Sustainability needs to be addressed. For example, Dietrich and colleagues38 found that 5-year durability of a preventive services office system depended on the physician’s preventive care philosophy. McCowan and coworkers39 found that the effect of a facilitator was not sustained for 2 years. Finally, to maximize cost-effectiveness, more research is required to determine how much of a dose of facilitation is required and how frequently facilitators should visit to achieve a positive outcome.
Acknowledgments
We wish to acknowledge the financial support of the Ontario Ministry of Health, as well as the substantial contributions of the 3 nurse facilitators (Ingrid LeClaire, Ann MacLeod, and Ruth Blochlinger). We also wish to thank the many physicians and nurses who participated in the study.
1. Lemelin J, Hogg W, Baskerville B. Evidence to action: a tailored multi-faceted approach to changing family physician practice patterns and improving preventive care. CMAJ. In press.
2. Hutchison B, Woodward CA, Norman GR, Abelson J, Brown JA. Provision of preventive care to unannounced standardized patients. CMAJ 1998;158:185-93.
3. Spitzer WO. The Canadian Task Force on the Periodic Health Examination: The Periodic Examination. CMAJ 1979;121:1193-254.
4. Canadian Task Force on the Periodic Health Examination: the Canadian guide to clinical preventive health care. Ottawa, Canada: Health Canada; 1994.
5. Tamblyn R, Battista RN. Changing clinical practice: what interventions work? J Cont Edu Health Prof 1993;13:273-88.
6. Davis DA, Thompson MA, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA 1995;274:700-05.
7. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. The Cochrane Effective Practice and Organization of Care Review Group. BMJ 1998;317:465-68.
8. The University of York. Effective health care: getting evidence into practice. NHS Centre for Reviews and Dissemination 1999;5:1-16.
9. Lomas J, Haynes RB. A taxonomy and critical review of tested strategies for the application of clinical practice recommendations: from official to individual clinical policy. Am J Prev Med 1988;4(suppl):77-94.
10. Oxman AD, Thomson MA, Davis DA, Haynes B. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ 1995;153:1423-52.
11. Wensing M, Grol R. Single and combined strategies for implementing changes in primary care: a literature review. Int J Qual Health Care 1994;6:115-32.
12. Wensing M, van der Weijden T, Grol R. Implementing guidelines and innovations in general practice: which interventions are effective? Br J Gen Pract 1998;48:991-97.
13. Hulscher MEJL, Wensing M, Grol R, Weijden T, van Weel C. Interventions to improve the delivery of preventive services in primary care. Am J Public Health 1999;89:737-46.
14. Thomson MA, Oxman AD, Davis DA, Haynes RB, Freemantle N, Harvey E. Educational outreach visits: effects on professional practice and health care outcomes (Cochrane review). In: The Cochrane library. Oxford, England: Update Software; 1999.
15. Cohen SJ, Halvorson HW, Gosselink CA. Changing physician behavior to improve disease prevention. Prev Med 1994;23:284-91.
16. Main DS, Cohen SJ, DiClemente CC. Measuring physician readiness to change cancer screening: preliminary results. Am J Prev Med 1995;11:54-58.
17. Hulscher MEJL, Van Drenth BB, Mokkink HGA, et al. Tailored outreach visits as a method for implementing guidelines and improving preventive care. Int J Qual Health Care 1998;10:105-12.
18. Fullard E, Fowler G, Gray J. Facilitating prevention in primary care. BMJ 1984;289:1585-87.
19. Fullard E, Fowler G, Gray J. Facilitating prevention in primary care: a controlled trial of a low technology, low cost approach. BMJ 1987;294:1080-82.
20. Kottke TE, Solberg LI, Brekke ML. A controlled trial to integrate smoking cessation advice into primary care: doctors helping smokers, round III. J Fam Pract 1992;34:701-08.
21. Dietrich AJ, O’Connor GT, Keller A, Karney PA, Levy D, Whaley F. Cancer: improving early detection and prevention: a community practice randomised trial. BMJ 1992;304:687-91.
22. Hulscher M, Van Drenth B, van de Wouden J, Mokkink H, van Weel C, Grol R. Changing preventive practice: a controlled trial on the effects of outreach visits to organise prevention of cardiovascular disease. Qual Health Care 1997;6:19-24.
23. Dietrich AJ, Tobin JN, Sox CH, et al. Cancer early-detection services in community health centers for the underserved: a randomized controlled trial. Arch Fam Med 1998;7:320-27.
24. Dietrich AJ, Sox CH, Tosteson TD, Woodruff CB. Durability of improved physician early detection of cancer after conclusion of intervention support. Cancer Epidemiol Biomarkers Prev 1994;3:335-40.
25. Leininger LS, Leonard F, Larry D, et al. An Office system for organizing preventive services: a report by the American Cancer Society Advisory Group on Preventive Health Care Reminder Systems. Arch Fam Med 1996;5:108-15.
26. Rush B, Ogborne A. Program logic models: expanding their role and structure for program planning and evaluation. Can J Prog Eval 1991;6:96-106.
27. Wong-Reiger D, David L. Using program logic models to plan and evaluate education and prevention programs. Arnold Love, ed. Canadian Evaluation Society; 1995.
28. Kanouse D, Kallich J, Kahan J. Dissemination of effectiveness and outcomes research. Health Policy 1995;34:167-92.
29. Chen HT, Rossi PH. Evaluating with sense: the theory-driven approach. Eval Rev 1983;7:283.
30. Stange KC, Zyzanski SJ, Jáen CR, et al. Illuminating the “black box”: a description of 4454 patient visits to 138 family physicians. J Fam Pract 1998;46:377-89.
31. Fielding N, Fielding J. Linking data. Beverly Hills, Calif: Sage Publications Inc; 1986.
32. Weber RP. Basic content analysis. 7-049 ed. Newbury Park, Calif: Sage Publications Inc; 1985.
33. Fleiss JL. Statistical methods for rates and proportions. 2nd ed. New York, NY: John Wiley & Sons; 1981.
34. Cockburn J, Ruth D, Silagy C, et al. Randomized trial of three approaches for marketing smoking cessation programmes to Australian general practitioners. BMJ 1992;304:691-94.
35. Manfredi C, Czaja R, Freels S, Trubitt M, Warnecke R, Lacey L. Prescribe for health: improving cancer screening in physician practices serving low-income and minority populations. Arch Fam Med 1998;7:329-37.
36. Kinsinger LS, Harris R, Qaqish B, Strecher V, Kaluzny A. Using an office system intervention to increase breast cancer screening. JGIM 1998;13:507-14.
37. Carney PA, Dietrich AJ, Keller A, Landgraf J, O’Connor GT. Tools, teamwork and tenacity: an office system for cancer prevention. J Fam Pract 1992;35:388-94.
38. Rebelsky M, Sox CH, Dietrich AJ, Schwab BR, Labaree CE, Brown-Mckinney N. Physician preventive care philosophy and the five year durability of a preventive services office system. Soc Sci Med 1996;43:1073-81.
39. McCowan C, Neville RG, Crombie IK, Clark RA, Warner FC. The facilitator effect: results from a four-year follow-up of children with asthma. Br J Gen Pract 1997;47:156-60.