Like other adult learners, physicians will seek and retain new knowledge only when motivated to do so (ie, when they have the need to know). As a result, efforts to increase clinicians’ use of the best information at the point of care must focus on providing them with well-validated evidence showing a direct and relevant benefit to their patients (eg, Patient-Oriented Evidence that Matters [POEMs] reviews1).
Previously,2 we described our efforts to identify the relatively few research findings in the medical literature that provide both relevant and valid new information for practicing clinicians. Of 8085 articles published in 85 medical journals over a 6-month period, only 2.6% (211) met these criteria.
These 211 research articles were summarized in issues of the newsletter Evidence-Based Practice and are incorporated into InfoRetriever, an electronic database using POEMs to improve information access at the point of care.3 Other services, such as Journal Watch and Best Evidence,4 provide similar reviews of the recent medical literature. However, Journal Watch has no published criteria explaining how articles are chosen for inclusion5 or how the validity of the information is determined. Best Evidence focuses primarily on the validity of research, and its criteria for relevance are not clearly defined.6 This valuing of rigor over relevance may lead to providing clinicians with information they do not really need or omitting important information that they do need. In this exploratory study we aimed to determine how much content overlap exists between Evidence-Based Practice and Best Evidence.
Methods
To evaluate the differences between Best Evidence and Evidence-Based Practice, we compared the articles in ACP Journal Club and the discontinued Evidence-Based Medicine, which are now combined into Best Evidence, with those summarized in Evidence-Based Practice. We chose for comparison the 5 issues of Evidence-Based Practice published between January and May of 1998. Because the time to publication for ACP Journal Club and Evidence-Based Medicine is longer than that for Evidence-Based Practice, we used for comparison 6 bimonthly issues of both ACP Journal Club and Evidence-Based Medicine, starting with the November-December 1997 issues and ending with the November-December 1998 issues.
Results
Over a 5-month period, 85 POEMs were published in Evidence-Based Practice. There was little overlap among the 3 publications: only 11 (12.9%) of these POEMs were also published in either ACP Journal Club or Evidence-Based Medicine. For the comparison in the other direction, 3 bimonthly issues of Evidence-Based Medicine and ACP Journal Club were chosen and compared with all issues of Evidence-Based Practice. The results are summarized in the Table. A total of 109 synopses were published in the 2 Best Evidence publications during this time. Most of these synopses (n=82, 75.2%) were not considered POEMs and were not published in Evidence-Based Practice. Of the 49 distinct articles (33 articles were reviewed in both publications) found in these publications but not selected for Evidence-Based Practice, 22 (45%) studied interventions or diseases not relevant to family practice, 15 (31%) would not induce a change in practice, 8 (16%) were in journals not covered by Evidence-Based Practice (only 1 of these articles was a POEM), 3 (6%) evaluated disease-oriented outcomes, and 1 article (2%) was a POEM that had been earmarked for inclusion in Evidence-Based Practice but was lost in transmission.
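The percentages above can be checked with a brief tally. This is an illustrative recomputation of the figures reported in this paragraph, not part of the original analysis:

```python
# Recompute the overlap percentages reported in the Results
# (counts taken directly from the text; for illustration only).

poems_in_ebp = 85   # POEMs published in Evidence-Based Practice
overlap = 11        # also in ACP Journal Club or Evidence-Based Medicine
print(f"Overlap: {overlap / poems_in_ebp:.1%}")  # Overlap: 12.9%

synopses_in_be = 109  # synopses in the 2 Best Evidence publications
non_poems = 82        # not considered POEMs
print(f"Not POEMs: {non_poems / synopses_in_be:.1%}")  # Not POEMs: 75.2%

# Reasons the 49 distinct articles were not selected for Evidence-Based Practice
reasons = {
    "not relevant to family practice": 22,
    "would not change practice": 15,
    "journal not covered": 8,
    "disease-oriented outcomes": 3,
    "lost in transmission": 1,
}
assert sum(reasons.values()) == 49
for reason, n in reasons.items():
    print(f"{reason}: {n} ({n / 49:.0%})")
```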
Discussion
This small informal study shows the marked difference between Evidence-Based Practice and the content of Best Evidence. Readers of only Best Evidence would miss a significant amount of high-quality information directly applicable to primary care practice. Although Evidence-Based Practice and Best Evidence use essentially the same validity criteria6,7 to screen preliminary research results, the key difference between them resides in the relevance of the information each source chooses to present. Evidence-Based Practice focuses on patient-oriented evidence that matters, which is information that must pass 3 relevance criteria: (1) the affected outcome must be one that patients care about (ie, not disease-oriented outcomes); (2) the proposed intervention must be feasible and deal with a common problem; and (3) the results being presented should require a change in practice on the part of most clinicians.
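The 3 relevance criteria amount to a screen applied to each candidate article, with an article qualifying as a POEM only if it passes all three. A hypothetical sketch (the field names and example are ours, not part of either publication's actual selection workflow):

```python
from dataclasses import dataclass

@dataclass
class Article:
    patient_oriented_outcome: bool  # outcome patients care about, not disease-oriented
    common_and_feasible: bool       # common problem, feasible intervention
    would_change_practice: bool     # would require most clinicians to change practice

def is_poem(a: Article) -> bool:
    """An article passes the relevance screen only if all 3 criteria hold."""
    return (a.patient_oriented_outcome
            and a.common_and_feasible
            and a.would_change_practice)

# A trial reporting only a surrogate (disease-oriented) outcome fails the screen:
trial = Article(patient_oriented_outcome=False,
                common_and_feasible=True,
                would_change_practice=True)
print(is_poem(trial))  # False
```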
The concepts embodied in evidence-based medicine have been described as the long-awaited bridge between research and clinical practice. Although the techniques of evidence-based medicine have greatly enhanced and simplified the evaluation of the validity of clinical research, they are not practical for meeting the day-to-day needs of busy, real-life clinicians. This method is problem-driven; the search for information begins with the generation of a specific patient-based question. However, primary care clinicians are usually in a more general keeping-up mode, in which foraging for information is just as important as hunting for answers to patients' specific questions.8
The traditional evidence-based medicine approach, although attractive to academics, has not been widely embraced by clinicians because it focuses on identifying and validating information communicated by the written word, making it unrealistic and too time-consuming for most clinicians. This approach of rigor over relevance is rooted deep in the foundation of pedagogy,9 but is less valid when applied to adult learners.
Two specific tools are needed to help physicians efficiently identify information that is highly relevant and valid. Clinicians need a first-alert method—a POEM bulletin board—for relevant new information as it becomes available. Resources—newsletters, Web sites, continuing education, and others—used by clinicians to update their knowledge should carefully filter out preliminary or unverified information so that this keeping-up process is efficient.
Clinicians also need a way of rapidly retrieving the information to which they have been alerted but that has not yet been cemented into their minds. Computer-based resources (especially handheld portable devices) are available that can provide information in less than 30 seconds. To be lifelong learners, physicians have to use tools that help them to hunt and forage through the jungle of information.
Irrelevant information, even if highly valid, is not useful in the scope of a busy daily practice. Sifting through valid, but irrelevant, information wastes valuable information-gathering time. To be relevant and complete, a comprehensive foraging source for new information must contain specialty-specific POEMs (a POEM alert system). The one-stop shopping approach of information for all specialties offered by Best Evidence does not meet this need.
1. Slawson DC, Shaughnessy AF, Bennett JH. Becoming a medical information master: feeling good about not knowing everything. J Fam Pract 1994;38:505-13.
2. Ebell MH, Barry HC, Slawson DC, Shaughnessy AF. Finding POEMs in the medical literature. J Fam Pract 1999;48:350-55.
3. Accessed August 22, 2000.
4. Accessed August 22, 2000.
5. Accessed August 22, 2000.
6. Accessed August 22, 2000.
7. Accessed August 22, 2000.
8. Shaughnessy AF, Slawson DC. Are we providing doctors with the training and tools for lifelong learning? BMJ (www.bmj.com/cgi/content/full/319/7220/1280).
9. Schön DA. The reflective practitioner: how professionals think in action. New York, NY: Basic Books; 1983.