Mad Cow disease: Dealing sensibly with a new concern
After a period out of the spotlight, Mad Cow disease is again causing a stir. Following the first documented case in this country on December 23, 2003,1 the US government is instituting new preventive measures, and patients may be asking for assurances of safety (see “What to advise patients” below).
Mad Cow’s connection to humans: vCJD
Mad Cow disease is the bovine form of transmissible spongiform encephalopathy (TSE), a disease that can also affect sheep, deer, goats, and humans (Table 1). The causative agent is thought to be an infective protein called a prion, discovered in 1997.
Bovine spongiform encephalopathy (BSE) was first identified in the United Kingdom in 1986 and caused a large outbreak in cattle, which peaked in 1993. Subsequently, it was discovered that BSE could rarely spread to humans, causing a variant of Creutzfeldt-Jakob disease (vCJD) that is universally fatal. As of December 2003, 153 cases of vCJD had been reported worldwide, most in the UK. Confirmation of either the classic or variant form requires pathology examination of brain tissue collected by biopsy or, if a patient has died, at autopsy.2
TABLE 1
Transmissible spongiform encephalopathies
Species affected | Prion disease | Transmissible to humans? |
---|---|---|
Mink | Transmissible mink encephalopathy | No |
Sheep and goats | Scrapie | Historically no; questionable in newly discovered atypical cases |
Deer and elk | Chronic wasting disease | Possible (under investigation) |
Cattle and bison | Bovine spongiform encephalopathy | Yes (variant CJD) |
Humans | Creutzfeldt-Jakob disease, variant CJD, Gerstmann-Straussler-Scheinker disease, kuru, fatal familial insomnia | Through contaminated medical products, instruments, possibly blood |
Uniqueness of vCJD
In medical school, family physicians learned about classic CJD, which is endemic throughout the world and, in the US, causes an average of 1 death per million people per year. The epidemiologies of vCJD and classic CJD are quite different (Table 2). Because vCJD is a new disease, its incubation period is unknown, but it is likely to be years or decades. In the UK, the interval between exposure to BSE-contaminated food in 1984–1986 and the onset of vCJD cases in 1994–1996 is thought to be consistent with such a long incubation period.
Since 1986, BSE has been identified in 20 European countries, Japan, Israel, Canada, and now the US. The main method of its spread through herds is believed to be the former practice of feeding cattle the meat and bone meal products that, at some point, were contaminated with BSE. In 1997, the US and Canada prohibited the feeding of ruminant meat and bone meal to other ruminants. It is thought that most cases of vCJD are transmitted to people when they eat beef products containing brain or spinal cord material contaminated with BSE.
Neuropathology. Variant CJD deposits plaques, vacuoles, and prion protein in the brain. To date, all persons with vCJD have had methionine homozygosity at the polymorphic codon 129 of the prion protein gene, suggesting that persons not carrying this genotype (who make up about 60% of the population) have increased resistance to the disease. In addition, vCJD and BSE are both dose-dependent infections, so both genetics and exposure may explain why so few human cases have occurred despite the widespread outbreak of BSE in the UK.
TABLE 2
Characteristics distinguishing vCJD from CJD
Characteristic | UK vCJD | US classic CJD |
---|---|---|
Median age at death | 28 (range, 14–74) | 68 (range, 23–97)* |
Median illness duration (mo) | 13–14 | 4–5 |
Clinical presentation | Prominent psychiatric/behavioral symptoms; delayed neurologic signs | Dementia; early neurologic signs |
Periodic sharp waves on EEG | Absent | Often present |
“Pulvinar sign” on MRI† | Present in >75% of cases | Not reported |
Presence of “florid plaques” on neuropathology | Present in great numbers | Rare or absent |
Immunohistochemical analysis of brain tissue | Marked accumulation of PrPres | Variable accumulation |
Presence of agent in lymphoid tissue | Readily detected | Not readily detected |
Increased glycoform ratio on immunoblot analysis of PrPres | Present | Not present |
Genotype at codon 129 of prion protein | Methionine/Methionine | Polymorphic |
*Surveillance data 1997–2001.
† High signal in the posterior thalamus.
CJD, Creutzfeldt-Jakob disease; vCJD, variant CJD; EEG, electroencephalogram; MRI, magnetic resonance imaging; PrPres, protease-resistant prion protein.
Source: Centers for Disease Control and Prevention, MMWR Morb Mortal Wkly Rep 2004;52:1280–1285.1
Prevention measures have been updated
Before December 30, 2003, the measures in place to prevent BSE in this country were the following:
- Import restrictions on bovine-derived consumer products from high-risk BSE countries (initiated in 1989).
- Prohibition of the use of ruminant derived meat and bone meal in cattle feed (initiated in 1997).
- A surveillance system for BSE that involved annual testing of between 5000 and 20,000 cattle slaughtered for human consumption (out of about 35 million cattle slaughtered per year).
Since December 30, 2003, the US Department of Agriculture (USDA) and Food and Drug Administration (FDA) have added or proposed a number of additional provisions to prevent BSE:
- Defining high-risk materials banned for human consumption, including the entire vertebral column.
- Banning the use of advanced meat recovery systems on vertebral columns. These systems, which use brushes and air to blast soft tissue off bone, have left up to 30% of sampled hamburger contaminated with central nervous system tissue.
- Proposing an expanded annual surveillance to include about 200,000 high-risk cattle (sick, suspect, dead) and a random sample of 20,000 normal cattle over 30 months old.
But are these measures enough?
Concerns about these new measures center on the surveillance program. First, how long will it take the USDA to expand its testing? Second, will even this expanded testing be sufficient? Some scientists and consumer advocates propose adopting the policy of the European Union, which is to test all cattle over 30 months of age, since this age group can harbor BSE without being ill.
Other congressional proposals include banning all high-risk meat products from all animal feeds and cosmetics, and creating a prion disease task force to coordinate surveillance and research for all prion diseases. Unfortunately, because we have been testing so few cattle for BSE, we don’t really know whether there are more infected cattle in our food system. Interestingly, in Japan, where all cattle are tested for BSE after slaughter, 10 more infected animals were discovered, most of which lacked the characteristics that would put them at high risk.3
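How few is “so few”? A back-of-the-envelope comparison makes the scale concrete. This is a minimal sketch in Python, using only the figures quoted in the surveillance bullets above; the variable names are ours, and because the high-risk animals in the proposed program are not all drawn from the slaughter stream, the fractions are order-of-magnitude estimates only.

```python
# Rough comparison of BSE testing coverage, using the figures quoted above:
# 5000-20,000 cattle tested annually (pre-2004) out of roughly 35 million
# slaughtered per year, versus a proposed 200,000 high-risk animals plus a
# random sample of 20,000 older cattle. Illustrative only.
slaughtered_per_year = 35_000_000

old_tested = 20_000            # upper end of the 5000-20,000 range
new_tested = 200_000 + 20_000  # proposed high-risk animals + random sample

print(f"Pre-2004 program:   {old_tested / slaughtered_per_year:.3%} of annual slaughter")
print(f"Proposed expansion: {new_tested / slaughtered_per_year:.3%} of annual slaughter")
# Pre-2004 program:   0.057% of annual slaughter
# Proposed expansion: 0.629% of annual slaughter
```

Even the proposed expansion would test well under 1% of the cattle slaughtered each year, which is why some critics point to the European and Japanese approaches of testing all older or all slaughtered cattle.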
To date, the beef industry has supported the changes already put into effect, but not the additional ones noted above. Ironically, a number of small, upscale slaughterhouses have proposed testing all cattle they slaughter (mostly under 30 months old) so they may resume sales to Japan. The USDA has turned down their requests for the chemical reagents to run the BSE tests (the agency controls the sale of these kits), citing its concern that testing all cattle would give the impression it is necessary for the entire US herd—a proposition the USDA and many scientists believe is unnecessary. Thus, the controversy over BSE surveillance has now become an economic, political, and scientific issue.
What to advise patients
- The risk of contracting vCJD from eating contaminated beef is extremely small.4
- There has yet to be a case of BSE found in any native-born US cattle.
- There is no association between BSE and milk or milk products.
- If traveling to countries where BSE is endemic (eg, the UK and Portugal), patients may avoid beef altogether or limit consumption to whole cuts, not ground beef or sausage.5
- Avoid bovine-derived nutritional supplements, especially those containing bovine pituitary, thyroid, adrenal, thymus, or other organ tissue.
- Avoid products containing bovine meat or bone meal, such as some types of garden fertilizers.
Deer and elk can develop chronic wasting disease (CWD), another form of TSE. States that have recorded CWD cases include Colorado, Illinois, Wisconsin, and Wyoming. CWD is not known to cause disease in humans, but the risk to hunters and those who eat the meat is unknown. Physicians may want to advise hunters to have deer and elk hunted in CWD areas tested and only CWD negative animals processed for meat. Guidelines for field dressing deer and elk to prevent possible contamination of meat are available at state Departments of Natural Resources.
Investigating suspected disease
Physicians who suspect a patient may have vCJD or CJD, or that a patient has died of such a disease, should advocate for brain biopsy or autopsy. The National Prion Disease Pathology Surveillance Center at Case Western Reserve University (funded by the Centers for Disease Control and Prevention) provides diagnostic services free of charge to physicians and health departments (www.cjdsurveillance.com).
Federal agencies, Congress, and the public became more aware of BSE on December 23, 2003, when the US Department of Agriculture (USDA) diagnosed the disease in a dairy cow in Washington state.1 The cow, traced to a herd originating in Canada, was 6.5 years old and had been slaughtered on December 9. Whether the cow was a “downer” (nonambulatory) is still under investigation. Downer cows are automatically tested; however, it is possible this cow was tested as part of a routine surveillance system rather than because it was at high risk of disease. Regardless, the carcass was released for use as food while tissues considered more risky for BSE transmission (brain, spinal cord, and small intestine) were kept from the human food supply.
After the case was diagnosed, the USDA recalled all meat from cattle slaughtered at that plant the same day. Unfortunately, about 30,000 pounds of potentially contaminated meat was never recovered and ended up on consumers’ plates.
Corresponding author
Eric Henley, MD, MPH, Co-Editor, Practice Alert, 1601 Parkview Avenue, Rockford, IL 61107. E-mail: [email protected].
1. Centers for Disease Control and Prevention (CDC). Bovine spongiform encephalopathy in a dairy cow—Washington State, 2003. MMWR Morb Mortal Wkly Rep 2004;52:1280–1285. Available at: www.cdc.gov/mmwr/preview/mmwrhtml/mm5253a2.htm. Accessed on July 15, 2004.
2. CDC. BSE and CJD information and resources. Available at: www.cdc.gov/ncidod/diseases/cjd/cjd.html. Accessed on July 15, 2004.
3. Kaufman M. They’re not allowed to test for Mad Cow. Washington Post National Weekly, May 3–9, 2004:21.
4. US Food and Drug Administration, Center for Food Safety and Applied Nutrition. Commonly asked questions about BSE in products regulated by FDA’s Center for Food Safety and Applied Nutrition (CFSAN). Available at: www.cfsan.fda.gov/~comm/bsefaq.html. Accessed on July 15, 2004.
5. CDC. Bovine spongiform encephalopathy and variant Creutzfeldt-Jakob disease. Available at: www.cdc.gov/travel/diseases/madcow.htm. Accessed on July 15, 2004.
What the new Medicare prescription drug bill may mean for providers and patients
In November 2003, President Bush signed the Medicare prescription-drug bill, which will usher in the largest change in the Medicare program in terms of money and number of people affected since the program’s creation in 1965. The final version of the bill was controversial, passing by a small margin in both the House and Senate.
Conservatives criticized the bill for not giving a large enough role to the private sector as an alternative to the traditional Medicare program, for spending too much money, and for risking even larger budget deficits than already predicted.
Liberals criticized it for providing an inadequate drug benefit, for allowing the prescription program to be run by private industry, and for creating an experimental private-sector program that will compete with traditional Medicare.
In the end, passage was ensured with support from the American Association of Retired Persons (AARP), drug companies, private health insurers, and national medical groups—and with the usual political maneuvering.
Public support among seniors and other groups remains unclear. For example, the American Academy of Family Physicians supported the bill, but negative reaction by members led President Michael Fleming to write a letter explaining the reasons for the decision (www.aafp.org/medicareletter.xml). In addition, Republican concerns about the overall cost of the legislation seem borne out by the administration’s recent announcement projecting costs of $530 billion over 10 years, about one third more than the price tag used to convince Congress to pass the legislation about 2 months before.
This article reviews the bill and some of its health policy implications.
Not all details clear; more than drug benefits affected
Several generalizations about Federal legislation hold true with this bill.
First, while the bill establishes the intent of Congress, a number of details will not be made clear until it is implemented by the executive branch—the administration and the responsible cabinet departments such as the Centers for Medicare and Medicaid Services. These implementation details matter most for the prescription drug benefit section of the bill.
Second, the bill changes or adds programs in a number of health areas besides prescription drugs (see Supplementary changes with the Medicare prescription drug bill). These additions partly reflected the need of proponents to satisfy diverse special interests (private insurers, hospitals and physicians, rural areas) and thereby gain their support for other parts of the bill that were more controversial, principally the drug benefit and private competition for Medicare. Thus, there is funding to increase Medicare payments to physicians and rural hospitals and to hospitals serving large numbers of low-income patients.
- Medicare payments to rural hospitals and doctors increase by $25 billion over 10 years.
- Payments to hospitals serving large numbers of low-income patients would increase.
- Hospitals can avoid some future cuts in Medicare payments by submitting quality of care data to the government.
- Doctors would receive increases of 1.5% per year in Medicare payments for 2004 and 2005 rather than the cuts currently planned.
- Medicare would cover an initial physical for new beneficiaries and screening for diabetes and cardiovascular disease.
- Support for development of health savings accounts that allow people with high-deductible health insurance to shelter income from taxes and obtain tax deductions if the money is used for health expenses.
- Home health agencies would see cuts in payments, but patient co-pays would not be required.
- Medicare Part B premiums (for physician and outpatient services) would be greater for those with incomes over $80,000.
Third, the changes also reflect genuine goals of improving health by expanding Medicare coverage of preventive services and requiring participating hospitals to submit quality-of-care data.
Prescription drug coverage under the new bill
Although many seniors have drug coverage through retirement health plans or Medigap policies purchased privately, about one quarter of beneficiaries (some 10 million people) do not have such coverage. Even those with drug coverage may have difficulty affording recommended medications since the median income for a senior is little more than $23,000. Many physicians have seen the ill effects of seniors not filling their prescriptions or skipping doses of prescribed medications.
Until the benefit takes effect. The actual prescription drug benefit will not begin until 2006. Until then, Medicare recipients will be given the option of purchasing a drug-discount card for $30 per year starting this spring. It is estimated these cards may save 10% to 15% of prescription costs. In addition, low-income seniors will receive $600 per year toward drug purchases.
After it takes effect. The drug benefit starting in 2006 will be funded through a complex arrangement of patient and government payments (Figure).
- Premium: A premium estimated to begin at $35 per month.
- Deductible: An annual deductible starting at $250 and indexed to increase to $445 in 2013.
- Co-pay: After paying the deductible, enrollees will pay 25% of additional drug costs up to $2250, at which point a $2850 gap in coverage—the so-called “doughnut hole”—leaves the onus of payment with the patient until $5100 is reached.
- Catastrophic coverage: After $5100, patients will pay 5% of any additional annual drug costs.
In 2006, catastrophic coverage will begin after $3600 in out-of-pocket costs ($250 deductible + $500 in co-pays to $2250 + the $2850 doughnut gap), not counting the premium. Indexing provisions are projected to raise this out-of-pocket cost requirement to $6400 in 2013. These indexing features have received less attention in the media, but may become increasingly important to seniors. For lower-income individuals, as determined by specific yearly income and total assets guidelines, a small per-prescription fee will replace the premium, deductible, and doughnut hole gap payments.
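For readers who want to trace the arithmetic, here is a minimal sketch (in Python) of how out-of-pocket spending would accumulate under the standard 2006 benefit described above. The function name and structure are illustrative, not part of the legislation; the dollar thresholds are simply those quoted in the bullets.

```python
def out_of_pocket_2006(total_drug_cost, monthly_premium=35.0):
    """Estimate a beneficiary's 2006 out-of-pocket drug spending under the
    standard Part D benefit as described above: a $250 deductible, 25%
    coinsurance up to $2250 in total drug costs, full patient payment through
    the coverage gap until total costs reach $5100, then 5% coinsurance.
    Premiums are returned separately, as the article counts them separately."""
    deductible = 250.0
    initial_limit = 2250.0            # total cost where the 25% co-pay ends
    catastrophic_threshold = 5100.0   # total cost where 5% coinsurance begins

    oop = min(total_drug_cost, deductible)
    if total_drug_cost > deductible:
        oop += 0.25 * (min(total_drug_cost, initial_limit) - deductible)
    if total_drug_cost > initial_limit:
        # coverage gap: the patient pays all costs between $2250 and $5100
        oop += min(total_drug_cost, catastrophic_threshold) - initial_limit
    if total_drug_cost > catastrophic_threshold:
        oop += 0.05 * (total_drug_cost - catastrophic_threshold)
    return oop, 12 * monthly_premium

# A beneficiary with exactly $5100 in annual drug costs pays
# $250 + $500 + $2850 = $3600 out of pocket, plus $420 in premiums.
print(out_of_pocket_2006(5100))  # (3600.0, 420.0)
```

This reproduces the $3600 figure above and makes the effect of the coverage gap easy to see for any level of drug spending.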
Coverage will vary. As the yearly cost of drugs changes, so will the relative contributions made by the patient and the government (Table). The new bill provides substantial benefit to those with catastrophic drug costs and to very low-income seniors. The idea of linking payments to income (for the drug benefit and the Part B premium) is a change in the Medicare program, as it has traditionally provided the same benefit to all beneficiaries, regardless of income.
Expected effects of privatization. The manner in which the drug benefit will be administered was controversial in Congress. The legislation, written primarily by Republicans, provides that beneficiaries can obtain coverage by participating with an HMO or PPO or by purchasing standalone coverage through a private prescription drug insurance program. Patient enrollment is voluntary. Managed care plans would be encouraged to participate in the prescription benefit program through eligibility for government subsidies. In turn, more beneficiaries would be encouraged to choose managed care plans, thus decreasing the number of patients covered by traditional Medicare. Furthermore, current private Medigap supplemental plans will be barred from offering drug benefits.
With the HMO/PPO and stand-alone programs, it is likely that any reduction in drug costs will result from private pharmacy benefit managers negotiating discounts from drug companies as they do now for many employer-sponsored plans. Presumably, formularies will vary from plan to plan, and it may be difficult for patients to know ahead of time whether the plan they join will cover their current medications. The legislation prohibits the government from using its vast purchasing power to negotiate substantial discounts from drug companies as it does now for the Medicaid program. There are provisions to increase availability of generic drugs, but importation of drugs from Canada is prohibited unless FDA approval is given (so far, the FDA has opposed this). Many Democrats who opposed the bill argued that it allowed too large a role for the private sector and constrained the ability of the government to control drug costs.
A final controversial measure in the bill provided for conducting an experiment in 6 cities beginning in 2010, in which at least 1 private insurance plan would be funded to compete directly with the traditional Medicare program. Many Republicans believe this type of competition is necessary to decrease the rate of cost increases in the Medicare program, while many Democrats believe the private market is a big reason for increasing problems with the quality and cost of the entire health care system.
FIGURE
Out-of-pocket spending under new legislation
Out-of-pocket drug spending in 2006 for Medicare beneficiaries under new Medicare legislation. Note: Benefit levels are indexed to growth in per capita expenditures for covered Part D drugs. As a result, the Part D deductible is projected to increase from $250 in 2006 to $445 in 2013; the catastrophic threshold is projected to increase from $5100 in 2006 to $9066 in 2013. From the Kaiser Family Foundation website (www.kff.org/medicare/medicarebenefitataglance.ctm).
Looming questions
The new Medicare legislation is vast in scope, cost, and controversy. In the coming months, a number of organizations—AARP, the Department of Health and Human Services, and various foundations—will attempt to explain its provisions to the public, most likely in different ways.
TABLE
Deciphering the 2006 drug benefit
The table shows what portion of yearly drug costs would be paid by the Medicare recipient and what portion would be paid by Medicare beginning in 2006. It does not include the $420 yearly premium.

Family physicians may be asked by patients to explain provisions of the program and to offer advice in making decisions about their participation.
In addition, preoccupation with explaining and implementing the Medicare bill may keep Congress and the President from addressing other pressing health issues such as the growing number of uninsured.
Corresponding author
Eric Henley, MD, MPH, Co-Editor, Practice Alert, 1601 Parkview Avenue, Rockford, IL 61107. E-mail: [email protected].
In November 2003, President Bush signed the Medicare prescription-drug bill, which will usher in the largest change in the Medicare program in terms of money and number of people affected since the program’s creation in 1965. The final version of the bill was controversial, passing by a small margin in both the House and Senate.
Conservatives criticized the bill for not giving a large enough role to the private sector as an alternative to the traditional Medicare program, for spending too much money, and for risking even larger budget deficits than already predicted.
Liberals criticized it for providing an inadequate drug benefit, for allowing the prescription program to be run by private industry, and for creating an experimental private-sector program that will compete with traditional Medicare.
In the end, passage was ensured with support from the American Association of Retired Persons (AARP), drug companies, private health insurers, and national medical groups—and with the usual political maneuvering.
Public support among seniors and other groups remains unclear. For example, the American Academy of Family Physicians supported the bill, but negative reaction by members led President Michael Fleming to write a letter explaining the reasons for the decision (www.aafp.org/medicareletter.xml). In addition, Republican concerns about the overall cost of the legislation seem borne out by the administration’s recent announcement projecting costs of $530 billion over 10 years, about one third more than the price tag used to convince Congress to pass the legislation about 2 months before.
This article reviews the bill and some of its health policy implications.
Not all details clear; more than drug benefits affected
Several generalizations about Federal legislation hold true with this bill.
First, while the bill establishes the intent of Congress, a number of details will not be made clear until it is implemented by the executive branch—the administration and the responsible cabinet departments such as the Center for Medicare and Medicaid Services. The importance of these implementation details is most relevant to the prescription drug benefit section of the bill.
Second, the bill changes or adds programs in a number of health areas besides prescription drugs (see Supplementary changes with the Medicare prescription drug bill). These additions partly reflected the need of proponents to satisfy diverse special interests (private insurers, hospitals and physicians, rural areas) and thereby gain their support for other parts of the bill that were more controversial, principally the drug benefit and private competition for Medicare. Thus, there is funding to increase Medicare payments to physicians and rural hospitals and to hospitals serving large numbers of low-income patients.
- Medicare payments to rural hospitals and doctors increase by $25 billion over 10 years.
- Payments to hospitals serving large numbers of low-income patients would increase.
- Hospitals can avoid some future cuts in Medicare payments by submitting quality of care data to the government.
- Doctors would receive increases of 1.5% per year in Medicare payments for 2004 and 2005 rather than the cuts currently planned.
- Medicare would cover an initial physical for new beneficiaries and screening for diabetes and cardiovascular disease.
- Support for development of health savings accounts that allow people with high-deductible health insurance to shelter income from taxes and obtain tax deductions if the money is used for health expenses.
- Home health agencies would see cuts in payments, but patient co-pays would not be required.
- Medicare Part B premiums (for physician and outpatient services) would be greater for those with incomes over $80,000.
Third, the changes also reflect genuine goals of improving health by expanding Medicare coverage of preventive services and requiring participating hospitals to submit quality-of-care data.
Prescription drug coverage under the new bill
Although many seniors have drug coverage through retirement health plans or Medigap policies purchased privately, about one quarter of beneficiaries (some 10 million people) do not have such coverage. Even those with drug coverage may have difficulty affording recommended medications since the median income for a senior is little more than $23,000. Many physicians have seen the ill effects of seniors not filling their prescriptions or skipping doses of prescribed medications.
Until the benefit takes effect. The actual prescription drug benefit will not begin until 2006. Until then, Medicare recipients will be given the option of purchasing a drug-discount card for $30 per year starting this spring. It is estimated these cards may save 10% to 15% of prescription costs. In addition, low-income seniors will receive $600 per year toward drug purchases.
After it takes effect. The drug benefit starting in 2006 will be funded through a complex arrangement of patient and government payments (Figure).
- Premium: A premium estimated to begin at $35 per month.
- Deductible: An annual deductible starting at $250 and indexed to increase to $445 in 2013.
- Co-pay: After paying the deductible, enrollees will pay 25% of additional drug costs up to $2250, at which point a $2850 gap in cover-age—the so-called “doughut hole”—leaves the onus of payment with the patient until $5100 is reached.
- Catastrophic coverage: After $5100, patients will pay 5% of any additional annual drug costs.
In 2006, catastrophic coverage will begin after $3600 in out-of-pocket costs ($250 deductible + $500 in co-pays to $2250 + the $2850 doughnut gap), not counting the premium. Indexing provisions are projected to raise this out-of-pocket cost requirement to $6400 in 2013. These indexing features have received less attention in the media, but may become increasingly important to seniors. For lower-income individuals, as determined by specific yearly income and total assets guidelines, a small per-prescription fee will replace the premium, deductible, and doughnut hole gap payments.
Coverage will vary. As the yearly cost of drugs changes, so will the relative contributions made by the patient and the government (Table). The new bill provides substantial benefit to those with catastrophic drug costs and to very low-income seniors. The idea of linking payments to income (for the drug benefit and the Part B premium) is a change in the Medicare program, as it has traditionally provided the same benefit to all beneficiaries, regardless of income.
Expected effects of privatization. The manner in which the drug benefit will be administered was controversial in Congress. The legislation, written primarily by Republicans, provides that beneficiaries can obtain coverage by participating with an HMO or PPO or by purchasing standalone coverage through a private prescription drug insurance program. Patient enrollment is voluntary. Managed care plans would be encouraged to participate in the prescription benefit program through eligibility for government subsidies. In turn, more beneficiaries would be encouraged to choose managed care plans, thus decreasing the number of patients covered by traditional Medicare. Furthermore, current private Medigap supplemental plans will be barred from offering drug benefits.
With the HMO/PPO and stand-alone programs, it is likely that any reduction in drug costs will result from private pharmacy benefit managers negotiating discounts from drug companies as they do now for many employer-sponsored plans. Presumably, formularies will vary from plan to plan, and it may be difficult for patients to know ahead of time whether the plan they join will cover their current medications. The legislation prohibits the government from using its vast purchasing power to negotiate substantial discounts from drug companies as it does now for the Medicaid program. There are provisions to increase availability of generic drugs, but importation of drugs from Canada is prohibited unless FDA approval is given (so far, the FDA has opposed this). Many Democrats who opposed the bill argued that it allowed too large a role for the private sector and constrained the ability of the government to control drug costs.
A final controversial measure in the bill provided for conducting an experiment in 6 cities beginning in 2010, in which at least 1 private insurance plan would be funded to compete directly with the traditional Medicare program. Many Republicans believe this type of competition is necessary to decrease the rate of cost increases in the Medicare program, while many Democrats believe the private market is a big reason for increasing problems with the quality and cost of the entire health care system.
FIGURE
Out-of-pocket spending under new legislation
Out-of-pocket drug spending in 2006 for Medicare beneficiaries under new Medicare legislation. Note: Benefit levels are indexed to growth in per capita expenditures for covered Part D drugs. As a result, the Part D deductible in projected to increase from $250 in 2006 to $445 in 2013; the catastrophic threshold is projected to increase from $5100 in 2006 to $9066 in 2013. From the Kaiser Family Foundation website(www.kff.org/medicare/medicarebenefitataglance.ctm).
Looming questions
The new Medicare legislation is vast in scope, cost, and controversy. In the coming months, a number of organizations—AARP, the Department of Health and Human Services, and various foundations—will attempt to explain its provisions to the public, most likely in different ways.
TABLE
Deciphering the 2006 drug benefit
The chart above shows what portion of yearly drug costs would be paid by the Medicare recipient and what portion would be paid by Medicare beginning in 2006. It does not include the $420 yearly premium.Family physicians may be asked by patients to explain provisions of the program and to offer advice in making decisions about their participation.
In addition, preoccupation with explaining and implementing the Medicare bill may keep Congress and the President from addressing other pressing health issues such as the growing number of uninsured.
Corresponding author
Eric Henley, MD, MPH, Co-Editor, Practice Alert, 1601 Parkview Avenue, Rockford, IL 61107. E-mail: [email protected].
In November 2003, President Bush signed the Medicare prescription-drug bill, which will usher in the largest change in the Medicare program in terms of money and number of people affected since the program’s creation in 1965. The final version of the bill was controversial, passing by a small margin in both the House and Senate.
Conservatives criticized the bill for not giving a large enough role to the private sector as an alternative to the traditional Medicare program, for spending too much money, and for risking even larger budget deficits than already predicted.
Liberals criticized it for providing an inadequate drug benefit, for allowing the prescription program to be run by private industry, and for creating an experimental private-sector program that will compete with traditional Medicare.
In the end, passage was ensured with support from the American Association of Retired Persons (AARP), drug companies, private health insurers, and national medical groups—and with the usual political maneuvering.
Public support among seniors and other groups remains unclear. For example, the American Academy of Family Physicians supported the bill, but negative reaction by members led President Michael Fleming to write a letter explaining the reasons for the decision (www.aafp.org/medicareletter.xml). In addition, Republican concerns about the overall cost of the legislation seem borne out by the administration’s recent announcement projecting costs of $530 billion over 10 years, about one third more than the price tag used to convince Congress to pass the legislation about 2 months before.
This article reviews the bill and some of its health policy implications.
Not all details clear; more than drug benefits affected
Several generalizations about Federal legislation hold true with this bill.
First, while the bill establishes the intent of Congress, a number of details will not be made clear until it is implemented by the executive branch—the administration and the responsible cabinet departments such as the Center for Medicare and Medicaid Services. The importance of these implementation details is most relevant to the prescription drug benefit section of the bill.
Second, the bill changes or adds programs in a number of health areas besides prescription drugs (see Supplementary changes with the Medicare prescription drug bill). These additions partly reflect proponents’ need to satisfy diverse special interests (private insurers, hospitals and physicians, rural areas) and thereby gain their support for the more controversial parts of the bill, principally the drug benefit and private competition for Medicare. Thus, there is funding to increase Medicare payments to physicians, to rural hospitals, and to hospitals serving large numbers of low-income patients.
- Medicare payments to rural hospitals and doctors increase by $25 billion over 10 years.
- Payments to hospitals serving large numbers of low-income patients would increase.
- Hospitals can avoid some future cuts in Medicare payments by submitting quality of care data to the government.
- Doctors would receive increases of 1.5% per year in Medicare payments for 2004 and 2005 rather than the cuts currently planned.
- Medicare would cover an initial physical for new beneficiaries and screening for diabetes and cardiovascular disease.
- Support for development of health savings accounts that allow people with high-deductible health insurance to shelter income from taxes and obtain tax deductions if the money is used for health expenses.
- Home health agencies would see cuts in payments, but patient co-pays would not be required.
- Medicare Part B premiums (for physician and outpatient services) would be greater for those with incomes over $80,000.
Third, the changes also reflect genuine goals of improving health by expanding Medicare coverage of preventive services and requiring participating hospitals to submit quality-of-care data.
Prescription drug coverage under the new bill
Although many seniors have drug coverage through retirement health plans or Medigap policies purchased privately, about one quarter of beneficiaries (some 10 million people) do not have such coverage. Even those with drug coverage may have difficulty affording recommended medications since the median income for a senior is little more than $23,000. Many physicians have seen the ill effects of seniors not filling their prescriptions or skipping doses of prescribed medications.
Until the benefit takes effect. The actual prescription drug benefit will not begin until 2006. Until then, Medicare recipients will be given the option of purchasing a drug-discount card for $30 per year starting this spring. It is estimated these cards may save 10% to 15% of prescription costs. In addition, low-income seniors will receive $600 per year toward drug purchases.
After it takes effect. The drug benefit starting in 2006 will be funded through a complex arrangement of patient and government payments (Figure).
- Premium: A premium estimated to begin at $35 per month.
- Deductible: An annual deductible starting at $250 and indexed to increase to $445 in 2013.
- Co-pay: After paying the deductible, enrollees will pay 25% of additional drug costs up to $2250 in total drug spending, at which point a $2850 gap in coverage—the so-called “doughnut hole”—leaves the onus of payment with the patient until $5100 is reached.
- Catastrophic coverage: After $5100, patients will pay 5% of any additional annual drug costs.
In 2006, catastrophic coverage will begin after $3600 in out-of-pocket costs ($250 deductible + $500 in co-pays to $2250 + the $2850 doughnut gap), not counting the premium. Indexing provisions are projected to raise this out-of-pocket cost requirement to $6400 in 2013. These indexing features have received less attention in the media, but may become increasingly important to seniors. For lower-income individuals, as determined by specific yearly income and total assets guidelines, a small per-prescription fee will replace the premium, deductible, and doughnut hole gap payments.
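To make this arithmetic concrete, the short sketch below encodes the 2006 standard-benefit parameters described above as a Python function. It is an illustration only, not an official calculator: premiums, the low-income provisions, and later indexing are ignored. It reproduces the $3600 out-of-pocket figure.

```python
def out_of_pocket_2006(total_drug_cost, deductible=250, initial_limit=2250,
                       catastrophic_threshold=5100, coinsurance=0.25,
                       catastrophic_coinsurance=0.05):
    """Estimate yearly out-of-pocket drug costs under the standard 2006
    Part D benefit (premiums and low-income provisions excluded)."""
    # Patient pays everything up to the deductible
    oop = min(total_drug_cost, deductible)
    if total_drug_cost > deductible:
        # 25% coinsurance from the deductible to the initial coverage limit
        oop += coinsurance * (min(total_drug_cost, initial_limit) - deductible)
    if total_drug_cost > initial_limit:
        # "Doughnut hole": patient pays 100% up to the catastrophic threshold
        oop += min(total_drug_cost, catastrophic_threshold) - initial_limit
    if total_drug_cost > catastrophic_threshold:
        # Catastrophic coverage: patient pays 5% of costs above the threshold
        oop += catastrophic_coinsurance * (total_drug_cost - catastrophic_threshold)
    return oop

print(out_of_pocket_2006(5100))   # 3600.0, as in the Figure
print(out_of_pocket_2006(10000))  # 3845.0
```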
Coverage will vary. As the yearly cost of drugs changes, so will the relative contributions made by the patient and the government (Table). The new bill provides substantial benefit to those with catastrophic drug costs and to very low-income seniors. The idea of linking payments to income (for the drug benefit and the Part B premium) is a change in the Medicare program, as it has traditionally provided the same benefit to all beneficiaries, regardless of income.
Expected effects of privatization. The manner in which the drug benefit will be administered was controversial in Congress. The legislation, written primarily by Republicans, provides that beneficiaries can obtain coverage by enrolling in an HMO or PPO or by purchasing stand-alone coverage through a private prescription drug insurance program. Patient enrollment is voluntary. Managed care plans would be encouraged to participate in the prescription benefit program through eligibility for government subsidies. In turn, more beneficiaries would be encouraged to choose managed care plans, thus decreasing the number of patients covered by traditional Medicare. Furthermore, current private Medigap supplemental plans will be barred from offering drug benefits.
With the HMO/PPO and stand-alone programs, it is likely that any reduction in drug costs will result from private pharmacy benefit managers negotiating discounts from drug companies as they do now for many employer-sponsored plans. Presumably, formularies will vary from plan to plan, and it may be difficult for patients to know ahead of time whether the plan they join will cover their current medications. The legislation prohibits the government from using its vast purchasing power to negotiate substantial discounts from drug companies as it does now for the Medicaid program. There are provisions to increase availability of generic drugs, but importation of drugs from Canada is prohibited unless FDA approval is given (so far, the FDA has opposed this). Many Democrats who opposed the bill argued that it allowed too large a role for the private sector and constrained the ability of the government to control drug costs.
A final controversial measure in the bill provided for conducting an experiment in 6 cities beginning in 2010, in which at least 1 private insurance plan would be funded to compete directly with the traditional Medicare program. Many Republicans believe this type of competition is necessary to decrease the rate of cost increases in the Medicare program, while many Democrats believe the private market is a big reason for increasing problems with the quality and cost of the entire health care system.
FIGURE
Out-of-pocket spending under new legislation
Out-of-pocket drug spending in 2006 for Medicare beneficiaries under new Medicare legislation. Note: Benefit levels are indexed to growth in per capita expenditures for covered Part D drugs. As a result, the Part D deductible is projected to increase from $250 in 2006 to $445 in 2013; the catastrophic threshold is projected to increase from $5100 in 2006 to $9066 in 2013. From the Kaiser Family Foundation website (www.kff.org/medicare/medicarebenefitataglance.ctm).
Looming questions
The new Medicare legislation is vast in scope, cost, and controversy. In the coming months, a number of organizations—AARP, the Department of Health and Human Services, and various foundations—will attempt to explain its provisions to the public, most likely in different ways.
TABLE
Deciphering the 2006 drug benefit
The chart above shows what portion of yearly drug costs would be paid by the Medicare recipient and what portion would be paid by Medicare beginning in 2006. It does not include the $420 yearly premium. Family physicians may be asked by patients to explain provisions of the program and to offer advice in making decisions about their participation.
In addition, preoccupation with explaining and implementing the Medicare bill may keep Congress and the President from addressing other pressing health issues such as the growing number of uninsured.
Corresponding author
Eric Henley, MD, MPH, Co-Editor, Practice Alert, 1601 Parkview Avenue, Rockford, IL 61107. E-mail: [email protected].
1. Pear R. Bush’s aides put higher price tag on Medicare law. New York Times, January 30, 2004.
2. Altman D. The new Medicare prescription-drug legislation. N Engl J Med 2004;350:9-10.
3. American Academy of Family Physicians. Medicare Prescription Drug, Improvement and Modernization Act. Available at: www.aafp.org/x25558.xml. Accessed on April 2, 2004.
4. National Association of Chain Drug Stores. Medicare Prescription Drug Benefit and Discount Card Program Q & A. Available at: www.nacds.org/user-assets/PDF_files/MedicareRx_Q&A.pdf. Accessed on April 2, 2004.
5. New Medicare law/key provisions. Christian Science Monitor, December 4, 2003.
10 steps for avoiding health disparities in your practice
We hope the answer to the question above is no. However, the evidence regarding differences in the care of patients based on race, ethnicity, gender, and socioeconomic status suggests that if this patient is a woman or African American or from a lower socioeconomic class, resultant morbidity or mortality will be higher.
Differences are seen in the provision of cardiovascular care, cancer diagnosis and treatment, and HIV care. African Americans, Latino Americans, Asian Americans, and Native Americans have higher morbidity and mortality than Caucasian Americans for multiple problems, including cancer, chemical dependency, diabetes, heart disease, infant mortality, and unintentional and intentional injuries.1
This article explores possible explanations for health care disparities and offers 10 practical strategies for tackling this challenging issue.
Examples of health disparities
The United States has dramatically improved the health status of its citizens—increasing longevity, reducing infant mortality and teenage pregnancies, and increasing the number of children being immunized. Despite these improvements, though, there remain persistent and disproportionate burdens of disease and illness borne by subgroups of the population (Table 1). 2,3
The Institute of Medicine, in its recent report “Unequal Treatment,” approaches the issue from another perspective, defining these disparities as “racial or ethnic differences in the quality of healthcare that are not due to access-related factors or clinical needs, preferences and appropriateness of intervention.”4
TABLE 1
Examples of health disparities that could be changed
Type of disparity | Measure | Example |
---|---|---|
Mortality | Infant mortality | Infant mortality is higher for infants of African American, Native Hawaiian, and Native American mothers (13.8, 10.0, and 9.3 deaths per 1000 live births, respectively) than for infants of other race groups. Infant mortality decreases as the mother’s level of education increases. |
Morbidity | Cancer (males) | The incidence of cancer among black males exceeds that of white males for prostate cancer (60%), lung and bronchial cancer (58%), and colon and rectum cancers (14%). |
Health behaviors | Cigarette smoking | Smoking among persons aged 25 years and over ranges from 11% among college graduates to 32% among those without a high school diploma; 19% of adolescents in the most rural counties smoke, compared with 11% in central counties. |
Preventive health care | Mammography | Poor women are 27% less likely to have had a recent mammogram than women with family incomes above the poverty level. |
Access to care | Health insurance coverage | 13% of children aged <18 years have no health insurance coverage; 28% of children with family incomes of 1 to 1.5 times the poverty level are without coverage, compared with 5% of those with family incomes at least twice the poverty level. |
Source: Adapted from Health, United States, 2001. Hyattsville, Md: National Center for Health Statistics; 2001. |
Correcting health disparity begins with understanding its causes
A number of factors account for disparities in health and health care.
Population-influenced factors
Leading candidates are some population groups’ lower socioeconomic status (eg, income, occupation, education) and increased exposure to unhealthy environments. Individuals may also exhibit preferences for or against treatment (when appropriate treatment recommendations are offered) that mirror group preferences.
For example, African American patients’ distrust of the healthcare system may be based in part on their experience of discrimination as research subjects in the Tuskegee syphilis study and Los Angeles measles immunization study. Research has shown that while these issues are relevant, they do not fully account for observed disparities.
System factors
Problems with access to care are common: inadequate insurance, transportation difficulties, geographic barriers to needed services (rural/urban), and language barriers. Again, research has shown that access to care matters, but not necessarily more than other factors.
Individual factors
At the individual level, a clinical encounter may be adversely affected by physician-patient racial/ethnic discordance, patient health literacy, and physician cultural competence. Also, there is the high prevalence of risky behavior such as smoking.
Finally, provider-specific issues may be operative: bias (prejudice) against certain groups of patients, clinical uncertainty when dealing with patients, and stereotypes held by providers about the behavior or health of different groups of patients according to race, ethnicity, or culture.
Addressing disparities in practice
Clearly, improving the socioeconomic status and access to care for all people are among the most important ways to eliminate health disparities. Physicians can influence these areas through individual participation in political activities, in nonprofit organizations, and in their professional organizations.
Steps can also be taken in your own practice (Table 2).
TABLE 2
Ten practical measures for avoiding health disparity in your practice
1. Use evidence-based clinical guidelines as much as possible.
2. Consider the health literacy level of your patients when planning care and treatment, when explaining medical recommendations, and when handing out written material.
3. Ensure that front desk staff are sensitive to patient backgrounds and cultures.
4. Provide culturally sensitive patient education materials (eg, brochures in Spanish).
5. Keep a “black book” with the names and numbers of community health resources.
6. Volunteer with a nonprofit community-based agency in your area.
7. Ask your local health department or managed care plans if they have a community health improvement plan. Get involved in creating or implementing the plan.
8. Create a special program for one or more of the populations you care for (eg, a school-based program to help reduce teenage pregnancy).
9. Develop a plan for translation services.
10. Browse through the Institute of Medicine report, “Unequal Treatment” (available at www.iom.edu/report.asp?id=4475).
Use evidence-based guidelines
To minimize the effect of possible bias and stereotyping in caring for patients of different races, ethnicities, and cultures, an important foundation is to standardize care for all patients by using evidence-based practice guidelines when appropriate. Clinical guidelines such as those published by the US Preventive Services Task Force and those available on the Internet through the National Guideline Clearinghouse provide well-researched and substantiated recommendations (available at www.ngc.gov).
Using guidelines is consistent with national recommendations to incorporate more evidence-based practices in clinical care.
Make your office patient-friendly
Create an office environment that is sensitive to the needs of all patients. Addressing language issues, having front desk staff who are sensitive and unbiased, and providing culturally relevant patient education material (eg, posters, magazines) are important components of a supportive office environment.1
Advocate patient education
Strategies to improve patient health literacy and physician cultural competence may be of benefit. The literacy issue can be helped considerably by enabling patients to increase their understanding of health terminology, and there are national efforts to address patient health literacy. Physicians can also help by explaining options and care plans simply, carefully, and without medical jargon. The American Medical Association has a national campaign in support of health literacy (www.amaassn.org/ama/pub/category/8115.html).
Increase cross-cultural communication skills
The Institute of Medicine and academicians have increasingly recommended training healthcare professionals to be more culturally competent. Experts have agreed that the “essence of cultural competence is not the mastery of ‘facts’ about different ethnic groups, but rather a patient-centered approach that incorporates fundamental skills and attitudes that may be applicable across ethnic boundaries.”6
A recent national survey supported this idea by showing that racial differences in patient satisfaction disappeared after adjustment for the quality of physician behaviors (eg, showing respect for patients, spending adequate time with patients). The fact that these positive physician behaviors were reported more frequently by white than non-white patients points to the need for continued effort at improving physicians’ interpersonal skills.
Eliminating health disparities is one of the top 2 goals of Healthy People 2010, the document that guides the nation’s health promotion and disease prevention agenda. Healthy People 2010 (www.health.gov/healthypeople) is a compilation of important prevention objectives for the Nation identified by the US Public Health Service that helps to focus health care system and community efforts. The vision for Healthy People 2010 is “Healthy People in Healthy Communities,” a theme emphasizing that the health of the individual is closely linked with the health of the community.
The Leading Health Indicators are a subset of the Healthy People 2010 objectives, chosen for emphasis because they account for more than 50% of the leading preventable causes of morbidity and premature mortality in the US.5 Data on these 10 objectives also point to disparities in health status and health outcomes among population groups in the US. Most states and many local communities have used the Healthy People 2010/Leading Health Indicators to develop and implement state and local “Healthy People” plans.
Physicians have an important role in efforts to meet these goals, because many of them can be met only through multicomponent intervention strategies that include actions at the clinic, health care system, and community levels.
Corresponding author
Eric Henley, MD, MPH, Co-Editor, Practice Alert, 1601 Parkview Avenue, Rockford, IL 61107. E-mail: [email protected].
1. Tucker C, Herman K, Pedersen T, Higley B, Montrichard M, Ivery P. Cultural sensitivity in physician-patient relationships: perspectives of an ethnically diverse sample of low-income primary care patients. Med Care 2003;41:859-870.
2. Fiscella K, Franks P, Gold MR, Clancy CM. Inequality in quality: addressing socioeconomic, racial and ethnic disparities in health care. JAMA 2000;283:2579-2584.
3. Navarro V. Race or class versus race and class: mortality differentials in the United States. Lancet 1990;336:1238-1240.
4. Institute of Medicine. Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care. Washington, DC: National Academy Press; 2002. Available at: www.iom.edu/report.asp?id=4475. Accessed on February 13, 2004.
5. McGinnis JM, Foege W. Actual causes of death in the Unites States. JAMA 1993;270:2207-2212.
6. Saha S, Arbelaez JJ, Cooper LA. Patient-Physician relationships and racial disparities in the quality of health care. Am J Public Health 2003;93:1713-1719.
Prevention and treatment of influenza
With this flu season, there are new indications for the traditional inactivated (killed) vaccine, a new intranasal vaccine, lab tests for rapid identification of influenza, and a need to review the role of antiviral treatments.
New prevention measures
Inactivated influenza vaccine is the best preventive measure against both type A and B strains of the virus. The vaccine’s effectiveness depends somewhat on how well it matches to circulating virus antigens. Table 1 lists the benefits of vaccination in various populations.
Table 2 identifies the usual target populations for vaccine coverage. In the past 5 years, research has yielded several findings: the very young are at excess risk of influenza-related hospitalizations; adults aged 50 to 64 years have more high-risk conditions than previously thought; and cost-benefit analyses show a large economic toll of flu outbreaks, manifested mainly as work and school absence. Consequently, the Centers for Disease Control and Prevention (CDC) now recommends routine vaccination of persons older than 50 years and encourages vaccination of children between 6 and 24 months. Children under 9 years being immunized for the first time must receive 2 doses at least a month apart to gain optimal protection. This requirement will make it challenging to immunize children aged <24 months, since they are already receiving a number of other vaccinations.
This year enough vaccine has been produced to allow both targeted and nontargeted groups to receive inactivated vaccine as soon as it is available.
TABLE 1
Effectiveness of inactivated influenza vaccine
In the patient group… | …the vaccine prevents a potential… |
---|---|
Healthy adults <65 years | 70%–90% of influenza illness |
Children 1–15 years | 77%–91% of influenza respiratory illness; no evidence that it prevents otitis media7 |
Adults >65 years | 58% of influenza respiratory illness; 30%–70% of hospitalizations for pneumonia and flu |
Adults >65 years in nursing homes | 30%–40% of influenza illness; 50%–60% of hospitalizations; 80% of deaths |
TABLE 2
Persons who should receive inactivated influenza vaccine
Recommendations to date |
---|
New recommendations: routine vaccination of persons older than 50 years; vaccination encouraged for children aged 6–24 months |
FluMist
The US Food and Drug Administration (FDA) recently approved FluMist, an intranasal vaccine containing live, attenuated influenza virus that is effective against both type A and B strains. It is indicated for healthy people aged 5 to 49 years. In this group, FluMist is an alternative to the traditional inactivated vaccine, but it is more expensive at $46 a dose (compared with $6 to $10 for inactivated vaccine). Unvaccinated children 5 to 8 years of age should receive 2 doses 6 to 10 weeks apart.4
People with chronic conditions such as asthma, cardiovascular disease, diabetes, and known or suspected immunodeficiency should not receive this vaccine until additional data are acquired about its effectiveness in these situations. In addition, because FluMist contains live influenza viruses, there is a potential for transmission from the vaccinated person to other persons. Therefore, clinicians should be cautious in its use when a patient requiring vaccination lives with immunosuppressed persons.
The rate of serious side effects with FluMist has been <1%, although mild side effects such as runny nose, fever, and headache occur slightly more often among vaccine than placebo recipients.
Improved diagnostic tests
The development of new outpatient treatments for influenza has increased the desirability of making an accurate diagnosis. Clinical symptoms, particularly fever and cough, are somewhat helpful (in adults, sensitivity is 63%–78% and specificity is 55%–71%). Diagnostic accuracy is enhanced by awareness of active flu cases in your community. This information is available from local or state health departments and the CDC, and it is based on active surveillance through networks of sentinel physician practices and emergency rooms. This is a good example of a reliable surveillance system helping physicians provide better clinical care.
Since 1989, a concerted public health effort has increased flu vaccine usage among adults older than 65 from 33% to 66% in 1999. Like many successful population health programs, this improvement has resulted from a focus on the core functions of public health:
- Assessment—regular surveys of vaccine coverage and local influenza outbreaks, continual identification of high-risk groups, and studies of vaccine efficacy and cost-effectiveness.
- Assurance—media campaigns to heighten awareness among consumers and providers of the benefits of vaccination and increased access to vaccine through physician offices, health departments, other health care worksites, and non-traditional sites such as malls and drug stores.
- Policy development — Medicare coverage of vaccine costs since 1993, Healthy People 2000 and 2010 national goals, and marketing campaigns to increase vaccine coverage supported by the Public Health Service in partnership with private organizations.
In the past several years, the FDA has approved an array of rapid diagnostic tests that may improve medical decision-making. Approved tests are now available for Clinical Laboratory Improvement Act (CLIA)-waived labs (QuickVue Influenza A/B; ZstatFlu) and for nonwaived labs (BD Directigen Flu A+B; BD Directigen Flu A).
Nasal washes or swabs, not throat swabs, are the best method for obtaining specimens.
Compared with viral culture (the gold standard), reported sensitivities for these tests are 62%–73%, and specificities are 80%–99%.5
A study conducted in a private practice reported sensitivities of 72%–95% and specificities of 76%–84%.3 In this mainly pediatric population, the prevalence of influenza was about 50% by culture, and the positive predictive value (the likelihood that a positive test indicates true disease) ranged from 80%–86%, with a negative predictive value (the likelihood that a negative test indicates absence of disease) of 77%–90%.
In both studies, QuickVue was the best performing CLIA-waived test; the ZstatFlu test did not perform as well as the others. The tests generally give results in under 15 minutes, except ZstatFlu, which was more cumbersome to use. Since the prevalence of a condition in the population influences the predictive value of tests, all the tests perform better at finding true disease during active flu outbreaks than at the beginning or end of outbreaks when patients are less likely to have influenza.
The tests range in price from $15 to $25.
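The prevalence effect can be illustrated with the standard predictive-value formulas. The sketch below is an illustration only: the 70% sensitivity and 95% specificity are assumed values within the ranges reported above, not figures for any particular test.

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive values of a diagnostic test
    for a given pre-test probability (prevalence) of influenza."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    true_neg = specificity * (1 - prevalence)
    false_neg = (1 - sensitivity) * prevalence
    ppv = true_pos / (true_pos + false_pos)
    npv = true_neg / (true_neg + false_neg)
    return ppv, npv

# Illustrative rapid test: 70% sensitivity, 95% specificity
for prevalence in (0.05, 0.25, 0.50):  # early/late season vs. peak outbreak
    ppv, npv = predictive_values(0.70, 0.95, prevalence)
    print(f"prevalence {prevalence:.0%}: PPV {ppv:.0%}, NPV {npv:.0%}")
# prevalence 5%:  PPV 42%, NPV 98%
# prevalence 25%: PPV 82%, NPV 90%
# prevalence 50%: PPV 93%, NPV 76%
```

As the printed results show, a positive result is far more trustworthy at the peak of an outbreak than early or late in the season, whereas a negative result becomes less reassuring as prevalence rises.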
Antiviral treatments
Amantadine and rimantadine can reduce the duration of uncomplicated influenza A by about 1 day when started within 2 days of the onset of illness.
The newer drugs, zanamivir and oseltamivir, can reduce the duration of uncomplicated influenza A and B by about 1 day compared with placebo. Data are limited regarding the benefits of these drugs for patients at high risk of serious complications, or for children, although 1 study has shown a decrease in the incidence of otitis media among children taking oseltamivir.
Zanamivir is approved for adults and for children older than 7 years. It is administered via inhalation twice a day and costs $48 for the standard 5-day treatment.
Oseltamivir is approved for adults and for children older than 1 year. It is taken orally twice daily, with dose based on age and weight. It costs $60 for a 5-day treatment.
Prophylaxis. Amantadine and rimantadine are approved for prophylaxis against influenza A, and prevent 70%–90% of cases. Oseltamivir is approved for prophylaxis in adults and children older than 13 years. When used prophylactically, these drugs must be taken daily for the duration of influenza activity in the community. This can mean taking medication for weeks, which would be quite expensive in the case of oseltamivir.
Correspondence
Eric Henley, MD, MPH, Co-Editor, Practice Alert, 1601 Parkview Avenue, Rockford, IL 61107. E-mail: [email protected].
1. Bridges CB, Harper SA, Fukuda K, Uyeki TM, Cox NJ, Singleton JA; Advisory Committee on Immunization Practices. Prevention and control of influenza: recommendations of the Advisory Committee on Immunization Practices. MMWR Recomm Rep 2003;52(RR08):1-34.
2. Montalto N. An office-based approach to influenza: clinical diagnosis and laboratory testing. Am Fam Physician 2003;67:111-118.
3. Rodriguez W, Schwartz R, Thorne M. Evaluation of diagnostic tests for influenza in a pediatric practice. Pediatr Infect Dis J 2002;21:193-196.
4. Harper SA, Fukuda K, Cox NJ, Bridges CB. Using live, attenuated influenza vaccine for prevention and control of influenza; supplemental recommendations of the Advisory Committee on Immunization Practices (ACIP). MMWR Recomm Rep 2003;52(RR13):1-8.
5. US Food and Drug Administration. Performance and caution in using rapid influenza virus diagnostic tests. Available at www.fda.gov/cdrh/oivd/laboratory.html#tip2. Accessed on October 6, 2003.
6. Colgan R, Michocki R, Greisman L, Moore TA. Antiviral drugs in the immunocompetent host: part II. Treatment of influenza and respiratory syncytial virus infections. Am Fam Physician 2003;67:763-766.
7. Hoberman A, Greenberg DP, Paradise JL, et al. Effectiveness of inactivated influenza vaccine in preventing acute otitis media in young children: a randomized controlled trial. JAMA 2003;290:1608-1616.
With this flu season, there are new indications for the traditional inactivated (killed) vaccine, a new intranasal vaccine, lab tests for rapid identification of influenza, and a need to review the role of antiviral treatments.
New prevention measures
Inactivated influenza vaccine is the best preventive measure against both type A and B strains of the virus. The vaccine’s effectiveness depends somewhat on how well it matches to circulating virus antigens. Table 1 lists the benefits of vaccination in various populations.
With this flu season, there are new indications for the traditional inactivated (killed) vaccine, a new intranasal vaccine, lab tests for rapid identification of influenza, and a need to review the role of antiviral treatments.
New prevention measures
Inactivated influenza vaccine is the best preventive measure against both type A and B strains of the virus. The vaccine’s effectiveness depends somewhat on how well it matches circulating virus antigens. Table 1 lists the benefits of vaccination in various populations.
Table 2 identifies the usual target populations for vaccine coverage. In the past 5 years, research has yielded several findings: the very young are at excess risk of influenza-related hospitalization; adults aged 50 to 64 years have more high-risk conditions than previously thought; and cost-benefit analyses show a large economic toll of flu outbreaks, manifested mainly as work and school absence. Consequently, the Centers for Disease Control and Prevention (CDC) now recommends routine vaccination of persons older than 50 years and encourages vaccination of children between 6 and 24 months of age. Children younger than 9 years being immunized for the first time must receive 2 doses at least a month apart to gain optimal protection (see the example below). This requirement will make it challenging to immunize children younger than 24 months, who are already receiving a number of other vaccinations.
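As a practical illustration of the two-dose rule for first-time vaccinees younger than 9 years, the minimal sketch below computes the earliest acceptable date for the second dose. The 28-day floor and the example date are assumptions for illustration only, not part of any official schedule.

```python
# Minimal sketch of the spacing rule for children <9 years receiving influenza
# vaccine for the first time: the second dose must follow the first by at least
# a month. A 28-day floor is assumed here to stand in for "a month".
from datetime import date, timedelta

MIN_INTERVAL = timedelta(days=28)  # assumed minimum spacing

def earliest_second_dose(first_dose: date) -> date:
    """Return the earliest acceptable date for the second dose."""
    return first_dose + MIN_INTERVAL

# Hypothetical example: first dose given October 15, 2003
print(earliest_second_dose(date(2003, 10, 15)))  # prints 2003-11-12
```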
This year enough vaccine has been produced to allow both targeted and nontargeted groups to receive inactivated vaccine as soon as it is available.
TABLE 1
Effectiveness of inactivated influenza vaccine
In the patient group… | …the vaccine prevents a potential… |
---|---|
Healthy adults <65 years | 70%–90% of influenza illness |
Children 1–15 years | 77%–91% of influenza respiratory illness; no evidence that it prevents otitis media7 |
Adults >65 years | 58% of influenza respiratory illness; 30%–70% of hospitalizations for pneumonia and flu |
Adults >65 years in nursing homes | 30%–40% of influenza illness; 50%–60% of hospitalizations; 80% of deaths |
TABLE 2
Persons who should receive inactivated influenza vaccine
Recommendations to date | New recommendations |
---|---|
FluMist
The US Food and Drug Administration (FDA) recently approved FluMist, an intranasal vaccine containing live, attenuated influenza virus that is effective against both type A and B strains. It is indicated for healthy persons aged 5 to 49 years. In this group, FluMist is an alternative to the traditional inactivated vaccine, but it is more expensive at $46 a dose (compared with $6 to $10 for inactivated vaccine). Unvaccinated children 5 to 8 years of age should receive 2 doses 6 to 10 weeks apart.4
People with chronic conditions such as asthma, cardiovascular disease, diabetes, and known or suspected immunodeficiency should not receive this vaccine until additional data are acquired about its effectiveness in these situations. In addition, because FluMist contains live influenza viruses, there is a potential for transmission from the vaccinated person to other persons. Therefore, clinicians should be cautious in its use when a patient requiring vaccination lives with immunosuppressed persons.
The rate of serious side effects with FluMist has been <1%, although mild side effects such as runny nose, fever, and headache occur slightly more often among vaccine than placebo recipients.
Improved diagnostic tests
The development of new outpatient treatments for influenza has increased the desirability of making an accurate diagnosis. Clinical symptoms, particularly fever and cough, are somewhat helpful (in adults, sensitivity is 63%–78% and specificity is 55%–71%). Diagnostic accuracy is enhanced by awareness of active flu cases in your community. This information is available from local or state health departments and the CDC, and it is based on active surveillance through networks of sentinel physician practices and emergency rooms. This is a good example of a reliable surveillance system helping physicians provide better clinical care.
Since 1989, a concerted public health effort has increased flu vaccine usage among adults older than 65 years from 33% to 66% in 1999. Like many successful population health programs, this improvement has resulted from a focus on the core functions of public health:
- Assessment—regular surveys of vaccine coverage and local influenza outbreaks, continual identification of high-risk groups, and studies of vaccine efficacy and cost-effectiveness.
- Assurance—media campaigns to heighten awareness among consumers and providers of the benefits of vaccination and increased access to vaccine through physician offices, health departments, other health care worksites, and non-traditional sites such as malls and drug stores.
- Policy development—Medicare coverage of vaccine costs since 1993, Healthy People 2000 and 2010 national goals, and marketing campaigns to increase vaccine coverage supported by the Public Health Service in partnership with private organizations.
In the past several years, the FDA has approved an array of rapid diagnostic tests that may improve medical decision-making. Approved tests are now available for Clinical Laboratory Improvement Act (CLIA)-waived labs (QuickVue Influenza A/B; ZstatFlu) and for nonwaived labs (BD Directigen Flu A+B; BD Directigen Flu A).
Nasal washes or swabs, not throat swabs, are the best method for obtaining specimens.
Compared with viral culture (the gold standard), reported sensitivities for these tests are 62%–73%, and specificities are 80%–99%.5
A study conducted in a private practice reported sensitivities of 72%–95% and specificities of 76%–84%.3 In this mainly pediatric population, the prevalence of influenza was about 50% by culture, and the positive predictive value (the likelihood that a positive test indicates true disease) ranged from 80%–86%, with a negative predictive value (the likelihood that a negative test indicates absence of disease) of 77%–90%.
In both studies, QuickVue was the best-performing CLIA-waived test; the ZstatFlu test did not perform as well as the others. The tests generally give results in under 15 minutes; the exception is ZstatFlu, which is more cumbersome to use. Since the prevalence of a condition in the population influences the predictive value of tests, all the tests perform better at finding true disease during active flu outbreaks than at the beginning or end of outbreaks, when patients are less likely to have influenza.
The tests range in price from $15 to $25.
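The arithmetic behind these predictive values is easy to reproduce. The short sketch below is a minimal illustration only, not a calculation from the cited studies: it assumes a rapid test with 70% sensitivity and 90% specificity (values within the ranges reported above) and applies Bayes’ rule to show how the positive predictive value falls as influenza prevalence drops at the edges of an outbreak.

```python
# Minimal sketch: how prevalence drives the predictive value of a rapid flu test.
# The sensitivity/specificity figures are illustrative values chosen from the
# ranges quoted in the text, not results from the cited studies.

def predictive_values(sensitivity: float, specificity: float, prevalence: float):
    """Return (positive predictive value, negative predictive value)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    false_neg = (1 - sensitivity) * prevalence
    true_neg = specificity * (1 - prevalence)
    ppv = true_pos / (true_pos + false_pos)
    npv = true_neg / (true_neg + false_neg)
    return ppv, npv

if __name__ == "__main__":
    sens, spec = 0.70, 0.90  # assumed rapid-test performance
    for prevalence in (0.05, 0.25, 0.50):  # shoulder of the season vs. peak outbreak
        ppv, npv = predictive_values(sens, spec, prevalence)
        print(f"prevalence {prevalence:.0%}: PPV {ppv:.0%}, NPV {npv:.0%}")
```

With these assumed figures, a 50% prevalence yields a positive predictive value close to the 80%–86% reported in the pediatric practice study, whereas at 5% prevalence a positive result is more likely to be a false positive than a true one.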
Antiviral treatments
Amantadine and rimantadine can reduce the duration of uncomplicated influenza A by about 1 day when started within 2 days of the onset of illness.
The newer drugs, zanamivir and oseltamivir, can reduce the duration of uncomplicated influenza A and B by about 1 day compared with placebo. Data are limited regarding the benefits of these drugs for patients at high risk of serious complications, or for children, although 1 study has shown a decrease in the incidence of otitis media among children taking oseltamivir.
Zanamivir is approved for adults and for children older than 7 years. It is administered via inhalation twice a day and costs $48 for the standard 5-day treatment.
Oseltamivir is approved for adults and for children older than 1 year. It is taken orally twice daily, with dose based on age and weight. It costs $60 for a 5-day treatment.
Prophylaxis. Amantadine and rimantadine are approved for prophylaxis against influenza A and prevent 70%–90% of cases. Oseltamivir is approved for prophylaxis in persons older than 13 years. When used prophylactically, these drugs must be taken daily for the duration of influenza activity in the community. This can mean taking medication for weeks, which would be quite expensive in the case of oseltamivir.
Correspondence
1601 Parkview Avenue, Rockford, IL 61107. E-mail: [email protected].
1. Bridges CB, Harper SA, Fukuda K, Uyeki TM, Cox NJ, Singleton JA; Advisory Committee on Immunization Practices. Prevention and control of influenza: recommendations of the Advisory Committee on Immunization Practices (ACIP). MMWR Recomm Rep 2003;52(RR-8):1-34.
2. Montalto N. An office-based approach to influenza: clinical diagnosis and laboratory testing. Am Fam Physician 2003;67:111-118.
3. Rodriguez W, Schwartz R, Thorne M. Evaluation of diagnostic tests for influenza in a pediatric practice. Pediatr Infect Dis J 2002;21:193-196.
4. Harper SA, Fukuda K, Cox NJ, Bridges CB. Using live, attenuated influenza vaccine for prevention and control of influenza: supplemental recommendations of the Advisory Committee on Immunization Practices (ACIP). MMWR Recomm Rep 2003;52(RR-13):1-8.
5. US Food and Drug Administration. Performance and caution in using rapid influenza virus diagnostic tests. Available at www.fda.gov/cdrh/oivd/laboratory.html#tip2. Accessed on October 6, 2003.
6. Colgan R, Michocki R, Greisman L, Moore TA. Antiviral drugs in the immunocompetent host: part II. Treatment of influenza and respiratory syncytial virus infections. Am Fam Physician 2003;67:763-766.
7. Hoberman A, Greenberg DP, Paradise JL, et al. Effectiveness of inactivated influenza vaccine in preventing acute otitis media in young children: a randomized controlled trial. JAMA 2003;290:1608-1616.
What FPs need to know about West Nile virus disease
The Centers for Disease Control and Prevention (CDC) reports that West Nile virus infection in humans or animals has occurred in most states, and mosquito bite is now known to be just 1 of several means of virus transmission. Because many infected persons are asymptomatic, controlling the spread is even more difficult. At the time this issue went to press, August 19, a total of 599 human cases and 11 deaths had been reported to the CDC by state and local health authorities. By August 30 of last year, a total of 638 cases and 31 deaths had been reported; for the entire year, there were 4156 lab-positive human cases and 284 deaths.
This article describes transmission, diagnosis, treatment, and prevention of West Nile virus infection.
How the virus spreads
The virus, an RNA virus from the Flaviviridae family, is maintained in a bird-mosquito-bird cycle that begins in the spring when mosquitoes emerge and ends in early fall when they become dormant. By mid to late summer, the virus population has sufficiently amplified in both these hosts. At this point, other mosquitoes act as "bridge vectors" that bite both humans and birds, thus initiating West Nile virus infection in humans.
“Advice from your doctor: How to prevent West Nile virus infection,” may be photocopied for distribution to patients.
Avian mortality is documented in 162 North American species. It approaches 100% in laboratory-infected crows (making crows an important marker for the spread of West Nile virus in a specific community). House sparrows may develop high-level viremia for several days without dying, making them important amplifying hosts. Viremia in humans is low-grade and short-lived.
Transmission to humans: new routes discovered
Most West Nile virus infections in humans result from mosquito bites, but other mechanisms have been discovered: needle sticks in lab workers, possibly breast milk (1 case), and possible transplacental transmission to the fetus (1 case).
More importantly, there is evidence of transmission through contaminated blood. Since many people infected with West Nile virus are asymptomatic, screening donors by clinical history is inadequate. In June 2003, blood-testing centers began screening the blood supply for West Nile virus using an experimental kit approved by the FDA.
What you can do to prevent West Nile virus infection
Here are practical steps you can take to avoid being infected with West Nile virus, and to help stop the spread of disease.
To prevent mosquito bites
- Use an insect repellent that contains DEET at concentrations less than 50% for adults and less than 10% for children.
- Wear long sleeves and socks when outdoors.
- Be particularly careful during evening and early morning hours, when mosquitoes are most active.
Mosquito-proof your home
- Drain standing water regularly to minimize mosquito breeding grounds.
- Install or repair screens on windows and doors.
Report dead birds to local health authorities so they can better monitor the activity of West Nile virus transmission.
Clinical course
Like many other viral infections, West Nile virus infection manifests in several ways.
Asymptomatic infection. Infection is not clinically apparent in most people.
Mild infection. About 20% of persons infected exhibit West Nile fever, a mild illness that follows an incubation period of 3 to 14 days.
The syndrome lasts 3 to 6 days, and is characterized by a sudden febrile illness often with an array of nonspecific signs and symptoms such as malaise, anorexia, nausea and vomiting, eye pain, headache, myalgia, rash, and lymphadenopathy.
Severe infection. About 1 in 150 infected persons develops severe neurologic disease (most commonly encephalitis or meningitis).
Being older than 50 years is the most significant risk factor for neurologic disease. In hospitalized patients, the most common symptoms accompanying neurologic disease are fever, weakness, gastrointestinal symptoms, and mental status changes. A smaller number of patients have a maculopapular or morbilliform rash of the trunk or extremities.
Outcomes. Advanced age is the most important risk factor for death, with mortality reaching 20% among patients older than 70 years. Unfortunately, there appears to be substantial neurologic morbidity for those surviving hospitalization.
The US case-fatality rate in 2002 was 9% in patients with meningoencephalitis.
Diagnosis
Diagnosis of symptomatic West Nile virus disease is based on clinical findings, a suggestive epidemiologic context, and specific laboratory test results.
Clinical and epidemiologic clues. Unexplained encephalitis or meningitis, particularly in a patient older than 50 years, during the summer or early fall, should trigger suspicion. Evidence of locally active disease in birds or humans, or recent travel to an area of known active disease, increases suspicion.
MAC-ELISA test and PanBio assay. The best test for diagnosis is detection of immunoglobulin M (IgM) antibody to West Nile virus in serum or cerebrospinal fluid within 8 days of illness onset (MAC-ELISA test, available through local and state health departments).
The FDA recently approved a commercial product, the PanBio West Nile virus IgM assay, which correctly identified the antibody in 90%–99% of cases. IgM antibody does not cross the blood-brain barrier, so its detection in cerebrospinal fluid is presumptive evidence of central nervous system infection.
Other laboratory findings include:
- normal or increased white blood cell counts, sometimes with lymphopenia or anemia
- occasional hyponatremia, especially in patients with encephalitis
- cerebrospinal fluid pleocytosis (usually lymphocytic), increased protein and normal glucose
- normal computed tomography brain scans
- abnormal magnetic resonance images in one third of patients.
Treatment is supportive
Treatment of severe disease is supportive. No evidence indicates efficacy of ribavirin, interferon, steroids, or other agents.
How to use public health resources
Prevention of West Nile virus disease will require both clinical and public health efforts. A good surveillance system is vital, providing clinicians and the community with knowledge about disease activity in birds and humans.
- Local or state health departments must coordinate, investigate, and track reports of dead birds by community members.
- Clinicians must notify the health department about suspected infections in humans.
- By publicizing the results of an active surveillance program, the health department assists clinicians in identifying cases more quickly and helps motivate the community to take appropriate preventive measures.
In late August 1999, an infectious disease specialist reported 2 patients with encephalitis at 1 hospital in Queens to the New York City Department of Health. An ensuing investigation revealed 6 additional cases at nearby hospitals. The illnesses were characterized by fever, severe muscle weakness (7 of 8 persons), and flaccid paralysis (4 of 8). Cerebrospinal fluid test results suggested viral infection.
So began the saga of human West Nile virus in the United States.
The virus was first isolated from a patient in Uganda, and is now distributed throughout Africa, the Middle East, parts of Europe, southwestern Asia, and Australia. Disease outbreaks in other parts of the world were infrequent until 1996.
West Nile virus is thought to have come to North America from Israel, but it is not clear how. Since 1999, the virus has spread rapidly throughout the US. Interestingly, the number of human cases reported annually was low (20–60) until 2002, when more than 4000 cases were reported. Only 9 continental states had avoided human cases of West Nile virus, and only 4 had reported no human or animal cases.
Correspondence
1601 Parkview Avenue, Rockford, IL 61107. E-mail: [email protected]
SOURCES
Centers for Disease Control and Prevention. West Nile virus Web page. Available at: www.cdc.gov/ncidod/dvbid/westnile/index.htm. Accessed on August 19, 2003.
Nash D, Mostashari F, Fine A, et al. The outbreak of West Nile virus infection in the New York City area in 1999. N Engl J Med 2001; 344:1807-1814.
Petersen L, Marfin A, Gubler D. West Nile virus. JAMA 2003; 290:524-528.
US Food and Drug Administration. West Nile virus Web page. Available at: www.fda.gov/oc/opacom/hottopics/westnile.html. Accessed on August 11, 2003.
SARS: Lessons learned thus far
The speed with which public health agencies such as the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC) have addressed the outbreak of severe acute respiratory syndrome (SARS) has been impressive. Working with academic epidemiologists and researchers, they appear to have identified a new virus as the likely causative agent, characterized some of the basic epidemiology and clinical course of the infection, and developed confirmatory lab tests.
Understanding the SARS story is important both for its medical implications and the public health principles it illustrates. This article summarizes key points about SARS, mainly using reference material from the CDC.1
Diagnosis
Infection with the SARS virus produces a range of clinical responses:
- Asymptomatic or mild respiratory illness
- Moderate illness: temperature >100.4°F (>38°C) and 1 or more clinical findings such as cough, shortness of breath, or hypoxia
- Severe illness: the above findings plus radiographic evidence of pneumonia or respiratory distress syndrome, or autopsy findings of pneumonia or respiratory distress syndrome without an identifiable cause.
Suspect SARS when patients presenting with any of the above symptoms meet one of these epidemiologic criteria:
- having traveled to areas under CDC travel alerts or advisories
- having had close contact within 10 days of developing symptoms with a person known or suspected to have SARS.
When evaluating such patients, use careful hand hygiene and precautions against airborne transmission (N-95 respirator or standard face mask if this is not available) and direct contact (gloves, gowns).
Probable cases (severe respiratory illness of unknown cause with onset since February 1, 2003, plus the epidemiologic criteria, with or without laboratory criteria) and suspected cases (the same criteria, but with moderate respiratory illness only) should be reported to local or state health departments.
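As a summary of the reporting logic just described, the sketch below encodes the probable and suspected case definitions. The function name and boolean input are hypothetical conveniences for illustration, not CDC terminology.

```python
# Hedged sketch of the reporting categories described in the text. The inputs
# and names are illustrative; the category logic follows the article's definitions.
from typing import Optional

def sars_report_category(severity: str, meets_epi_criteria: bool) -> Optional[str]:
    """
    severity: "mild", "moderate", or "severe" respiratory illness of unknown
        cause with onset since February 1, 2003
    meets_epi_criteria: travel to an area under a CDC travel alert or advisory,
        or close contact with a known or suspected SARS case within 10 days of
        symptom onset
    Returns the category to report to the local or state health department,
    or None if neither definition is met.
    """
    if not meets_epi_criteria:
        return None
    if severity == "severe":
        return "probable case"   # reportable with or without laboratory criteria
    if severity == "moderate":
        return "suspected case"
    return None

# Hypothetical example: pneumonia on chest x-ray plus recent travel to an advisory area
print(sars_report_category("severe", meets_epi_criteria=True))  # probable case
```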
Diagnostic testing
Diagnostic testing should include chest x-ray, pulse oximetry, blood cultures, sputum Gram stain and culture, and testing for viral pathogens (influenza and respiratory syncytial virus).
Legionella and pneumococcal urine antigen testing can be considered. Acute serum and convalescent serum (collected 21 days later) should be saved for laboratory testing.
In May, the CDC announced the development of an enzyme-linked immunosorbent assay (ELISA) blood test to identify antibody to the presumed SARS virus. The test is now available to local and state health departments for acute and convalescent testing of patients’ serum. A more sensitive polymerase chain reaction test is under development.
Treat with supportive measures
No specific treatment exists for SARS. Treat patients as you would any community-acquired pneumonia of unknown origin and provide supportive therapy as necessary. Hospitalization should be based on the usual indications.
Traditional infection control methods can work
Most importantly, public health departments have demonstrated that traditional infection control measures such as surveillance and isolation/quarantine may be successful in limiting the spread of the infection. Physicians should be aware of these important concepts.
The incubation period for SARS is believed to be up to 10 days, during which time people are not thought to be contagious. Transmission is believed to occur mainly during close face-to-face contact, such as happens in households or patient-care settings. Aerosol or airborne transmission is also a possibility, although it is believed to be much less likely.
Surveillance is the system and process of monitoring for specific conditions. Infectious disease surveillance requires the cooperation of local, state, and federal health departments, private and public laboratories, and clinicians working in private and public settings. A definition of the condition being monitored and a method of identifying and reporting cases are necessary. To maximize surveillance, it helps to have a reporting requirement such as we have for diseases like tuberculosis or measles.
On February 11, 2003, the WHO was first informed by Chinese health authorities of 305 cases of acute respiratory syndrome in Guangdong province in southern China. As it turned out, these cases had begun in November 2002, and the disease was characterized by transmission to health care workers and household contacts.
People who visited China in this period and were exposed to SARS became unwitting carriers. Disease outbreaks occurred subsequently in Vietnam, Hong Kong, Singapore, Toronto, and Taiwan. In April, as international pressure increased, Chinese authorities began acknowledging the wider extent of the SARS outbreak, including hundreds of cases in Beijing and smaller numbers in other parts of the country.
As of June 16, 2003, the WHO had reported 8460 cases of SARS worldwide with a death rate of 9%.2 At that time, the CDC had reported 72 probable and 329 suspected cases in the US, almost all travel-related.1
Theoretical source: civet cats
In April, scientific teams from several countries identified a new coronavirus as the likely cause of SARS. This family of viruses had previously been identified as a cause of mild upper respiratory illnesses. In mid-April, the CDC and others announced they had sequenced the genome of the specific coronavirus thought to cause SARS. In late May, researchers from Hong Kong and China announced they had discovered a virtually identical virus in the civet, a tree-dwelling animal that is eaten as wild game in southern China, where SARS is believed to have started. One theory is that the virus may have lived in animals and passed to humans through a mutation or other mechanism.3
Both isolation of suspected cases and quarantine of contacts have been used to control SARS. In the US, quarantine is usually implemented voluntarily, but, for certain conditions such as SARS, people can be quarantined involuntarily. In the case of a communicable disease such as SARS for which there is no known treatment and which can spread readily under certain circumstances, the strategies of isolation and quarantine are even more important.
A need for better defenses
As of early June, the areas most affected were mainland China, Hong Kong, and Taiwan. These areas were subject to a CDC travel advisory, meaning people should travel there only for essential business. In addition, the CDC issued a travel alert for Singapore and re-issued one for Toronto after the city failed to contain the initial outbreak. Alerts advise travelers who have visited a SARS-affected area to seek medical attention if they become ill within 10 days.
Strategies against SARS. While SARS appears to have been brought under control in certain areas (Hanoi and maybe Singapore), this has not happened in others. To date, the US has been spared a serious outbreak. Use of strategies such as travel alerts and advisories, screening airline passengers from affected countries, and heightened vigilance in following up suspected cases and exposures have all helped.
Another emerging infection: monkeypox
As SARS was being contained, an infectious disease new to the US erupted: monkeypox. On June 16, when the number of cases stood at 82 persons in 5 states, the federal government banned the sale and distribution of prairie dogs and all rodents from Africa in an effort to control the spread. Monkeypox is believed to have spread from an African rat imported by a pet store and housed with prairie dogs for sale to the public.
Most infected persons had direct contact with diseased prairie dogs that had been purchased as pets. In some instances, however, direct contact with infected animals could not be documented; therefore, health officials cannot rule out the possibility of human-to-human transmission of the monkeypox virus.
Monkeypox was first identified in monkeys in 1959, but certain African rodents were later identified as its real host. Outbreaks in people occurred in the Congo in the 1990s.
The Centers for Disease Control and Prevention issued an interim case definition for human cases of monkeypox and a recommendation that certain individuals be offered smallpox vaccination for protection (available at http://www.cdc.gov/ncidod/monkeypox/casedefinition.htm).
Our best defense
Continued emergence of infectious diseases and the dramatic international spread of SARS through airline travel and close contact in hospitals should prompt us to strengthen our public health systems. A well-functioning surveillance system, coupled with the infrastructure to apply traditional techniques such as case finding, tracking, isolation, and quarantine—and bans, as in the case of monkeypox—may be our best defense against communicable disease epidemics.
Correspondence
1601 Parkview Avenue, Rockford, IL 61107. E-mail: [email protected].
1. Centers for Disease Control and Prevention. Severe acute respiratory syndrome (SARS). Available at: www.cdc.gov/od/oc/media/sars.htm. Accessed on June 17, 2003.
2. World Health Organization. Cumulative number of reported probable cases of SARS. Available at: www.who.int/csr/sars/country/2003_06_16/en. Accessed on June 17, 2003.
3. Bradsher K, Crampton T. Hong Kong travel advisory lifted; cat virus tied to SARS. New York Times. May 23, 2003.
The speed with which public health agencies such as the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC) have addressed the outbreak of severe acute respiratory syndrome (SARS) has been impressive. Working with academic epidemiologists and researchers, they appear to have identified a new virus as the likely causative agent, characterized some of the basic epidemiology and clinical course of the infection, and developed confirmatory lab tests.
Understanding the SARS story is important both for its medical implications and the public health principles it illustrates. This article summarizes key points about SARS, mainly using reference material from the CDC.1
Diagnosis
Infection with the SARS virus produces a range of clinical responses:
- Asymptomatic or mild respiratory illness
- Moderate illness: temperature >100.4°F (>38°C) and 1 or more clinical findings such as cough, shortness of breath, or hypoxia
- Severe illness: the above findings plus radiographic evidence of pneumonia or respiratory distress syndrome, or autopsy findings of pneumonia or respiratory distress syndrome without an identifiable cause.
Suspect SARS when a patient presenting with any of the above symptoms meets one of these epidemiologic criteria:
- having traveled to areas under CDC travel alerts or advisories
- having had close contact within 10 days of developing symptoms with a person known or suspected to have SARS.
When evaluating such patients, use careful hand hygiene and precautions against airborne transmission (N-95 respirator or standard face mask if this is not available) and direct contact (gloves, gowns).
Probable cases (severe respiratory illness of unknown cause with onset since February 1, 2003, plus the epidemiologic criteria, with or without laboratory criteria) and suspected cases (the same criteria, but with moderate rather than severe respiratory illness) should be reported to local or state health departments.
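To make the reporting logic above concrete, here is a minimal Python sketch of the clinical-plus-epidemiologic decision just described. It is an illustration only: the class, function, and field names are hypothetical and not part of any actual CDC or health-department system.

```python
# Hypothetical sketch of the case-classification logic described above.
# Field and function names are illustrative, not from any real surveillance tool.
from dataclasses import dataclass

@dataclass
class Patient:
    illness: str                      # "asymptomatic", "mild", "moderate", or "severe"
    traveled_to_affected_area: bool   # travel to an area under a CDC alert or advisory
    close_contact_with_case: bool     # close contact within 10 days of symptom onset

def classify_sars(p: Patient) -> str:
    """Return 'probable', 'suspected', or 'not reportable' per the criteria above."""
    epi_link = p.traveled_to_affected_area or p.close_contact_with_case
    if not epi_link:
        return "not reportable"
    if p.illness == "severe":
        return "probable"    # report to the local or state health department
    if p.illness == "moderate":
        return "suspected"   # report to the local or state health department
    return "not reportable"

# Example: a returning traveler with fever, cough, and pneumonia on chest x-ray
print(classify_sars(Patient("severe", True, False)))  # -> "probable"
```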
Diagnostic testing
Diagnostic testing should include chest x-ray, pulse oximetry, blood cultures, sputum gram stain and culture, and testing for viral pathogens (influenza and respiratory syncytial virus).
Legionella and pneumococcal urine antigen testing can be considered. Acute and convalescent (collected at least 21 days after symptom onset) serum should be saved for laboratory testing.
In May, the CDC announced the development of an enzyme-linked immunosorbent assay (ELISA) blood test to identify antibody to the presumed SARS virus. The test is now available to local and state health departments for acute and convalescent testing of patients’ serum. A more sensitive polymerase chain reaction test is under development.
Treat with supportive measures
No specific treatment exists for SARS. Treat patients as you would any community-acquired pneumonia of unknown origin and provide supportive therapy as necessary. Hospitalization should be based on the usual indications.
Traditional infection control methods can work
Most importantly, public health departments have demonstrated that traditional infection control measures such as surveillance and isolation/quarantine may be successful in limiting the spread of the infection. Physicians should be aware of these important concepts.
The incubation period for SARS is believed to be up to 10 days, and people are not thought to be contagious during this period. Transmission is believed to occur mainly during close face-to-face contact, such as happens in households or patient-care settings. Aerosol or airborne transmission is also a possibility, although it is believed to be much less likely.
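As a small arithmetic illustration of the 10-day window used both here and in the travel-alert advice, the sketch below computes the last day of a hypothetical symptom-monitoring period after exposure; the date and function name are invented for the example.

```python
# Toy illustration of the 10-day incubation window mentioned above; dates are hypothetical.
from datetime import date, timedelta

INCUBATION_DAYS = 10  # presumed upper bound of the SARS incubation period

def monitoring_end(last_exposure: date) -> date:
    """Last day on which new symptoms would still fall within the incubation window."""
    return last_exposure + timedelta(days=INCUBATION_DAYS)

print(monitoring_end(date(2003, 6, 1)))  # -> 2003-06-11
```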
Surveillance is the system and process of monitoring for specific conditions. Infectious disease surveillance requires the cooperation of local, state, and federal health departments, private and public laboratories, and clinicians working in private and public settings. A definition of the condition being monitored and a method of identifying and reporting cases are necessary. To maximize surveillance, it helps to have a reporting requirement such as we have for diseases like tuberculosis or measles.
On February 11, 2003, the WHO was first informed by Chinese health authorities of 305 cases of acute respiratory syndrome in Guangdong province in southern China. As it turned out, these cases had begun in November 2002, and the disease was characterized by transmission to health-care workers and household contacts.
People who visited China in this period and were exposed to SARS became unwitting carriers. Disease outbreaks occurred subsequently in Vietnam, Hong Kong, Singapore, Toronto, and Taiwan. In April, as international pressure increased, Chinese authorities began acknowledging the wider extent of the SARS outbreak, including hundreds of cases in Beijing and smaller numbers in other parts of the country.
As of June 16, 2003, the WHO had reported 8460 cases of SARS worldwide with a death rate of 9%.2 At that time, the CDC had reported 72 probable and 329 suspected cases in the US, almost all travel-related.1
Theoretical source: civet cats
In April, scientific teams from several countries identified a new coronavirus as the likely cause of SARS. This family of viruses had previously been identified as a cause of mild upper respiratory illnesses. In mid-April, CDC and others announced they had sequenced the genome of the specific coronavirus thought to cause SARS. In late May, researchers from Hong Kong and China announced they had discovered a virtually identical virus in a species of tree-dwelling cat, the civet, that is eaten as wild game in southern China, where SARS is believed to have started. One theory is that the virus may have lived in animals and passed to humans through a mutation or other mechanism.3
Both isolation of suspected cases and quarantine of contacts have been used to control SARS. In the US, quarantine is usually implemented voluntarily, but, for certain conditions such as SARS, people can be quarantined involuntarily. In the case of a communicable disease such as SARS for which there is no known treatment and which can spread readily under certain circumstances, the strategies of isolation and quarantine are even more important.
A need for better defenses
As of early June, the countries most affected were mainland China, Hong Kong, and Taiwan. These areas were subject to a CDC travel advisory, which advises people to travel there only for essential business. In addition, the CDC issued a travel alert for Singapore and re-issued one for Toronto after the city failed to contain its initial outbreak. Alerts advise travelers who have visited a specific SARS-affected area to seek medical attention if they become ill within 10 days.
Strategies against SARS. While SARS appears to have been brought under control in certain areas (Hanoi and perhaps Singapore), this has not happened everywhere. To date, the US has been spared a serious outbreak. Strategies such as travel alerts and advisories, screening of airline passengers from affected countries, and heightened vigilance in following up suspected cases and exposures have all helped.
Another emerging infection: monkeypox
As SARS was being contained, an infectious disease new to the US erupted: monkeypox. On June 16, when the number of cases stood at 82 persons in 5 states, the federal government banned the sale and distribution of prairie dogs and all rodents from Africa in an effort to control the spread. Monkeypox is believed to have spread from an African rat imported by a pet store and housed with prairie dogs for sale to the public.
Most infected persons had direct contact with diseased prairie dogs that had been purchased as pets. In some instances, however, direct contact with infected animals could not be documented; therefore, health officials cannot rule out the possibility of human-to-human transmission of the monkeypox virus.
Monkeypox was first identified in monkeys in 1959, but certain African rodents were later identified as its natural reservoir. Outbreaks in people occurred in the Congo in the 1990s.
The CDC issued an interim case definition for human cases of monkeypox and a recommendation that certain individuals be offered smallpox vaccination for protection (available at http://www.cdc.gov/ncidod/monkeypox/casedefinition.htm).
Our best defense
Continued emergence of infectious diseases and the dramatic spread of SARS internationally through airline travel and close contact in hospitals should prompt us to strengthen our public health systems. A well functioning surveillance system coupled with the infrastructure to apply traditional techniques such as case finding, tracking, isolation, quarantine—and bans, as in the case of monkeypox—may be our best defense against communicable disease epidemics.
Correspondence
1601 Parkview Avenue, Rockford, IL 61107. E-mail: [email protected].
1. Centers for Disease Control and Prevention. Severe acute respiratory syndrome (SARS). Available at: www.cdc.gov/od/oc/media/sars.htm. Accessed on June 17, 2003.
2. World Health Organization. Cumulative number of reported probable cases of SARS. Available at: www.who.int/csr/sars/country/2003_06_16/en. Accessed on June 17, 2003.
3. Bradsher K, Crampton T. Hong Kong travel advisory lifted; cat virus tied to SARS. New York Times. May 23, 2003.
HRT and vitamins C and E do not improve coronary disease in women
Hormone replacement therapy (HRT) and antioxidant vitamin supplements (vitamins E and C) do not provide cardiovascular benefit for postmenopausal women with known coronary heart disease. Moreover, a potential for harm exists with each of the treatments. Therefore, neither should be prescribed specifically for cardiovascular benefit for postmenopausal women with coronary heart disease.
Treatment of Hyperlipidemia
- The new NCEP III provides revised guidelines for the treatment of hyperlipidemia.
- Combining traditional risk factor assessment with the calculated 10-year risk of coronary artery disease allows for optimal patient-centered counseling.
- Statins are normally the first-line therapy for hyperlipidemia.
In 1995 and 1996, US adults made more than 18 million office visits for the evaluation and treatment of hyperlipidemia, including 3.4% of all visits to family physicians. Among visits to family physicians, 4.1% included measurement of cholesterol levels.1 Overall, mean total cholesterol levels decreased from 220 mg/dL in 1960–1962 to 203 mg/dL in 1988–1994. During the same period, the proportion of adults with elevated total cholesterol levels (>240 mg/dL) decreased from 32% to 19%.2 Despite this progress, the availability of more effective drugs, guidelines advocating increasingly aggressive treatment, and population-wide goals established in Healthy People 2010 will continue to increase the number of patients seen by family physicians for screening, diagnosis, and treatment of hyperlipidemia.3
When to treat
The National Cholesterol Education Program (NCEP), a program of the National Institutes of Health’s National Heart, Lung, and Blood Institute, published a guideline in 1993 for screening and treating hyperlipidemia. Physicians have since become familiar with the NCEP concept of basing treatment decisions on assessment of patient risk factors (smoking, age, diabetes, hypertension, family history of early coronary artery disease [CAD]) and application of algorithms linked to desired low-density lipoprotein (LDL) cholesterol levels. The advantage of this strategy is its simplicity. Physicians assess whether the NCEP risk factors are present and then work with their patients to achieve the desired LDL level through lifestyle modification, drug therapy, or both.
Unfortunately, that NCEP guideline did not assess an individual’s actual risk of CAD. In its recently released Third Report, the NCEP addresses this gap by incorporating the Framingham tables to calculate the 10-year risk of developing clinical CAD from a patient’s individual risk factors, including cholesterol levels (Table 1).4 This new NCEP III guideline recommends traditional risk factor counting coupled, in certain situations, with the 10-year risk derived from the Framingham scoring system.
TABLE 1
FRAMINGHAM TABLES FOR CALCULATING CORONARY ARTERY DISEASE RISK
Therapy is based on the individual patient’s risk category and LDL level (Figure). Patients whose 10-year risk is greater than 20% or who have CAD-equivalent conditions (ie, diabetes, peripheral arterial disease, abdominal aortic aneurysm, symptomatic carotid artery disease) are considered to have a risk equivalent to that of patients with known CAD; all have an LDL goal of 100 mg/dL or less.
For those with a 10-year CAD risk less than 20%, the number of positive risk factors determines the LDL goal. This new method allows physicians to communicate with their patients more clearly about individual risk and enhances shared decision making. While the NCEP III report is based on extensive literature review, the recommendations of its expert panel are not characterized according to the strength of the supporting evidence, as is done by the US Preventive Services Task Force.
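A minimal Python sketch of this goal-setting logic is shown below. The 100 mg/dL goal comes from the text above; the 130 and 160 mg/dL goals for the lower-risk categories are the published ATP III values, added here for completeness, and the function name is hypothetical, so verify against the full guideline before any clinical use.

```python
# Minimal sketch of the NCEP III LDL-goal logic outlined above. The <=100 mg/dL
# goal is stated in the text; the 130 and 160 mg/dL thresholds are the published
# ATP III goals for the lower-risk categories (included for completeness).

def ldl_goal_mg_dl(ten_year_risk_pct: float, cad_or_equivalent: bool,
                   num_risk_factors: int) -> int:
    """Return the LDL-cholesterol goal (mg/dL) for a patient's risk category."""
    if cad_or_equivalent or ten_year_risk_pct > 20:
        return 100   # CAD, CAD equivalent, or 10-year risk > 20%
    if num_risk_factors >= 2:
        return 130   # 2 or more risk factors with 10-year risk <= 20%
    return 160       # 0-1 risk factors

# Example: diabetes is a CAD equivalent, so the LDL goal is 100 mg/dL
print(ldl_goal_mg_dl(ten_year_risk_pct=12, cad_or_equivalent=True, num_risk_factors=1))
```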
Figure
TREATMENT STRATEGY BASED ON LDL LEVEL AND RISK CATEGORY
Explaining treatment benefits
The NCEP III report does not make explicit the effect of the treatment on the patient; that is, how much the proposed treatment will reduce the risk of CAD. This determination depends in part on whether the patient being treated has known CAD or a CAD-equivalent condition (secondary prevention) or no known CAD (primary prevention). The benefits of treatment have been most clearly quantified for drug treatment and are most easily evaluated using the number needed to treat (NNT). The NNT refers to the number of patients who would have to be treated for 5 years to prevent 1 CAD event. Physicians may use NNTs to assist patients in determining their preferences for treatment, bearing in mind that the NNT describes an outcome for a population, such as men with high cholesterol levels; for any given individual, the outcome is all or none. Nonetheless, patients may find the NNT a useful way to weigh their personal values in making treatment decisions.
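The arithmetic behind the NNT is simply the reciprocal of the absolute risk reduction (ARR) over the trial period. The short calculation below applies it to the primary-prevention statin figures from Table 3; small differences from the table’s NNTs reflect rounding of the published ARRs.

```python
# NNT = 1 / ARR over the fixed (5-year) trial period.
for arr_pct in (2.0, 2.3):                      # absolute risk reduction, in percent (Table 3)
    nnt = 1 / (arr_pct / 100)
    print(f"ARR {arr_pct}% -> NNT ~{nnt:.0f}")  # ~50 and ~43, close to the 44-49 in Table 3
```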
Treatment
Lifestyle modification
Diet modification is the cornerstone of therapy for mild to moderate hyperlipidemia. Modifying the diet is also recommended along with pharmacologic therapy in people at higher risk of CAD. NCEP III recommends a “therapeutic lifestyle changes” diet that limits cholesterol to < 200 mg per day and, as a percentage of total calories, provides < 7% saturated fat, 25% to 35% total fat, 50% to 60% carbohydrates, and approximately 15% protein.4
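To see what those percentages mean in practice, the short calculation below converts them into gram targets for a hypothetical 2,000-kcal/day diet, using the standard 9 kcal/g for fat and 4 kcal/g for carbohydrate and protein; the calorie level is illustrative, not a recommendation from the guideline.

```python
# Illustration only: NCEP III "therapeutic lifestyle changes" percentages as gram
# targets for a hypothetical 2,000-kcal/day diet (9 kcal/g fat; 4 kcal/g carbohydrate/protein).
total_kcal = 2000
sat_fat_g   = 0.07 * total_kcal / 9                            # < 7% of calories
total_fat_g = (0.25 * total_kcal / 9, 0.35 * total_kcal / 9)   # 25%-35% of calories
carbs_g     = (0.50 * total_kcal / 4, 0.60 * total_kcal / 4)   # 50%-60% of calories
protein_g   = 0.15 * total_kcal / 4                            # ~15% of calories

print(f"saturated fat: < {sat_fat_g:.0f} g")                      # ~16 g
print(f"total fat: {total_fat_g[0]:.0f}-{total_fat_g[1]:.0f} g")  # ~56-78 g
print(f"carbohydrate: {carbs_g[0]:.0f}-{carbs_g[1]:.0f} g")       # 250-300 g
print(f"protein: ~{protein_g:.0f} g")                             # ~75 g
```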
Although diet therapy has shown a modest reduction in total cholesterol in clinical trials, no clear evidence shows that a diet low in saturated fat and cholesterol will reduce cardiovascular morbidity and mortality.5,6 Many people find it difficult to change their dietary habits and to maintain healthier ones. Systematic reviews of observational studies have found that increased consumption of fruits and vegetables is associated with a lower incidence of heart attack and stroke. However, the potential for bias and confounding in such studies makes them less convincing than randomized controlled trials (RCTs).5
The Ornish program, in which patients with CAD or CAD-equivalent conditions pursue intense lifestyle modification for up to 3 years, has shown that revascularization procedures can be avoided. Treatment groups that ate a very-low-fat diet, received stress-management intervention, and followed a prescribed exercise program showed improvement in angina symptoms similar to that of the revascularization group. Another trial showed regression of atherosclerotic plaque on angiograms.7-9
Other nonpharmacologic options include plant stanols (2 grams/day) and soluble fiber (10 to 25 grams/day) to reduce LDL cholesterol. Plant stanols have a structure similar to that of cholesterol and interfere with cholesterol absorption when eaten along with a typical diet, resulting in reduction of blood cholesterol levels. Plant stanols and sterols can be found in certain margarines and salad oils and can be taken with each meal as substitutes for other sources of dietary fat.10-12 No RCTs have shown that these substances reduce cardiovascular events or overall mortality.
Herbal products and dietary supplements
A survey found that as many as 50% of respondents with elevated cholesterol levels would prefer an over-the-counter product such as garlic, yeast, or soy products.13 Studies of products promoted for lipid-lowering effects were found to have a modest effect on lipid levels13-18 (Table 2); however, no RCTs were found that assessed patient-oriented outcomes. Because herbal products and supplements have modest effects on lipid levels and because long-term safety data are lacking, such products should be used with caution for treatment of hyperlipidemia.
TABLE 2
PHARMACOLOGIC AND NONPHARMACOLOGIC INTERVENTIONS
Strength of Recommendation* | Treatment | Type of Benefit | Cost Per Month ($) | Comments |
---|---|---|---|---|
A | Statins | OM, CVM | 40–110 | Well tolerated |
B | Fibric acids | CVE | 60–70 | All male subjects in both primary and secondary trials |
B | Niacin | CVE | 10–80 | Watch for adverse reactions (flushing, elevated glucose, liver function tests) |
B | Bile acid resin | CVE | 40–60 | Ideal agent for patients with severe liver disease; watch for drug interactions |
B | Lifestyle modification | Lipid | Varies | No strong evidence from randomized clinical trials on primary prevention of major coronary events or mortality |
B | Soy products | Lipid | 20 | FDA has approved labeling soy products for cholesterol reduction |
B | Red yeast | Lipid | 20–30 | Active ingredient is lovastatin; should be treated as lovastatin |
B | Plant stanols | Lipid | 20–30 | Substitute for other source of fat calories; must be taken with each meal |
C | Fish oils | Lipid | 5–10 | Use with caution because of high caloric value and cholesterol content in products; may increase cholesterol level with long-term use |
C | Garlic | Lipid | 10–20 | Conflicting results with clinical trials |
C | Green tea | Lipid | 15 | Epidemiologic study data |
* Criteria correspond to US Preventive Services Task Force categories (A = strong evidence to support recommendation, B = fair evidence to support recommendation, C = insufficient evidence to recommend for or against). CVE denotes reduction in cardiovascular events; CVM, reduction in cardiovascular mortality; lipid, reduction in lipid levels only; OM, reduction in overall mortality. |
Pharmacologic treatment
Clinical trials of hyperlipidemia therapy should address outcomes that matter most to patients, such as morbidity, mortality, quality of life, and cost, rather than stressing disease-oriented evidence, such as the ability to reduce cholesterol levels. For this review we identified major long-term RCTs that included significant coronary events or mortality as the primary outcomes. Table 3 summarizes the results of primary and secondary prevention studies.
TABLE 3
PHARMACOLOGIC INTERVENTION
Intervention | ARR for Major Coronary Events (%) | NNT for Major Coronary Events | NNT for All-Cause Mortality | Comments |
---|---|---|---|---|
Primary Prevention | ||||
Statins | 2.0–2.3 | 44–49 | NS | Studies on normal and hypercholesterolemic patients. Mean age was 47–58 years; all patients were men except for 1 statin study that included a small number of women |
Gemfibrozil | 1.4 | 71 | NS | |
Cholestyramine | 1.7 | 59 | NS | |
Secondary Prevention | ||||
Statins | 3–3.6 | 28–33 | 24–28 | Mean age was 55–64 years. Participants were male except for the 3 statin studies and the bezafibrate study that enrolled a small number of women. Cholesterol eligibility criteria varied among the studies and included patients with normal or elevated total and LDL levels or low HDL levels |
Gemfibrozil | 4.4 | 23 | NS | |
Bezafibrate | 1.4 | 71 | NS | |
Niacin | 6.2 | 17 | NS | |
ARR denotes absolute risk reduction in percent; NNT, number needed to treat for 5 years to prevent 1 adverse outcome; NS, not significant. |
Primary prevention
Primary prevention studies have investigated the treatment of middle-aged men with hyperlipidemia and of men and women with average cholesterol levels.19-23 Results showed similar positive outcomes on reducing coronary events in all groups (Table 3). A systematic review and a meta-analysis of primary prevention studies also demonstrated that drug therapy reduced cholesterol levels and resulted in statistically significant lowering of cardiovascular events in the treated group compared with placebo without any significant reduction in overall mortality.5,24 Absolute risk reductions ranged from 1.4% to 2.3%. In other words, the number of patients that would have to be treated for 5 years to prevent a single major coronary event was 44 to 49 for the statins, 71 for gemfibrozil, and 59 for cholestyramine.
Secondary prevention
In secondary prevention trials, RCTs have demonstrated a strong, consistent relationship between cholesterol lowering and the reduction of risk for a coronary event (Table 3).25-30 Patients with preexisting CAD and elevated or average lipid levels benefit from medical therapy. The relative risk of cardiovascular events was reduced by an average of 30% in the active treatment groups.
In these trials, the NNT for 5 years to prevent 1 coronary heart event or nonfatal myocardial infarction (MI) was 28 to 33 for statins, 23 for gemfibrozil, 71 for bezafibrate, and 17 for niacin. There was also a significant risk reduction for all-cause mortality in the statin trials.27,28 These data support the recommendations from NCEP III to treat patients with preexisting CAD aggressively. People with diabetes should receive similar treatment because they are more prone to the development of new CAD within 10 years.4 In addition, subgroup analyses of diabetics treated with statins in primary prevention trials demonstrated a decreased risk of cardiovascular events.26,29
While cholesterol-modifying agents include 4 different classes—statins, fibric acid derivatives, bile acid resins, and nicotinic acid—studies cited in this paper predominantly involved statins and fibric acids. In systematic reviews of both primary and secondary prevention trials, statins were the most effective agents for both cholesterol lowering and cardiovascular risk reduction.5 We found no RCTs that directly compared outcomes between cholesterol-lowering medications. Although women represented a small number of participants in these trials, a meta-analysis showed that statin therapy decreased their risk of heart disease, with an NNT of 31 for reduction of major coronary events.31 No evidence was found to support the effectiveness of hyperlipidemia therapy for people aged more than 75 years. For people aged 65 to 75 years, there is evidence to support drug therapy for secondary prevention but not for primary prevention.
Statins are well tolerated; the most common adverse reactions are gastrointestinal and occur in approximately 3% of patients. The more serious but uncommon events associated with the use of statins are hepatitis and myopathy. Asymptomatic increases in hepatic transaminases to more than 3 times the upper limit of normal occur in approximately 1% of patients.32 Therapy can be discontinued for 1 to 2 weeks; enzyme levels should return to normal if the elevations are medication related. It is not necessary to stop therapy when enzyme elevations are less than 3 times the upper limit of normal.
General guidelines on liver monitoring call for performing a baseline liver function test and repeating it 6 weeks later.33 Once a stable dose has been established, the manufacturer recommends periodic testing; however, no clear evidence supports a specific interval. Clinicians may choose to individualize decisions on testing frequency based on factors such as potential drug interactions (statins with fibric acids or niacin) or the presence of conditions that increase the risk of liver disease.34
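The transaminase rule described above lends itself to a simple decision sketch. The helper below is hypothetical: the function name and the default upper limit of normal (40 U/L) are assumptions for illustration, and the actual limit depends on the reporting laboratory.

```python
# Hypothetical helper reflecting the transaminase rule described above.
# The default ULN of 40 U/L is an assumption; use the reporting lab's value.

def statin_lft_action(alt_u_l: float, uln_u_l: float = 40.0) -> str:
    """Suggest an action based on ALT relative to the upper limit of normal (ULN)."""
    if alt_u_l > 3 * uln_u_l:
        return "hold the statin for 1-2 weeks and recheck; resume if enzymes normalize"
    return "continue the statin; elevations under 3x ULN do not require stopping"

print(statin_lft_action(alt_u_l=150))  # 150 > 3 x 40 -> hold and recheck
```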
Myopathy, defined as generalized muscle aches and pain with a serum creatine kinase level greater than 1000 U/L, occurs rarely (< 0.1%) but may be more likely when statins are used concomitantly with medications such as fibric acids, antifungals, erythromycin, or cyclosporine.31,35 The best preventive strategy is to educate patients about early recognition of the signs and symptoms of myopathy. Because most statins are metabolized by cytochrome P450 3A4, any medication that inhibits this enzyme can increase statin serum levels and thus the risk of hepatotoxicity and myopathy.
NCEP III recommends statins as first-line therapy. A standard dose of a statin decreases LDL levels by 20% to 50%, increases HDL levels by 5% to 10%, and reduces triglyceride levels by 10% to 20%. Atorvastatin and simvastatin can produce the largest reductions in LDL levels: up to 50%. Only pravastatin, simvastatin, and lovastatin have been studied in long-term RCTs of primary and secondary prevention. Atorvastatin had positive benefits in a short-term secondary prevention trial.36 Unfortunately, the only head-to-head comparisons of statins have looked at disease-oriented outcomes such as lipid levels.37 Statins are convenient for patients, requiring only a single daily dose, usually taken in the evening because cholesterol synthesis is more active at night. Atorvastatin can be given at any time of day because of its long half-life.
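As a quick illustration of those percentage reductions, the calculation below applies the quoted standard-dose range to a hypothetical baseline LDL of 160 mg/dL; the baseline value is invented for the example.

```python
# Illustration of the 20%-50% LDL reductions quoted above, applied to a
# hypothetical baseline LDL of 160 mg/dL.
baseline_ldl = 160                       # mg/dL (invented for the example)
for pct_reduction in (20, 35, 50):
    on_treatment = baseline_ldl * (1 - pct_reduction / 100)
    print(f"{pct_reduction}% reduction -> LDL {on_treatment:.0f} mg/dL")
# 20% -> 128, 35% -> 104, 50% -> 80; in this example only the largest reduction
# reaches the <=100 mg/dL goal for patients with CAD or a CAD-equivalent condition.
```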
Gemfibrozil, a fibric acid, is often used to treat hypertriglyceridemia and as an adjunct to statin therapy. It decreases triglycerides by 40% to 50% but has minimal effects on the rest of the lipid panel. Adverse effects are generally mild. Liver function monitoring is recommended. The usual dosage regimen for fibric acids is twice daily, and the dose should be adjusted for renal function.
Niacin can increase HDL by 30% and decrease triglycerides by 30% and LDL by 20%. Major adverse reactions include flushing, gastrointestinal symptoms, and elevations of liver function tests, uric acid, and serum glucose levels. The newer longer-acting formulation has been associated with less flushing. Another class, the bile acid resins, which include cholestyramine and colestipol, may play an adjunctive role in therapy. Their effect on the lipid panel is mild compared with those of the other classes, and they can increase triglyceride levels. Many patients find the gritty taste of the granular formulation unpalatable. The bile acid resins have a favorable safety profile; most adverse events occur locally in the gut.
Conclusions
The emergence of statins as a safe and effective, although costly, therapy for hyperlipidemia and the development of clinical guidelines advocating their increased use will place family physicians under added pressure to screen for and treat hyperlipidemia. While the general value of lifestyle changes is recognized in national recommendations, more effective ways for physicians to implement them successfully in ambulatory settings are needed.
An optimal evidence-based approach to hyperlipidemia uses the new NCEP III guideline, which combines traditional risk factor assessment with assessment for CAD using the Framingham tables to determine LDL goals and appropriate treatment modalities. Statins are first-line agents for patients who are candidates for drug therapy. Discussions between clinicians and patients of the NNTs for primary and secondary prevention will help foster patient-centered discussions on the role of medical, economic, and quality-of-life issues in the decision-making process.
1. Schappert SM, Nelson C. National Ambulatory Medical Care Survey: 1995–1996 summary. National Center for Health Statistics. Vital Health Stat 1999;13(142).
2. National Center for Health Statistics. Health, United States, 1999, with health and aging chartbook. Hyattsville, Md: 1999.
3. US Department of Health and Human Services. Tracking Healthy People 2010. Washington, DC: US Government Printing Office. November 2000. Available at: http://www.cdc.gov/hchs/hphome.htm.
4. Third report of the National Cholesterol Education Program (NCEP) Expert Panel on Detection, Evaluation and Treatment of High Blood Cholesterol in Adults (Adult Treatment Panel III). Executive summary. Available at: http://www.nhlbi.nih.gov/guidelines/cholesterol/atp3xsum.pdf. Accessed April 16, 2001.
5. Clinical Evidence. London, England: BMJ Publishing Group; June 2001. Available at: www.clinicalevidence.org.
6. Henkin Y, Shai I, Zuk R, et al. Dietary treatment of hyperlipidemia: Do dietitians do it better? A randomized, controlled trial. Am J Med 2000;109:549-55.
7. Ornish D. Avoiding revascularization with lifestyle changes: the multicenter lifestyle demonstration project. Am J Cardiol 1998;82:72T-76T.
8. Ornish D, Scherwitz LW, Billings JH, et al. Intensive lifestyle changes for reversal of coronary heart disease. JAMA 1998;280:2001-7.
9. Gould KL, Ornish D, Scherwitz L, et al. Changes in myocardial perfusion abnormalities by positron emission tomography after long-term, intense risk factor modification. JAMA 1995;274:894-901.
10. Mensink RP, Plat J. Efficacy of dietary plant stanols. In: New developments in the dietary management of high cholesterol. New York: McGraw-Hill; 1998;27-31.
11. Blair SN, Capuzzi DM, Gottlieb SO, et al. Incremental reduction of serum total cholesterol and LDL with the addition of plant stanol ester-containing spread to statin therapy. Am J Cardiol 2000;86:46-52.
12. Miettinen TA, Puska P, Gylling H, et al. Reduction of serum cholesterol with sitostanol-ester margarine in a mildly hypercholesteremic population. N Engl J Med 1995;333:1308-12.
13. Caron MF, White CM. Evaluation of the antihyperlipidemic properties of dietary supplements. Pharmacotherapy 2001;21:481-7.
14. Harris WS. Nonpharmacologic treatment of hypertriglyceridemia: focus on fish oils. Clin Cardiol 1999;22(suppl 2):II40-3.
15. Stevinson C, Pittler MH, Ernst E. Garlic for treating hyperlipidemia. Ann Intern Med 2000;133:420-9.
16. EBM Reviews. Database of abstracts of reviews of effectiveness [database online]. Psyllium-enriched cereals lower blood total cholesterol and LDL cholesterol, but not HDL cholesterol, in hypercholesterolemic adults: results of a meta-analysis. July 2001; v1, accession no. 00125498-100000000-00737. Available at: http://www.ovid.com/products/databases. Accessed Oct. 29, 2001.
17. EBM Reviews. ACP Journal Club [database online]. Soy protein intake decreases total and LDL cholesterol and triglyceride levels. March/April 1996; 124:41, accession no. 00021607-199603000-00013. Available at: http://www.ovid.com/products/databases. Accessed April 4, 2001.
18. Jellin JM, Batz F, Hitchens K. Natural Medicines Comprehensive Database, 3rd ed. Stockton, Calif: Therapeutic Research Faculty; 2000.
19. Frick MH, Elo O, Haapa K, et al. Helsinki heart study: primary-prevention trial with gemfibrozil in middle-aged men with dyslipidemia. N Engl J Med 1987;317:1237-45.
20. The lipid research clinics coronary primary prevention trial results I. Reduction in incidence of coronary heart disease. JAMA 1984;251:351-64.
21. The lipid research clinics coronary primary prevention trial results II. The relationship of reduction in incidence of coronary heart disease to cholesterol lowering. JAMA 1984;251:365-74.
22. Shepherd J, Cobbe SM, Ford I, et al. Prevention of coronary heart disease with pravastatin in men with hyperlipidemia. N Engl J Med 1995;333:1301-6.
23. Downs JR, Clearfield M, Weis S, et al. Primary prevention of acute coronary events with lovastatin in men and women with average cholesterol levels. JAMA 1998;279:1615-22.
24. Pignone M, Phillips C, Mulrow C. Use of lipid lowering drugs for primary prevention of coronary heart disease: meta-analysis of randomized trials. BMJ 2000;321:983-5.
25. Secondary prevention by raising HDL cholesterol and reducing triglycerides in patients with coronary artery disease: the Bezafibrate Infarction Prevention (BIP) Study. Circulation 2000;102:21-7.
26. Rubins HB, Robins SJ, Collins D, et al. Gemfibrozil for the secondary prevention of coronary heart disease in men with low levels of HDL-cholesterol. N Engl J Med 1999;341:410-8.
27. Randomised trial of cholesterol lowering in 4444 patients with coronary heart disease: the Scandinavian simvastatin survival study (4S). Lancet 1994;344:1383-9.
28. Prevention of cardiovascular events and death with pravastatin in patients with coronary heart disease and a broad range of initial cholesterol levels. The Long-Term Intervention with Pravastatin in Ischaemic Disease (LIPID) Study Group. N Engl J Med 1998;339:1349-57.
29. Sacks FM, Pfeffer MA, Moye LA, et al. The effect of pravastatin on coronary events after myocardial infarction in patients with average cholesterol levels. N Engl J Med 1996;335:1001-9.
30. Canner PL, Berge KG, Wenger NK, et al. Fifteen year mortality in coronary drug project patients: long-term benefit with niacin. J Am Coll Cardiol 1986;8:1245-55.
31. LaRosa JC, He J, Vupputuri S. Effect of statins on risk of coronary disease: a meta-analysis of randomized controlled trials. JAMA 1999;282:2340-6.
32. Hsu I, Spinler SA, Johnson N. Comparative evaluation of the safety and efficacy of HMG-CoA reductase inhibitor monotherapy in the treatment of primary hyperlipidemia. Ann Pharmacother 1995;29:743-59.
33. Tice SA, Parry D. Medications that require hepatic monitoring. Hosp Pharm 2001;36:456-64.
34. Weismantel D. What lab monitoring is appropriate to detect adverse drug reactions in patients on cholesterol-lowering agents? J Fam Pract 2001;50:927.
35. American College of Clinical Pharmacy. PSAP: pharmacotherapy self-assessment program, 4th ed. Kansas City, Mo: ACCP; 2001;66-7.
36. Schwartz GG, Olsson AG, Ezekowitz MD, et al. Effects of atorvastatin on early recurrent ischemic events in acute coronary syndromes. JAMA 2001;285:1711-8.
37. Jones P, Kafonek S, Laurora I, et al. Comparative dose efficacy study of atorvastatin vs. simvastatin, pravastatin, lovastatin, and fluvastatin in patients with hyperlipidemia (the CURVES study). Am J Cardiol 1998;81:582-7.
- The new NCEP III provides revised guidelines for the treatment of hyperlipidemia.
- Combining traditional risk factor assessment with the calculated 10-year risk of coronary artery disease allows for optimal patient-centered counseling.
- Statins are normally the first-line therapy for hyperlipidemia.
In 1995 and 1996, US adults made more than 18 million office visits for the evaluation and treatment of hyperlipidemia, including 3.4% of all visits to family physicians. Among visits to family physicians, 4.1% included measurement of cholesterol levels.1 Overall, mean cholesterol levels decreased from 220 in 1960–1962 to 203 in 1988–1994. During the same time period, the proportion of adults with elevated total cholesterol levels (> 240) decreased from 32% to 19%.2 Despite this progress, the availability of more effective drugs, guidelines advocating increasingly aggressive treatment, and populationwide goals established in Healthy People 2010 will continue to increase the number of patients seen by family physicians for screening, diagnosis, and treatment of hyperlipidemia.3
When to treat
The National Cholesterol Education Program (NCEP), a program within the National Institute of Health’s Heart, Lung, and Blood Institute, published a guideline in 1993 for screening and treating hyperlipidemia. Physicians have since become familiar with the NCEP concept of basing treatment decisions on assessment of patient risk factors (smoking, age, diabetes, hypertension, family history of early coronary artery disease [CAD]) and application of algorithms linked to desired low-density lipoprotein (LDL) cholesterol levels. The advantage of this strategy is its simplicity. Physicians assess whether the NCEP risk factors are present and then work with their patients to achieve the desired LDL level through lifestyle modification, drug therapy, or both.
Unfortunately, the NCEP guideline did not assess the individual’s actual risk of CAD. In its recently released Third Report, the NCEP has recognized the value of this strategy by incorporating the Framingham tables to calculate the 10-year risk of developing clinical CAD based on a patient’s individual risk factors, including cholesterol levels (Table 1).4 This new NCEP III guideline recommends traditional risk factor counting coupled, in certain situations, with the 10-year risk derived from the Framingham scoring system.
TABLE 1
FRAMINGHAM TABLES FOR CALCULATING CORONARY ARTERY DISEASE RISK
Therapy is based on the individual patient’s risk category and LDL levels (Figure). Patients whose 10-year risk is greater than 20% or those who have CAD-equivalent conditions (ie, diabetes, peripheral arterial disease, abdominal aortic aneurysm, symptomatic carotid artery disease) are considered to have a risk equivalent to that of patients with known CAD; all have an LDL goal of 100 or less.
For those with a 10-year CAD risk less than 20%, the number of positive risk factors determines the LDL goal. This new method allows physicians to communicate with their patients more clearly about individual risk and enhances shared decision making. While the NCEP III report is based on extensive literature review, the recommendations of its expert panel are not characterized according to the strength of the supporting evidence, as is done by the US Preventive Services Task Force.
Figure
TREATMENT STRATEGY BASED ON LDL LEVEL AND RISK CATEGORY
Explaining treatment benefits
The NCEP III report does not make explicit the effect of the treatment on the patient; that is, how much the proposed treatment will reduce the risk of CAD. This determination depends in part on whether the patient being treated has known CAD or a CAD-equivalent condition (secondary prevention) versus no known CAD (primary prevention). The benefits of treatment have been most clearly quantified for drug treatment and are most easily evaluated using the number needed to treat (NNT). The NNT refers to the number of patients who would have to be treated for 5 years to prevent 1 CAD event. Physicians may use the NNTs to assist patients in determining their preferences for treatment, bearing in mind that the NNT refers to an outcome for a population, such as men with high cholesterol levels. For a given individual, their risk of an adverse outcome is all or none. Nonetheless, patients may find the NNT a useful way to assess their personal values in making treatment decisions.
Treatment
Lifestyle modification
Diet modification is the cornerstone of therapy for mild to moderate hyperlipidemia. Modifying the diet is also recommended along with pharmacologic therapy in people at higher risk of CAD. NCEP III recommends a diet for “therapeutic lifestyle changes” that includes < 200 mg cholesterol per day, < 7% saturated fat, 25% to 35% total fat, 50% to 60% carbohydrates, and 15% protein of total calories.4
Although diet therapy has shown a modest redution in total cholesterol in clinical trials, no clear evidence shows that a diet low in saturated fat and cholesterol will reduce cardiovascular morbidity and mortality.5,6 Many people find it difficult to change their dietary habits and to maintain healthier ones. Systematic reviews of observational studies have found that increased consumption of fruits and vegetables is associated with lower incidence of heart attack and stroke. However, the potential for bias and confounding factors in such studies makes them less convincing than randomized controlled trials (RCTs).5
The Ornish program, in which CAD or CAD-equivalent patients pursue intense lifestyle modification for up to 3 years, has shown that revascularization procedures can be avoided. Treatment groups that ate a very-low-fat diet, received intervention on stress management, and followed a prescribed exercise program showed similar improvement in angina symptoms versus the revascularization group. Another trial showed regression of atherosclerotic plaque on angiograms.10-12
Other nonpharmacologic options include plant stanols (2 grams/day) and soluble fiber (10 to 25 grams/day) to reduce LDL-cholesterol. Plant stanols have a structure similar to that of cholesterol and interfere with cholesterol absorption when eaten along with a typical diet, resulting in reduction of blood cholesterol levels. Plant stanols and sterols can be found in certain margarines and salad oils and can be taken with each meal as substitutes for other sources of dietary fat.7-9 No RCTs have shown that these substances reduce cardiovascular events or overall mortality.
Herbal products and dietary supplements
A survey found that as many as 50% of respondents with elevated cholesterol levels would prefer an over-the-counter product such as garlic, yeast, or soy products.13 Studies of products promoted for lipid-lowering effects were found to have a modest effect on lipid levels13-18 (Table 2); however, no RCTs were found that assessed patient-oriented outcomes. Because herbal products and supplements have modest effects on lipid levels and because long-term safety data are lacking, such products should be used with caution for treatment of hyperlipidemia.
TABLE 2
PHARMACOLOGIC AND NONPHARMACOLOGIC INTERVENTIONS
Strength of Recommendation* | Treatment | Type of Benefit | Cost Per Month ($) | Comments |
---|---|---|---|---|
A | Statins | OM, CVM | 40–110 | Well tolerated |
B | Fibric acids | CVE | 60–70 | All male subjects in both primary and secondary trials |
B | Niacin | CVE | 10–80 | Watch for adverse reactions (flushing, elevated glucose, liver function tests) |
B | Bile acid resin | CVE | 40–60 | Ideal agent for patients with severe liver disease; watch for drug interactions |
B | Lifestyle modification | Lipid | Varies | No strong evidence from randomized clinical trials on primary prevention of major coronary events or mortality |
B | Soy products | Lipid | 20 | FDA has approved labeling soy products for cholesterol reduction |
B | Red yeast | Lipid | 20–30 | Active ingredient is lovastatin; should be treated as lovastatin |
B | Plant stanols | Lipid | 20–30 | Substitute for other source of fat calories; must be taken with each meal |
C | Fish oils | Lipid | 5–10 | Use with caution because of high caloric value and cholesterol content in products; may increase cholesterol level with long-term use |
C | Garlic | Lipid | 10–20 | Conflicting results with clinical trials |
C | Green tea | Lipid | 15 | Epidemiologic study data |
* Criteria correspond to US Preventive Services Task Force categories (A = strong evidence to support recommendation, B = fair evidence to support recommendation, C = insufficient evidence to recommend for or against). CVE denotes reduction in cardiovascular events; CVM, reduction in cardiovascular mortality; lipid, reduction in lipid levels only; OM, reduction in overall mortality. |
Pharmacologic treatment
Clinical trials of hyperlipidemia therapy should address outcomes that matter most to patients, such as morbidity, mortality, quality of life, and cost, rather than stressing disease-oriented evidence, such as the ability to reduce cholesterol levels. For this review we identified major long-term RCTs that included significant coronary events or mortality as the primary outcomes. Table 3 summarizes the results of primary and secondary prevention studies.
TABLE 3
PHARMACOLOGIC INTERVENTION
Reduction in Risk | ||||
---|---|---|---|---|
Intervention | Major Coronary Events | All-Cause Mortality | Comments | |
ARR (%) | NNT | NNT | ||
Primary Prevention | ||||
Statins | 2.0–2.3 | 44–49 | NS | Studies on normal and hypercholesterolemic patients. Mean age was 47–58 years; all patients were men except for 1 statin study that included a small number of women |
Gemfibrozil | 1.4 | 71 | NS | |
Cholestyramine | 1.7 | 59 | NS | |
Secondary Prevention | ||||
Statins | 3–3.6 | 28–33 | 24–28 | Mean age was 55–64 years. Participants were male except for the 3 statin studies and the benzafibrate study that enrolled a small number of women. Cholesterol eligibility criteria varied among the studies and included patients with normal or elevated total and LDL levels or low HDL levels |
Gemfibrozil | 4.4 | 23 | NS | |
Benzafibrate | 1.4 | 71 | NS | |
Niacin | 6.2 | 17 | NS | |
ARR denotes absolute risk reduction in percent; NNT, number of needed to treat for 5 years to prevent 1 adverse outcome; NS, not significant. |
Primary prevention
Primary prevention studies have investigated the treatment of middle-aged men with hyperlipidemia and of men and women with average cholesterol levels.19-23 Results showed similar positive outcomes on reducing coronary events in all groups (Table 3). A systematic review and a meta-analysis of primary prevention studies also demonstrated that drug therapy reduced cholesterol levels and resulted in statistically significant lowering of cardiovascular events in the treated group compared with placebo without any significant reduction in overall mortality.5,24 Absolute risk reductions ranged from 1.4% to 2.3%. In other words, the number of patients that would have to be treated for 5 years to prevent a single major coronary event was 44 to 49 for the statins, 71 for gemfibrozil, and 59 for cholestyramine.
Secondary prevention
In secondary prevention trials, RCTs have demonstrated a strong, consistent relationship between cholesterol lowering and the reduction of risk for a coronary event Table 3.25-30 Patients with preexisting CAD and elevated or average lipid levels benefit from medical therapy. The relative risk of cardiovascular events was reduced by an average of 30% in the active treatment groups.
In these trials, the NNT for 5 years to prevent 1 coronary heart event or nonfatal myocardial infarction (MI) was 28 to 33 for statins, 23 for gemfibrozil, 71 for bezafibrate, and 17 for niacin. There was also a significant risk reduction for all-cause mortality in the statin trials.27,28 These data support the recommendations from NCEP III to treat patients with preexisting CAD aggressively. People with diabetes should receive similar treatment because they are more prone to the development of new CAD within 10 years.4 In addition, subgroup analyses of diabetics treated with statins in primary prevention trials demonstrated a decreased risk of cardiovascular events.26,29
While cholesterol-modifying agents include 4 different classes—statins, fibric acid derivatives, bile acid resins, and nicotinic acid—studies cited in this paper predominantly involved statins and fibric acids. In systematic reviews of both primary and secondary prevention trials, statins were the most effective agents for both cholesterol lowering and cardiovascular risk reduction.5 We found no RCTs that directly compared outcomes between cholesterol-lowering medications. Although women represented a small number of participants in these trials, a meta-analysis showed that statin therapy decreased their risk of heart disease, with an NNT of 31 for reduction of major coronary events.31 No evidence was found to support the effectiveness of hyperlipidemia therapy for people aged more than 75 years. For people aged 65 to 75 years, there is evidence to support drug therapy for secondary prevention but not for primary prevention.
Statins are well tolerated; the most common adverse reactions are gastrointestinal related and occur in approximately 3% of patients. The more serious but uncommon events associated with the use of statins are hepatitis and myopathy. Asymptomatic increases in hepatic transaminases to more than 3 times the upper normal limit occur in approximately 1% of patients.32 Therapy can be discontinued for 1 to 2 weeks; enzyme levels should return to normal if the elevations are medication related. It is not necessary to stop therapy when enzymes are elevated at less than 3 times the upper normal limit.
General guidelines on liver monitoring call for performing a baseline liver function test and repeating it 6 weeks later.33 Once a stable dose has been established, the manufacturer recommends periodic testing; however, no clear evidence supports a specific interval. Clinicians may choose to individualize decisions on testing frequency based on factors such as potential drug interactions (statins with fibric acids or niacin) or the presence of conditions that increase the risk of liver disease.34
Myopathy, defined as generalized muscle aches and pain with a serum creatine kinase level greater than 1000 U/L, occurs rarely (< 0.1%) but may be more likely to occur when statins are used concomitantly with medications such as fibric acid, antifungals, erythromycin, and cyclosporine.31,35 The best preventive strategy is to educate patients about early recognition of the signs and symptoms of myopathy. Because most statins are metabolized by the cytochrome P450-3A4, any medications that inhibit this enzyme can increase statin serum levels and increase the risk of hepatotoxicity and myopathy.
The NCEP III recommends the use of statins as firstline therapy. A standard dose of a statin decreases LDL levels by 20% to 50%, increases HDL levels by 5% to 10%, and reduces triglyceride levels by 10% to 20%. Atorvastatin and simvastatin can produce the highest reductions in LDL levels: up to 50%. Only pravastatin, simvastatin, and lovastatin have been involved in longterm RCTs of primary and secondary prevention. Atorvastatin had positive benefits in a short-term secondary prevention trial.37 Unfortunately, the only head-to-head comparisons of statins have looked at disease-oriented outcomes such as lipid levels.37 Statins are patient friendly. They require a daily evening dose because cholesterol synthesis is more active during the night. Atorvastatin can be given at any time of day because of its long half-life.
Gemfibrozil, a fibric acid, is often used to treat hypertriglyceridemia and as an adjunctive agent to statin therapy. It decreases triglycerides by 40% to 50% but has minimal effects on the rest of the lipid panel. Adverse effects are generally mild. Liver function monitoring is recommended. The usual dosage regimen for fibric acids is 2 times a day and should be adjusted for renal function.
Niacin can increase HDL by 30% and decrease triglycerides by 30% and LDL by 20%. Major adverse reactions include flushing, gastrointestinal symptoms, elevation of liver function tests, uric acid, and serum glucose levels. The new longer-acting formulation has been associated with less flushing. Another class, the bile acid resins, including cholestyramine and colestipol, may play an adjunctive role in therapy. Their effect on the lipid panel is mild compared with those of the other class and they can increase triglyceride levels. Many patients find the gritty taste of the granular formulation unpalatable. The bile acid resins have a favorable safety profile. Most adverse events occur locally in the gut.
Conclusions
The emergence of statins as a safe and effective, although costly, therapy for hyperlipidemia and the development of clinical guidelines advocating their increased use will place family physicians under added pressure to screen for and treat hyperlipidemia. While the general value of lifestyle changes is recognized in national recommendations, more effective ways for physicians to implement them successfully in ambulatory settings are needed.
An optimal evidence-based approach to hyperlipidemia uses the new NCEP III guideline, which combines traditional risk factor assessment with assessment for CAD using the Framingham tables to determine LDL goals and appropriate treatment modalities. Statins are first-line agents for patients who are candidates for drug therapy. Discussions between clinicians and patients of the NNTs for primary and secondary prevention will help foster patient-centered discussions on the role of medical, economic, and quality-of-life issues in the decision-making process.
- The new NCEP III provides revised guidelines for the treatment of hyperlipidemia.
- Combining traditional risk factor assessment with the calculated 10-year risk of coronary artery disease allows for optimal patient-centered counseling.
- Statins are normally the first-line therapy for hyperlipidemia.
In 1995 and 1996, US adults made more than 18 million office visits for the evaluation and treatment of hyperlipidemia, including 3.4% of all visits to family physicians. Among visits to family physicians, 4.1% included measurement of cholesterol levels.1 Overall, mean cholesterol levels decreased from 220 in 1960–1962 to 203 in 1988–1994. During the same time period, the proportion of adults with elevated total cholesterol levels (> 240) decreased from 32% to 19%.2 Despite this progress, the availability of more effective drugs, guidelines advocating increasingly aggressive treatment, and populationwide goals established in Healthy People 2010 will continue to increase the number of patients seen by family physicians for screening, diagnosis, and treatment of hyperlipidemia.3
When to treat
The National Cholesterol Education Program (NCEP), a program within the National Institute of Health’s Heart, Lung, and Blood Institute, published a guideline in 1993 for screening and treating hyperlipidemia. Physicians have since become familiar with the NCEP concept of basing treatment decisions on assessment of patient risk factors (smoking, age, diabetes, hypertension, family history of early coronary artery disease [CAD]) and application of algorithms linked to desired low-density lipoprotein (LDL) cholesterol levels. The advantage of this strategy is its simplicity. Physicians assess whether the NCEP risk factors are present and then work with their patients to achieve the desired LDL level through lifestyle modification, drug therapy, or both.
Unfortunately, the NCEP guideline did not assess the individual’s actual risk of CAD. In its recently released Third Report, the NCEP has recognized the value of this strategy by incorporating the Framingham tables to calculate the 10-year risk of developing clinical CAD based on a patient’s individual risk factors, including cholesterol levels (Table 1).4 This new NCEP III guideline recommends traditional risk factor counting coupled, in certain situations, with the 10-year risk derived from the Framingham scoring system.
TABLE 1
FRAMINGHAM TABLES FOR CALCULATING CORONARY ARTERY DISEASE RISK
Therapy is based on the individual patient’s risk category and LDL levels (Figure). Patients whose 10-year risk is greater than 20% or those who have CAD-equivalent conditions (ie, diabetes, peripheral arterial disease, abdominal aortic aneurysm, symptomatic carotid artery disease) are considered to have a risk equivalent to that of patients with known CAD; all have an LDL goal of 100 or less.
For those with a 10-year CAD risk less than 20%, the number of positive risk factors determines the LDL goal. This new method allows physicians to communicate with their patients more clearly about individual risk and enhances shared decision making. While the NCEP III report is based on extensive literature review, the recommendations of its expert panel are not characterized according to the strength of the supporting evidence, as is done by the US Preventive Services Task Force.
Figure
TREATMENT STRATEGY BASED ON LDL LEVEL AND RISK CATEGORY
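To make the mapping concrete, the following is a minimal sketch (a hypothetical Python helper, not part of the NCEP report) of how risk category translates into the LDL goals summarized in the Figure; the cut points of less than 100, 130, and 160 mg/dL are those of NCEP III, while the function and parameter names are illustrative only.

```python
def ldl_goal_mg_dl(has_cad_or_equivalent: bool,
                   ten_year_risk_pct: float,
                   num_risk_factors: int) -> int:
    """Illustrative mapping of NCEP III risk category to an LDL goal (mg/dL)."""
    # CAD, a CAD-equivalent condition, or 10-year risk > 20%: goal < 100 mg/dL
    if has_cad_or_equivalent or ten_year_risk_pct > 20:
        return 100
    # Two or more risk factors with a 10-year risk of 20% or less: goal < 130 mg/dL
    if num_risk_factors >= 2:
        return 130
    # Zero or one risk factor: goal < 160 mg/dL
    return 160

# Example: a patient with diabetes (a CAD equivalent) has an LDL goal of < 100 mg/dL
print(ldl_goal_mg_dl(has_cad_or_equivalent=True, ten_year_risk_pct=12.0, num_risk_factors=1))
```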
Explaining treatment benefits
The NCEP III report does not make explicit the effect of treatment on the individual patient; that is, how much the proposed treatment will reduce the risk of CAD. This determination depends in part on whether the patient being treated has known CAD or a CAD-equivalent condition (secondary prevention) or no known CAD (primary prevention). The benefits of treatment have been most clearly quantified for drug therapy and are most easily expressed as the number needed to treat (NNT): the number of patients who would have to be treated for 5 years to prevent 1 CAD event. Physicians may use NNTs to help patients determine their preferences for treatment, bearing in mind that the NNT describes an outcome for a population, such as men with high cholesterol levels; for any given individual, the outcome is all or none. Nonetheless, patients may find the NNT a useful way to weigh their personal values when making treatment decisions.
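As a worked illustration (our arithmetic, not the NCEP's), the NNT is simply the reciprocal of the absolute risk reduction (ARR); the short sketch below, using a hypothetical helper function, reproduces the primary prevention NNTs reported later in Table 3 to within rounding.

```python
def nnt(arr_percent: float) -> float:
    """Number needed to treat for 5 years to prevent 1 event = 1 / absolute risk reduction."""
    return 100.0 / arr_percent

# Primary prevention ARRs quoted in Table 3 (percent, over roughly 5 years)
print(round(nnt(1.4)))  # gemfibrozil: ~71
print(round(nnt(1.7)))  # cholestyramine: ~59
print(round(nnt(2.3)))  # statins: ~43 (Table 3 reports 44-49 for ARRs of 2.0-2.3)
```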
Treatment
Lifestyle modification
Diet modification is the cornerstone of therapy for mild to moderate hyperlipidemia and is also recommended alongside pharmacologic therapy in people at higher risk of CAD. NCEP III recommends a diet for “therapeutic lifestyle changes” that includes less than 200 mg of cholesterol per day and, as a proportion of total calories, less than 7% saturated fat, 25% to 35% total fat, 50% to 60% carbohydrate, and approximately 15% protein.4
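To translate these percentages into everyday terms, the sketch below (a hypothetical helper, not from the guideline) converts the therapeutic-lifestyle-changes targets into daily gram amounts for a chosen calorie intake, using the standard conversion factors of 9 kcal/g for fat and 4 kcal/g for carbohydrate and protein.

```python
def tlc_targets(total_kcal: float) -> dict:
    """Convert NCEP III therapeutic-lifestyle-changes percentages into daily targets."""
    return {
        "cholesterol_max_mg": 200,                                    # < 200 mg/day
        "saturated_fat_max_g": round(0.07 * total_kcal / 9),          # < 7% of calories
        "total_fat_g": (round(0.25 * total_kcal / 9),                 # 25%-35% of calories
                        round(0.35 * total_kcal / 9)),
        "carbohydrate_g": (round(0.50 * total_kcal / 4),              # 50%-60% of calories
                           round(0.60 * total_kcal / 4)),
        "protein_approx_g": round(0.15 * total_kcal / 4),             # ~15% of calories
    }

# On a 2,000-kcal diet: saturated fat < ~16 g, total fat 56-78 g,
# carbohydrate 250-300 g, protein ~75 g, cholesterol < 200 mg
print(tlc_targets(2000))
```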
Although diet therapy has shown a modest reduction in total cholesterol in clinical trials, no clear evidence shows that a diet low in saturated fat and cholesterol will reduce cardiovascular morbidity and mortality.5,6 Many people also find it difficult to change their dietary habits and to maintain healthier ones. Systematic reviews of observational studies have found that increased consumption of fruits and vegetables is associated with a lower incidence of heart attack and stroke; however, the potential for bias and confounding in such studies makes them less convincing than randomized controlled trials (RCTs).5
The Ornish program, in which patients with CAD or CAD-equivalent conditions pursue intense lifestyle modification for up to 3 years, has shown that revascularization procedures can be avoided. Treatment groups that ate a very-low-fat diet, received stress-management training, and followed a prescribed exercise program showed improvement in angina symptoms similar to that in patients who underwent revascularization. Another trial showed regression of atherosclerotic plaque on angiography.7-9
Other nonpharmacologic options for reducing LDL cholesterol include plant stanols (2 g/day) and soluble fiber (10 to 25 g/day). Plant stanols have a structure similar to that of cholesterol and interfere with cholesterol absorption when eaten with a typical diet, thereby lowering blood cholesterol levels. Plant stanols and sterols are found in certain margarines and salad oils and can be taken with each meal as substitutes for other sources of dietary fat.10-12 No RCTs have shown that these substances reduce cardiovascular events or overall mortality.
Herbal products and dietary supplements
A survey found that as many as 50% of respondents with elevated cholesterol levels would prefer an over-the-counter product such as garlic, yeast, or soy.13 Studies of products promoted for their lipid-lowering effects have found modest effects on lipid levels13-18 (Table 2); however, no RCTs have assessed patient-oriented outcomes. Because herbal products and supplements have only modest effects on lipid levels and because long-term safety data are lacking, such products should be used with caution in the treatment of hyperlipidemia.
TABLE 2
PHARMACOLOGIC AND NONPHARMACOLOGIC INTERVENTIONS
Strength of Recommendation* | Treatment | Type of Benefit | Cost Per Month ($) | Comments |
---|---|---|---|---|
A | Statins | OM, CVM | 40–110 | Well tolerated |
B | Fibric acids | CVE | 60–70 | All male subjects in both primary and secondary trials |
B | Niacin | CVE | 10–80 | Watch for adverse reactions (flushing, elevated glucose, liver function tests) |
B | Bile acid resin | CVE | 40–60 | Ideal agent for patients with severe liver disease; watch for drug interactions |
B | Lifestyle modification | Lipid | Varies | No strong evidence from randomized clinical trials on primary prevention of major coronary events or mortality |
B | Soy products | Lipid | 20 | FDA has approved labeling soy products for cholesterol reduction |
B | Red yeast | Lipid | 20–30 | Active ingredient is lovastatin; should be treated as lovastatin |
B | Plant stanols | Lipid | 20–30 | Substitute for other source of fat calories; must be taken with each meal |
C | Fish oils | Lipid | 5–10 | Use with caution because of high caloric value and cholesterol content in products; may increase cholesterol level with long-term use |
C | Garlic | Lipid | 10–20 | Conflicting results with clinical trials |
C | Green tea | Lipid | 15 | Epidemiologic study data |
* Criteria correspond to US Preventive Services Task Force categories (A = strong evidence to support recommendation, B = fair evidence to support recommendation, C = insufficient evidence to recommend for or against). CVE denotes reduction in cardiovascular events; CVM, reduction in cardiovascular mortality; lipid, reduction in lipid levels only; OM, reduction in overall mortality.
Pharmacologic treatment
Clinical trials of hyperlipidemia therapy should address outcomes that matter most to patients, such as morbidity, mortality, quality of life, and cost, rather than stressing disease-oriented evidence, such as the ability to reduce cholesterol levels. For this review we identified major long-term RCTs that included significant coronary events or mortality as the primary outcomes. Table 3 summarizes the results of primary and secondary prevention studies.
TABLE 3
PHARMACOLOGIC INTERVENTION
Intervention | Major Coronary Events: ARR (%) | Major Coronary Events: NNT | All-Cause Mortality: NNT | Comments
---|---|---|---|---
Primary Prevention | | | |
Statins | 2.0–2.3 | 44–49 | NS | Studies on normal and hypercholesterolemic patients. Mean age was 47–58 years; all patients were men except for 1 statin study that included a small number of women |
Gemfibrozil | 1.4 | 71 | NS | |
Cholestyramine | 1.7 | 59 | NS | |
Secondary Prevention | | | |
Statins | 3–3.6 | 28–33 | 24–28 | Mean age was 55–64 years. Participants were male except in the 3 statin studies and the bezafibrate study, which enrolled small numbers of women. Cholesterol eligibility criteria varied among the studies and included patients with normal or elevated total and LDL levels or low HDL levels
Gemfibrozil | 4.4 | 23 | NS | |
Bezafibrate | 1.4 | 71 | NS |
Niacin | 6.2 | 17 | NS | |
ARR denotes absolute risk reduction in percent; NNT, number needed to treat for 5 years to prevent 1 adverse outcome; NS, not significant.
Primary prevention
Primary prevention studies have investigated the treatment of middle-aged men with hyperlipidemia and of men and women with average cholesterol levels.19-23 Results showed similar reductions in coronary events across these groups (Table 3). A systematic review and a meta-analysis of primary prevention studies also demonstrated that drug therapy reduced cholesterol levels and produced a statistically significant reduction in cardiovascular events compared with placebo, without any significant reduction in overall mortality.5,24 Absolute risk reductions ranged from 1.4% to 2.3%. In other words, the number of patients who would have to be treated for 5 years to prevent a single major coronary event was 44 to 49 for the statins, 71 for gemfibrozil, and 59 for cholestyramine.
Secondary prevention
In secondary prevention trials, RCTs have demonstrated a strong, consistent relationship between cholesterol lowering and reduction of the risk of a coronary event (Table 3).25-30 Patients with preexisting CAD and elevated or average lipid levels benefit from medical therapy. The relative risk of cardiovascular events was reduced by an average of 30% in the active treatment groups.
In these trials, the NNT for 5 years to prevent 1 coronary heart event or nonfatal myocardial infarction (MI) was 28 to 33 for statins, 23 for gemfibrozil, 71 for bezafibrate, and 17 for niacin. There was also a significant risk reduction for all-cause mortality in the statin trials.27,28 These data support the recommendations from NCEP III to treat patients with preexisting CAD aggressively. People with diabetes should receive similar treatment because they are more prone to the development of new CAD within 10 years.4 In addition, subgroup analyses of diabetics treated with statins in primary prevention trials demonstrated a decreased risk of cardiovascular events.26,29
While cholesterol-modifying agents include 4 classes (statins, fibric acid derivatives, bile acid resins, and nicotinic acid), the studies cited in this paper predominantly involved statins and fibric acids. In systematic reviews of both primary and secondary prevention trials, statins were the most effective agents for both cholesterol lowering and cardiovascular risk reduction.5 We found no RCTs that directly compared outcomes between cholesterol-lowering medications. Although women made up a small proportion of participants in these trials, a meta-analysis showed that statin therapy decreased their risk of heart disease, with an NNT of 31 for the reduction of major coronary events.31 No evidence was found to support the effectiveness of hyperlipidemia therapy in people older than 75 years. For people aged 65 to 75 years, there is evidence to support drug therapy for secondary prevention but not for primary prevention.
Statins are well tolerated; the most common adverse reactions are gastrointestinal and occur in approximately 3% of patients. The more serious but uncommon adverse events associated with statins are hepatitis and myopathy. Asymptomatic increases in hepatic transaminases to more than 3 times the upper limit of normal occur in approximately 1% of patients.32 Therapy can be discontinued for 1 to 2 weeks; enzyme levels should return to normal if the elevations are medication related. It is not necessary to stop therapy when enzyme elevations are less than 3 times the upper limit of normal.
General guidelines on liver monitoring call for performing a baseline liver function test and repeating it 6 weeks later.33 Once a stable dose has been established, the manufacturer recommends periodic testing; however, no clear evidence supports a specific interval. Clinicians may choose to individualize decisions on testing frequency based on factors such as potential drug interactions (statins with fibric acids or niacin) or the presence of conditions that increase the risk of liver disease.34
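The thresholds and testing intervals described above can be summarized as a simple decision rule; the following is a minimal sketch of that logic (a hypothetical helper for illustration, not a validated clinical tool).

```python
def statin_transaminase_action(multiple_of_upper_normal_limit: float) -> str:
    """Illustrative response to an asymptomatic transaminase elevation on a statin."""
    if multiple_of_upper_normal_limit > 3:
        # More than 3 times the upper normal limit: hold the statin and recheck
        return "hold statin for 1-2 weeks; levels should normalize if medication related"
    # 3 times the upper normal limit or less: elevation alone does not require stopping
    return "continue statin; repeat liver function testing as clinically indicated"

print(statin_transaminase_action(3.5))
print(statin_transaminase_action(1.8))
```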
Myopathy, defined as generalized muscle aches and pain with a serum creatine kinase level greater than 1000 U/L, occurs rarely (< 0.1%) but may be more likely when statins are used concomitantly with medications such as fibric acids, antifungals, erythromycin, and cyclosporine.31,35 The best preventive strategy is to educate patients about early recognition of the signs and symptoms of myopathy. Because most statins are metabolized by the cytochrome P450 3A4 isoenzyme, medications that inhibit this enzyme can increase statin serum levels and the risk of hepatotoxicity and myopathy.
The NCEP III recommends statins as first-line therapy. A standard dose of a statin decreases LDL levels by 20% to 50%, increases HDL levels by 5% to 10%, and reduces triglyceride levels by 10% to 20%. Atorvastatin and simvastatin produce the largest reductions in LDL levels, up to 50%. Only pravastatin, simvastatin, and lovastatin have been studied in long-term RCTs of primary and secondary prevention; atorvastatin showed benefit in a short-term secondary prevention trial.36 Unfortunately, the only head-to-head comparisons of statins have examined disease-oriented outcomes such as lipid levels.37 Statins are convenient for patients, requiring a single daily dose, usually taken in the evening because cholesterol synthesis is most active at night. Atorvastatin can be given at any time of day because of its long half-life.
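As a rough worked example (our arithmetic, using the percentage range quoted above; the helper function is hypothetical), the expected on-treatment LDL level can be estimated and compared with the patient's NCEP III goal.

```python
def projected_ldl(baseline_ldl_mg_dl: float, expected_reduction_pct: float) -> float:
    """Estimate on-treatment LDL from a baseline value and an expected percent reduction."""
    return baseline_ldl_mg_dl * (1 - expected_reduction_pct / 100)

# A baseline LDL of 170 mg/dL with a mid-range 35% reduction projects to about 110 mg/dL,
# which meets a < 130 mg/dL goal but not a < 100 mg/dL goal.
print(round(projected_ldl(170, 35)))
```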
Gemfibrozil, a fibric acid, is often used to treat hypertriglyceridemia and as an adjunct to statin therapy. It decreases triglycerides by 40% to 50% but has minimal effects on the rest of the lipid panel. Adverse effects are generally mild, and liver function monitoring is recommended. Fibric acids are usually dosed twice daily, and the dose should be adjusted for renal function.
Niacin can increase HDL by 30% and decrease triglycerides by 30% and LDL by 20%. Major adverse reactions include flushing, gastrointestinal symptoms, and elevations of liver function test results, uric acid levels, and serum glucose levels. A newer longer-acting formulation has been associated with less flushing. Another class, the bile acid resins, which include cholestyramine and colestipol, may play an adjunctive role in therapy. Their effects on the lipid panel are mild compared with those of the other classes, and they can increase triglyceride levels. Many patients find the gritty taste of the granular formulations unpalatable. The bile acid resins have a favorable safety profile; most adverse events occur locally in the gut.
Conclusions
The emergence of statins as a safe and effective, although costly, therapy for hyperlipidemia and the development of clinical guidelines advocating their increased use will place family physicians under added pressure to screen for and treat hyperlipidemia. While the general value of lifestyle changes is recognized in national recommendations, more effective ways for physicians to implement them successfully in ambulatory settings are needed.
An optimal evidence-based approach to hyperlipidemia uses the new NCEP III guideline, which combines traditional risk factor assessment with calculation of 10-year CAD risk from the Framingham tables to determine LDL goals and appropriate treatment modalities. Statins are the first-line agents for patients who are candidates for drug therapy. Discussing the NNTs for primary and secondary prevention with patients will help foster patient-centered decisions about the medical, economic, and quality-of-life issues involved in treatment.
1. Schappert SM, Nelson C. National Ambulatory Medical Care Survey: 1995–1996 summary. National Center for Health Statistics. Vital Health Stat 1999;13(142).
2. National Center for Health Statistics. Health, United States, 1999, with health and aging chartbook. Hyattsville, Md: 1999.
3. US Department of Health and Human Services. Tracking Healthy People 2010. Washington, DC: US Government Printing Office. November 2000. Available at: http://www.cdc.gov/hchs/hphome.htm.
4. Third report of the National Cholesterol Education Program (NCEP) Expert Panel on Detection, Evaluation and Treatment of High Blood Cholesterol in Adults (Adult Treatment Panel III). Executive Summary. Available at: http://www.nhlbi.nih.gov/guidelines/cholesterol/atp3xsum.pdf. Accessed April 16, 2001.
5. Clinical Evidence. London, England: BMJ Publishing Group; June 2001. Available at: www.clinicalevidence.org.
6. Henkin Y, Shai I, Zuk R, et al. Dietary treatment of hyperlipidemia: Do dietitians do it better? A randomized, controlled trial. Am J Med 2000;109:549-55.
7. Ornish D. Avoiding revascularization with lifestyle changes: the multicenter lifestyle demonstration project. Am J Cardiol 1998;82:72T-76T.
8. Ornish D, Scherwitz LW, Billings JH, et al. Intensive lifestyle changes for reversal of coronary heart disease. JAMA 1998;280:2001-7.
9. Gould KL, Ornish D, Scherwitz L, et al. Changes in myocardial perfusion abnormalities by positron emission tomography after long-term, intense risk factor modification. JAMA 1995;274:894-901.
10. Mensink RP, Plat J. Efficacy of dietary plant stanols. In: New developments in the dietary management of high cholesterol. New York: McGraw-Hill; 1998;27-31.
11. Blair SN, Capuzzi DM, Gottlieb SO, et al. Incremental reduction of serum total cholesterol and LDL with the addition of plant stanol ester-containing spread to statin therapy. Am J Cardiol 2000;86:46-52.
12. Miettinen TA, Puska P, Gylling H, et al. Reduction of serum cholesterol with sitostanol-ester margarine in a mildly hypercholesteremic population. N Engl J Med 1995;333:1308-12.
13. Caron MF, White CM. Evaluation of the antihyperlipidemic properties of dietary supplements. Pharmacotherapy 2001;21:481-7.
14. Harris WS. Nonpharmacologic treatment of hypertriglyceridemia: focus on fish oils. Clin Cardiol 1999;22(suppl 2):II40-3.
15. Stevinson C, Pittler MH, Ernst E. Garlic for treating hyperlipidemia. Ann Intern Med 2000;133:420-9.
16. EBM Reviews. Database of abstracts of reviews of effectiveness [database online]. Psyllium-enriched cereals lower blood total cholesterol and LDL cholesterol, but not HDL cholesterol, in hypercholesterolemic adults: results of a meta-analysis. July 2001; v1, accession no. 00125498-100000000-00737. Available at: http://www.ovid.com/products/databases. Accessed Oct. 29, 2001.
17. EBM Reviews. ACP Journal Club [database online]. Soy protein intake decreases total and LDL cholesterol and triglyceride levels. March/April 1996; 124:41, accession no. 00021607-199603000-00013. Available at: http://www.ovid.com/products/databases. Accessed April 4, 2001.
18. Jellin JM, Batz F, Hitchens K. Natural Medicines Comprehensive Database, 3rd ed. Stockton, Calif: Therapeutic Research Faculty; 2000.
19. Frick MH, Elo O, Haapa K, et al. Helsinki heart study: primary-prevention trial with gemfibrozil in middle-aged men with dyslipidemia. N Engl J Med 1987;317:1237-45.
20. The lipid research clinics coronary primary prevention trial results I. Reduction in incidence of coronary heart disease. JAMA 1984;251:351-64.
21. The lipid research clinics coronary primary prevention trial results II. The relationship of reduction in incidence of coronary heart disease to cholesterol lowering. JAMA 1984;251:365-74.
22. Shepherd J, Cobbe SM, Ford I, et al. Prevention of coronary heart disease with pravastatin in men with hyperlipidemia. N Engl J Med 1995;333:1301-6.
23. Downs JR, Clearfield M, Weis S, et al. Primary prevention of acute coronary events with lovastatin in men and women with average cholesterol levels. JAMA 1998;279:1615-22.
24. Pignone M, Phillips C, Mulrow C. Use of lipid lowering drugs for primary prevention of coronary heart disease: meta-analysis of randomized trials. BMJ 2000;321:983-5.
25. Secondary prevention by raising HDL cholesterol and reducing triglycerides in patients with coronary artery disease: the Bezafibrate Infarction Prevention (BIP) Study. Circulation 2000;102:21-7.
26. Rubins HB, Robins SJ, Collins D, et al. Gemfibrozil for the secondary prevention of coronary heart disease in men with low levels of HDL-cholesterol. N Engl J Med 1999;341:410-8.
27. Randomised trial of cholesterol lowering in 4444 patients with coronary heart disease: the Scandinavian simvastatin survival study (4S). Lancet 1994;344:1383-9.
28. Prevention of cardiovascular events and death with pravastatin in patients with coronary heart disease and a broad range of initial cholesterol levels. The Long-Term Intervention with Pravastatin in Ischaemic Disease. N Engl J Med 1998;339:1349-57.
29. Sacks FM, Pfeffer MA, Moye LA, et al. The effect of pravastatin on coronary events after myocardial infarction in patients with average cholesterol levels. N Engl J Med 1996;335:1001-9.
30. Canner PL, Berge KG, Wenger NK, et al. Fifteen year mortality in coronary drug project patients: long-term benefit with niacin. J Am Coll Cardiol 1986;8:1245-55.
31. LaRosa JC, He J, Vupputuri S. Effect of statins on risk of coronary disease: a meta-analysis of randomized controlled trials. JAMA 1999;282:2340-6.
32. Hsu I, Spinler SA, Johnson N. Comparative evaluation of the safety and efficacy of HMG-CoA reductase inhibitor monotherapy in the treatment of primary hyperlipidemia. Ann Pharmacother 1995;29:743-59.
33. Tice SA, Parry D. Medications that require hepatic monitoring. Hosp Pharm 2001;36:456-64.
34. Weismantel D. What lab monitoring is appropriate to detect adverse drug reactions in patients on cholesterol-lowering agents? J Fam Pract 2001;50:927.
35. American College of Clinical Pharmacy. PSAP: pharmacotherapy self-assessment program, 4th ed. Kansas City, Mo: ACCP; 2001;66-7.
36. Schwartz GG, Olsson AG, Ezekowitz MD, et al. Effects of atorvastatin on early recurrent ischemic events in acute coronary syndromes. JAMA 2001;285:1711-8.
37. Jones P, Kafonek S, Laurora I, et al. Comparative dose efficacy study of atorvastatin vs. simvastatin, pravastatin, lovastatin, and fluvastatin in patients with hyperlipidemia (the CURVES study). Am J Cardiol 1998;81:582-7.
Which oral triptans are effective for the treatment of acute migraine?
ABSTRACT
BACKGROUND: Six selective serotonin 5-HT1B/1D agonists (triptans) are currently approved and available in the United States; 1 more may eventually be approved. Although clinicians need evidence of the differences in efficacy and safety of these agents to assist in their prescribing decisions, a lack of head-to-head comparison trials makes this assessment difficult. The authors performed a meta-analysis of multiple trials of oral triptans to determine their relative effectiveness in treating acute migraine.
POPULATION STUDIED: Patients eligible for the studies were aged 18 to 65 years, had moderate to severe migraine, and had pain rated on a 4-point scale (0 = no pain; 3 = most severe pain). A total of 24,000 patients from 53 clinical trials met the criteria. The authors selected 100 mg sumatriptan, the most widely prescribed agent, as the reference dose.
STUDY DESIGN AND VALIDITY: The authors performed a systematic review of published English-language trials and asked the 6 pharmaceutical companies for raw data from published and unpublished trials. Five companies provided data on 6 drugs; the makers of frovatriptan did not. The investigators included studies that (1) were randomized double-blind controlled clinical trials (placebo or active comparison); (2) treated moderate or severe migraine attacks (by International Headache Society criteria) within 8 hours of migraine onset; (3) used an oral triptan at a recommended clinical dose; and (4) evaluated the headache on the 4-point pain scale. The authors excluded 23 studies that lacked a control group, used nonrecommended dosages, or studied special populations. Of the 53 trials reviewed, 31 were placebo-controlled trials and 22 were direct-comparison trials.
OUTCOMES MEASURED: Four outcomes were measured: (1) proportion of patients with a headache response (improvement to mild or no pain 2 hours post dose); (2) sustained pain-free response (2 hours post dose and no recurrence of moderate or severe migraine 2 to 24 hours post dose); (3) consistent effect of a medication over recurrent attacks in the same person; and (4) adverse reactions.
RESULTS: In placebo-controlled trials, 100 mg sumatriptan showed a mean absolute response of 59% and a therapeutic gain (over placebo) of 29% for 2-hour headache response, and 29% and 10%, respectively, for sustained pain-free response. The mean therapeutic harm rate was 13% for at least 1 adverse event, 6% for 1 central nervous system event, and 2% for 1 chest event. On average, patients obtained relief for 2 of 3 consecutive migraines. Only 80 mg eletriptan had a statistically significant advantage over 100 mg sumatriptan for therapeutic gain in 2-hour headache response (number needed to treat [NNT] = 8). For therapeutic gain in 2-hour pain-free response, both 10 mg rizatriptan (NNT = 8) and 80 mg eletriptan (NNT = 13) had a statistically significant advantage. Adverse reaction rates were similar for most triptans but lower for 2.5 mg naratriptan and 12.5 mg almotriptan.
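As a side calculation (ours, not the authors'), the therapeutic gain is the absolute rate difference versus placebo, so it converts directly into an NNT versus placebo; the helper below is hypothetical and simply restates that arithmetic.

```python
def nnt_vs_placebo(therapeutic_gain_pct: float) -> float:
    """NNT versus placebo = 1 / therapeutic gain (the absolute rate difference)."""
    return 100.0 / therapeutic_gain_pct

# 100 mg sumatriptan, from the pooled placebo-controlled trials
print(round(nnt_vs_placebo(29), 1))  # 2-hour headache response: NNT ~ 3.4
print(round(nnt_vs_placebo(10), 1))  # sustained pain-free response: NNT ~ 10.0
```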
This meta-analysis demonstrates that oral triptans are effective in relieving acute migraine headache, with acceptable adverse effect rates and differences in relief among the agents that are not clinically relevant. The meta-analysis also showed that only approximately 60% of patients respond to a specific triptan. In the few consistency trials, the triptans were effective in treating an average of 2 of 3 consecutive acute migraines in the same patient. Research supports nonsteroidal anti-inflammatory drugs as first-line therapy for mild to moderate migraine; triptans should be considered first-line therapy for moderate to severe migraine. We suggest that clinicians become familiar with several triptans and recognize that a given agent will not always relieve the same person’s migraine and that the failure of 1 triptan to help a patient does not predict failure with another triptan.
What is the best treatment for patients with severe gastroesophageal reflux disease (GERD)?
BACKGROUND: GERD is common in US adults, and its sequela, Barrett esophagus, is a risk factor for esophageal carcinoma. Modern medical and surgical (laparoscopic fundoplication) therapies are highly effective in controlling GERD symptoms, but there are few data regarding long-term outcomes.
POPULATION STUDIED: The patients studied were part of a Veterans Affairs cooperative study comparing medical and surgical therapy from 1986 to 1988. The enrolled subjects had complicated GERD as determined by GERD Activity Index score and presence of esophagitis, esophageal ulcer, esophageal stricture, or Barrett esophagus.
STUDY DESIGN AND VALIDITY: This study provides the longest follow-up (approximately 10 years) and the most complete comparison of outcomes of medical and surgical therapy for severe GERD. In the original trial, patients had been randomized (allocation assignment concealed) to medical treatment with ranitidine and other drugs given continuously or intermittently for symptoms, or to surgery (open Nissen fundoplication). For this follow-up report, the researchers identified the cause of death of those who had died since the original study and re-evaluated those still living. The evaluation consisted of GERD activity scores, endoscopy, 24-hour esophageal pH monitoring, and completion of the 36-item Medical Outcomes Study short form (SF-36) and a questionnaire regarding GERD treatments. Patients discontinued antireflux medications for 1 week before undergoing endoscopy and recorded their symptoms during that time. In the original study, baseline characteristics were similar between treatment groups, including the frequency of complications such as ulcers and Barrett esophagus. Almost all the patients were men, with a mean age of 67 years. This might limit generalizability, although men have a higher risk of esophageal cancer. Possible limitations included lack of blinding of the investigators to the original treatment assignment, self-reporting of symptoms using a diary (probably of similar reliability in both groups), and inability to document whether patients stopped use of all antireflux medications in the second week (antacids were allowed).
OUTCOMES MEASURED: The outcomes measured included antireflux medication usage, GERD activity index score off medication, grade of esophagitis, frequency of treatment of esophageal stricture, frequency of additional antireflux operations, SF-36 survey scores, satisfaction with antireflux therapy, survival, and incidence of esophageal adenocarcinoma.
RESULTS: In the original study, 165 patients received medical therapy, and 82 had surgery (n=247). In the follow-up study, 239 subjects were located; of these, 79 had died. Of the remaining 160 subjects, 129 participated in the follow-up study. Thus, outcomes were determined for 84% of the original participants (129 survivors and 79 deaths). More medically treated than surgically treated patients reported regular use of antireflux medications (92% vs 62%; P <.001; number needed to treat = 3). Symptom scores were also significantly lower in the surgical group during the week off medication. No significant differences were found between the groups in the endoscopic grade of esophagitis, frequency of treatment of esophageal stricture, subsequent antireflux operations, health survey scores, overall satisfaction with antireflux therapy, or the incidence of esophageal cancer. Mortality was significantly higher in the surgical group during the follow-up period (40% vs 28%; P=.047; number needed to harm = 8). The majority of these deaths were attributed to heart disease. Patients with Barrett esophagus developed esophageal cancer at an annual rate of 0.4%, while the rate was 0.07% in patients with severe GERD but without Barrett esophagus. Esophageal cancer was an uncommon cause of death.
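For readers who want to verify the quoted NNT and number needed to harm (our arithmetic, not the authors'), both follow directly from the reported event rates; the helper below is hypothetical.

```python
def number_needed(rate_a_pct: float, rate_b_pct: float) -> float:
    """NNT or NNH = 1 / absolute difference between two event rates."""
    return 100.0 / abs(rate_a_pct - rate_b_pct)

# Regular antireflux medication use: 92% (medical) vs 62% (surgical) -> NNT ~ 3
print(round(number_needed(92, 62), 1))  # 3.3
# Mortality during follow-up: 40% (surgical) vs 28% (medical) -> NNH ~ 8
print(round(number_needed(40, 28), 1))  # 8.3
```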
This study revealed no significant differences in outcomes between surgical and medical treatment for severe GERD. Surgical therapy did not eliminate the need for antisecretory medications, although there was less regular use of these medications. Surgery was associated with an unexplained increase in subsequent mortality from heart disease. Esophageal cancer incidence and mortality were rare. Satisfaction with current medical therapy is likely to be even better with the availability of potent proton pump inhibitors. Since surgical mortality from laparoscopic fundoplication exceeds the rate of esophageal cancer in patients with severe GERD, medical therapy for GERD should be the first line of treatment.