Insulin Levels, Lifestyle May Affect Colon Cancer Risk
WASHINGTON — Elevated insulin levels and lifestyle factors were significantly associated with an increased risk of colon cancer in both white and African American subjects in a population-based study, Temitope O. Keku, Ph.D., reported at the annual meeting of the American Association for Cancer Research.
The study included 231 African Americans with colon cancer and 360 African American controls, along with 297 white patients with colon cancer and 530 white controls in North Carolina.
Cancer patients of both races reported eating significantly more food, compared with controls. Whites with cancer were significantly more likely than controls to report a high-fat diet, while African Americans with cancer were significantly more likely than controls to report NSAID use, Dr. Keku and her colleagues at the University of North Carolina at Chapel Hill wrote in a poster.
Comparison of the highest and lowest quartiles for insulinlike growth factor I (IGF-I) showed an increased risk of colon cancer among subjects in the highest quartile: a threefold increase among African Americans and a 1.6-fold increase among whites.
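As a rough illustration of how such a quartile comparison is often made in a case-control analysis, the sketch below assigns subjects to biomarker quartiles and computes a crude highest-versus-lowest odds ratio. The IGF-I values are simulated and the cut points, counts, and resulting ratio are illustrative assumptions, not the study's data or the investigators' actual method.

```python
import numpy as np

# Hypothetical IGF-I values (ng/mL) for cases and controls -- illustrative only.
rng = np.random.default_rng(0)
igf_cases = rng.normal(180, 40, 300)
igf_controls = rng.normal(160, 40, 500)

# One common convention: define quartile cut points from the control distribution.
q1, q2, q3 = np.percentile(igf_controls, [25, 50, 75])

def quartile(values):
    """Assign each IGF-I value to quartile 1 (lowest) through 4 (highest)."""
    return np.digitize(values, [q1, q2, q3]) + 1

cases_q = quartile(igf_cases)
controls_q = quartile(igf_controls)

# Crude odds ratio comparing the highest quartile with the lowest quartile.
a = np.sum(cases_q == 4)      # cases in the highest quartile
b = np.sum(controls_q == 4)   # controls in the highest quartile
c = np.sum(cases_q == 1)      # cases in the lowest quartile
d = np.sum(controls_q == 1)   # controls in the lowest quartile
print("OR, highest vs. lowest quartile:", (a * d) / (b * c))
```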
Elevated insulin and C-peptide levels were positively associated with cancer in both races, as were elevated IGF-I levels, higher body mass index, and low levels of physical activity. In addition, high levels of IGF-I significantly increased the risk of colon cancer in overweight or obese subjects.
These interactions suggest that lifestyle and dietary factors may modify the link between insulin resistance and colon cancer, regardless of race.
Panitumumab Found to Slow Metastatic Colorectal Cancer
WASHINGTON — Twice-weekly doses of the investigational monoclonal antibody panitumumab reduced short-term disease progression by 46% in previously treated patients with metastatic colorectal cancer, Dr. Marc Peeters reported at the annual meeting of the American Association for Cancer Research.
After 8 weeks of twice-weekly 6-mg/kg doses of panitumumab plus the best standard of care, 49% of 231 patients were alive and had no disease progression, compared with 30% of 232 patients who received the best standard of care without panitumumab, said Dr. Peeters of Ghent (Belgium) University Hospital.
Disease progression continued to be slower in patients treated with panitumumab, compared with the standard-care group until about 20 weeks of treatment, and more panitumumab-treated patients remained alive after 32 weeks of treatment, compared with the standard therapy group.
Panitumumab, which is being developed by Amgen Inc., targets the epidermal growth factor receptor.
The randomized study included 463 patients aged 27–82 years, with a median age of 62 years. Overall, 67% of the patients had colon cancer and 33% had rectal cancer; all but one patient had undergone at least two chemotherapy regimens. Other demographic and clinical characteristics were similar between the groups.
Skin rash, the most common adverse event, was reported in about 90% of the panitumumab patients and in 9% of the standard group. Other side effects that were more common among panitumumab patients included fatigue, abdominal pain, nausea, and diarrhea. No treatment-related deaths were reported.
“This is the first randomized controlled phase III study comparing a monoclonal antibody with best supportive care in chemoresistant patients,” Dr. Peeters noted. “These data support further investigations with panitumumab, and additional studies are ongoing,” he said.
Dr. James Abbruzzese, of the University of Texas M.D. Anderson Cancer Center in Houston, was the study's discussant at the meeting. The results were unsurprising, he said. “The objective response rate was very consistent with prior phase II studies.”
Dr. Abbruzzese, who was not financially associated with the study, reviewed the data and noted that the hazard ratio for panitumumab was significantly lower during the first weeks of care, stabilized at 0.5 at about 5 weeks, and then tapered off after about 20 weeks.
Although the long-term survival rates were no longer significantly different between the two groups, “the impact of panitumumab during the early weeks of treatment was substantial, and prevented patients from progressing or dying during that period of time,” he said. Panitumumab's impact on disease progression after 20 weeks remains uncertain, and may suggest some emergent resistance over time, he noted.
Panitumumab is being incorporated into investigational front-line cancer therapy regimens, and future research will examine its long-term effects.
Utah's Influenza Hospitalization Data Show Ethnic Disparities
ATLANTA — Blacks, Asian Americans, and Hispanics were significantly more likely to be hospitalized for influenza during the 2004–2005 flu season in Utah, compared with non-Hispanic whites, Lisa Wyman reported in a poster presented at the International Conference on Emerging Infectious Diseases.
Overall, the hospitalization rate per 100,000 person-years was 22.2 cases among blacks, 22.6 cases among Asians/Pacific Islanders, and 19.0 cases among Hispanics, compared with 7.2 cases among non-Hispanic whites. Children younger than 5 years had the highest hospitalization rates of any age group, and these rates were significantly higher among minority children, compared with non-Hispanic whites.
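For context, a hospitalization rate per 100,000 person-years is the case count divided by the person-time observed, scaled to 100,000. The sketch below shows the arithmetic with invented counts; it is not based on the Utah surveillance denominators.

```python
def rate_per_100k(cases: int, person_years: float) -> float:
    """Hospitalization rate expressed per 100,000 person-years of observation."""
    return cases / person_years * 100_000

# Illustrative only: 36 cases over 500,000 person-years works out to 7.2
# per 100,000 person-years, the order of magnitude reported above.
print(rate_per_100k(36, 500_000))  # 7.2
```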
Ms. Wyman and her colleagues at the Utah Department of Health reviewed all laboratory-confirmed cases of influenza reported in Utah during the 2004–2005 season. A total of 253 hospitalizations were reported, and complete race and ethnicity data were available for 209 of those cases.
The type of influenza virus was determined for 224 hospitalized cases; 136 were associated with the influenza A virus, and 88 were associated with the influenza B virus. Hispanics and Asian/Pacific Islander Americans were significantly more likely to have the influenza B virus (46% and 69%, respectively), compared with non-Hispanic whites. Hispanics aged 25 years and older were more likely to have the influenza B virus, compared with non-Hispanic whites, with an odds ratio of 6.86.
Although the study was limited by relatively small numbers, a preliminary review of data from the 2005–2006 flu season showed similar trends with regard to ethnic disparities in hospitalization rates.
Less-Common E. coli Isolates Identified in Foodborne Cases
ATLANTA — Certain serogroups of non-O157 shiga toxin-producing Escherichia coli may be especially virulent in cases of foodborne illness, Bridget J. Anderson, Ph.D., reported at the International Conference on Emerging Infectious Diseases.
Although O157 is the most common serogroup implicated in severe illness, non-O157 serogroups “are being recognized with increasing frequency in persons with diarrheal illness,” Dr. Anderson wrote in a poster focusing on the epidemiology of non-O157 infections. An analysis of Foodborne Diseases Active Surveillance Network data from 214 cases of non-O157 shiga toxin-producing Escherichia coli (STEC) in New York, Connecticut, and Minnesota during 2000–2004 showed that the non-O157 serogroups O145, O111, and O45 may be more virulent than other non-O157 serogroups.
The federally supported Foodborne Diseases Active Surveillance Network program seeks to link foodborne illnesses to specific foods and settings in selected U.S. locations.
Dr. Anderson, of the New York State Department of Health in Albany, and her colleagues identified 27 non-O157 serogroups among the 214 cases. Of these, O111 was the most common and contributed to 37% of the cases. The O111 serogroup caused 31 cases (14%) in a single outbreak of foodborne illness. Overall, 14% of the patients were hospitalized, for a median of 3 days. Four patients developed hemolytic uremic syndrome, but no deaths were reported.
Hospitalization was more common among cases associated with O145 than among those associated with other non-O145 serogroups (36% vs. 13%). Cases of O45 also were more likely to involve hospitalization, compared with all other non-O45 cases (30% vs. 12%).
Clinical data were available for 75% of the cases. The age of the patients ranged from 1 month to 88 years, with a median age of 13 years. The most common symptoms of illness were diarrhea (98%), abdominal cramping (83%), and bloody stool (50%). Symptoms lasted for a median of 7 days, and 27% of the patients received antibiotics.
Viruses Overtake Bacteria as No. 1 U.S. Cause of Foodborne Illnesses
ATLANTA — Viruses surpassed bacteria as the pathogen group responsible for the most foodborne disease outbreaks in the United States in 2004, Rachel Yelk Woodruff reported in a poster at the International Conference on Emerging Infectious Diseases.
She and her colleagues at the Centers for Disease Control and Prevention in Atlanta reviewed data on 9,034 foodborne outbreaks collected through the Foodborne Outbreak Reporting System from 1998 to 2004.
Overall, the number of foodborne disease outbreaks remained stable during the study period, but the median number of illnesses per outbreak increased steadily, from six during 1998–2000, to seven during 2001–2003, and eight in 2004, Ms. Woodruff and her associates said.
In 2004, viral pathogens caused more outbreaks than did bacterial pathogens (249 vs. 208). In contrast, foodborne disease outbreaks caused by bacterial pathogens outnumbered those caused by viral pathogens during 1998–2003.
However, the median number of illnesses per viral outbreak decreased from 32 in 1998 to 22 in 2004, while the median number of illnesses per bacterial outbreak was fairly stable: 12 in 1998 and 11 in 2004.
Salmonella was the predominant bacterial pathogen, accounting for 9%–12% of all bacterial outbreaks. Shiga toxin-producing Escherichia coli (STEC) was the second most common, accounting for 1%–2% of all reported outbreaks. The total number of Salmonella outbreaks did not change significantly from 1998 to 2004 (125 vs. 117).
The number of STEC outbreaks decreased from 26 to 16, and the number of outbreaks of unknown origin decreased from 946 in 1998 to 801 in 2004.
Salmonella Outbreak Traced to Tomatoes in Salsa
ATLANTA — Hold the salsa—tomato-related Salmonella outbreaks are a growing problem.
A multicounty outbreak of Salmonella enteritidis in the San Francisco Bay area of California in 2005 was traced to overripe tomatoes sold by a single grower. This is the first evidence of S. enteritidis associated with tomatoes in California, Dr. Jean W. Yuan reported at the International Conference on Emerging Infectious Diseases.
The number of Salmonella isolates in California increased significantly—by nearly threefold—in 2005 compared with 2004, noted Dr. Yuan of the Centers for Disease Control and Prevention and the California Department of Health Services. “We noticed a statewide increase in S. enteritidis, and we identified two restaurant clusters with S. enteritidis in geographically distinct locations with no common food handler,” Dr. Yuan said. “We wondered if there was a common source for the statewide increase and the restaurant clusters.”
Overall, 67% of the patients had eaten fresh salsa in restaurants, compared with 23% of controls, based on a case-control study of 79 patients who developed S. enteritidis from July 1 to July 18, 2005. These patients were significantly more likely than controls to have eaten in one of two affected Mexican restaurant chains.
The outbreak was especially noteworthy because it involved phage type 30, “a rare phage type that we had not detected in California residents before,” Dr. Yuan said. “The only previous phage type 30 outbreak had been attributed to raw almonds from California.”
The investigators examined the salsa ingredients and found that tomatoes and cilantro were significantly associated with illness. No common source was identified for cilantro, but a traceback investigation of the suspected tomatoes identified a common tomato grower and packer who supplied tomatoes to the affected restaurants.
“This grower-packer sells tomatoes in cash transactions only, when they are too ripe to sell to regular customers,” Dr. Yuan said. “Many Mexican restaurants prefer the cheaper, riper tomatoes because of the high volume of tomatoes they use on a daily basis.”
“Uncooked tomatoes are an integral and nutritious part of the American diet, and the potential for future outbreaks is a concern,” Dr. Yuan added.
Genital Gram Stains Unreliable As STI Detectors
WASHINGTON — Genital Gram stains alone lack the diagnostic ability to detect Chlamydia trachomatis and Neisseria gonorrhoeae infections, based on data from 1,511 emergency department visits, reported Dr. Shanda Riley in a poster at the annual meeting of the American College of Emergency Physicians.
In 502 visits (33%), physicians used a DNA probe without a Gram stain, 68 visits (5%) included a Gram stain without a DNA probe, and 941 visits (62%) included both a Gram stain and a DNA probe to detect sexually transmitted infections.
Dr. Riley of the University of Illinois, Peoria, and her colleagues reviewed all DNA probes for C. trachomatis and N. gonorrhoeae, along with Trichomonas vaginalis wet preps and genital Gram stains, performed on patients seen in an emergency department between January 2004 and December 2004. Gram stains were considered positive if they demonstrated more than 10 white blood cells per high-power field or if clue cells, Gram-negative intracellular/extracellular diplococci, or T. vaginalis organisms were found. The sensitivity and specificity of the Gram stains were 71.1% and 41%, respectively, for N. gonorrhoeae, and 75.6% and 43%, respectively, for C. trachomatis. In addition, the average positive predictive value of the Gram stains for both organisms was 15%.
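The test characteristics above follow the standard definitions of sensitivity, specificity, and positive predictive value, with the DNA probe treated as the reference test. The sketch below shows those definitions on an invented 2x2 table whose counts were chosen only to land near the reported N. gonorrhoeae figures; it is not the study's data.

```python
def test_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, and positive predictive value from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),  # probe-positive visits flagged by the Gram stain
        "specificity": tn / (tn + fp),  # probe-negative visits cleared by the Gram stain
        "ppv": tp / (tp + fp),          # positive Gram stains confirmed by the probe
    }

# Invented counts chosen to approximate the reported N. gonorrhoeae results
# (sensitivity ~71%, specificity ~41%, positive predictive value ~15%).
print(test_metrics(tp=64, fp=360, fn=26, tn=250))
```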
Young Black Women at Risk For Aggressive Breast Tumors
WASHINGTON — Aggressive breast tumors, known as “triple negatives,” are significantly more common among black women—especially younger women—than in white women, reported Mary Jo B. Lund, Ph.D., at the annual meeting of the American Association for Cancer Research.
Tumors that test negative for three biomarkers—estrogen receptors (ER), progesterone receptors (PR), and human epidermal growth factor receptor 2 (HER2)—are not only more aggressive than other subtypes of breast cancer, but also cannot be treated effectively with tamoxifen or trastuzumab, said Dr. Lund of Emory University in Atlanta.
Dr. Lund and her colleagues evaluated the potential racial differences in the incidence of triple-negative tumors in a group of 117 black women and 362 white women aged 20–54 years. The women had been diagnosed with breast cancer between 1990 and 1992 and were enrolled in a population-based, case-control breast cancer study in the Atlanta area.
The overall incidence of triple-negative tumors was 29.5%, but the tumors were significantly more common among black women, compared with white women (47% vs. 22%).
The incidence of triple-negative tumors decreased with age among white women, but was consistent across age groups among black women. “Essentially, across all age groups, black women were twice as likely to have these triple-negative tumors,” Dr. Lund noted.
Younger black women, aged 20–34 years, appeared to be at particular risk; more than 50% of the tumors in this age group were triple negative. In addition, the percentage of triple-negative tumors increased with increasing severity among both races, but the incidence of grade 3 tumors remained higher among black women, compared with the incidence in white women (81% vs. 66%).
Crucial decisions about breast cancer treatment are based on a tumor's ER, PR, and HER2 status, said Dr. Lund. “Almost 30% of all women and 50% of black women have tumors for which there is no targeted therapy.”
Future research on triple-negative tumors should focus on the risk factors, the reasons for increased risk among black women, and the possible roles of genetics and other biomarkers, she added.
Clinical Capsules
Erythromycin Resistance in S. pyogenes
Macrolide prescriptions within 1 year of throat culture were significant predictors of erythromycin-resistant Streptococcus pyogenes in a study of 1,225 children, reported Dr. Carlo Gagliotti of the Agenzia Sanitaria Regionale Emilia-Romagna in Bologna, Italy, and his associates.
The study included children aged 0–14 years who had at least one throat swab culture that was positive for S. pyogenes during 2003 (CID 2006;42:1153–6).
Overall, the average prevalence of erythromycin resistance was 25%. Among children who were given azithromycin within 1 month, 2–3 months, or 4–12 months before the culture, the prevalence of erythromycin resistance was 67%, 44%, and 23%, respectively.
Among children who were given macrolides other than azithromycin at the same intervals, the prevalence of erythromycin resistance was 41%, 38%, and 20%, respectively. The long half-life of azithromycin may have contributed to the significant difference between azithromycin and other macrolides, the investigators noted.
By contrast, the resistance rate was only 21% among the 818 children who had not received a macrolide within 1 year of their throat swabs. Overall, for macrolide use during the 3 months prior to throat swab culture, the odds ratio for erythromycin resistance was 5.0 with azithromycin and 2.2 with other macrolides, compared with children who did not receive macrolides.
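The reported odds ratios almost certainly come from the investigators' statistical model; as a minimal sketch of what an unadjusted odds ratio looks like, the example below computes one from a hypothetical 2x2 table of exposure and resistance. The counts are not from the study.

```python
def odds_ratio(exposed_resistant: int, exposed_susceptible: int,
               unexposed_resistant: int, unexposed_susceptible: int) -> float:
    """Crude odds ratio: odds of resistance with exposure over odds without."""
    return (exposed_resistant / exposed_susceptible) / (
        unexposed_resistant / unexposed_susceptible)

# Hypothetical counts: 40 of 100 recently exposed children carry a resistant
# strain vs. 150 of 900 unexposed children.
print(odds_ratio(40, 60, 150, 750))  # ~3.3
```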
Multiple Vaccines Pose Minimal Risk
The measles, mumps, rubella, and varicella vaccine can be given concomitantly with other childhood vaccines, reported Dr. Henry Shinefield of the University of California, San Francisco, and his colleagues.
The researchers conducted an open, multicenter trial in which 1,779 healthy children aged 11–16 months were randomized into three groups. Group 1 received the measles, mumps, rubella, and varicella vaccine (MMRV), the combined Haemophilus influenzae type b conjugate-hepatitis B vaccine (HH), and the combined diphtheria-tetanus-acellular pertussis vaccine (DTaP) at the same visit. Group 2 received the MMRV at the initial visit, followed by HH and DTaP 42 days later. Group 3 received separate MMR and varicella vaccines at the initial visit, followed by HH and DTaP 42 days later.
Overall, the antibody response rates and geometric mean antibody titers to measles, mumps, rubella, and varicella were similar whether MMRV was given at the same time as the other vaccines or 42 days earlier. When MMRV was given at the same time as HH and DTaP, the antibody response rates for measles, mumps, rubella, and varicella were 97.8%, 95.4%, 98.6%, and 89.7%—higher than the previously established acceptability criteria.
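A geometric mean titer is the conventional summary for antibody data because titers are roughly log-normally distributed; the short sketch below shows the calculation on invented titer values, not the trial's measurements.

```python
import math

def geometric_mean_titer(titers: list[float]) -> float:
    """Exponential of the mean of the log titers (the geometric mean)."""
    return math.exp(sum(math.log(t) for t in titers) / len(titers))

# Invented titers for illustration; the geometric mean of 10, 40, 160, and 640 is 80.
print(geometric_mean_titer([10, 40, 160, 640]))  # 80.0
```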
Children who received all the vaccines at once were significantly more likely to report pain or tenderness at the injection site, compared with the other groups.
Dr. Shinefield has received an honorarium for preparing informational material about the MMRV vaccine ProQuad, and is a member of the Merck Advisory Committee on Varicella and ProQuad.
Molecular Diagnosis in Empyema
Molecular diagnosis improved detection of bacteria in 28% of children with pleural empyema and in 43% of those with empyema resulting from Streptococcus pneumoniae, reported Dr. Alban Le Monnier of the Assistance Publique-Hôpitaux de Paris, France, and his colleagues.
The molecular diagnostic techniques of broad-range 16S ribosomal DNA (rDNA) polymerase chain reaction (PCR) and pneumococcal antigen detection have been validated for urine and cerebrospinal fluid samples, but had not been validated for pleural fluid (CID 2006;42:1135–40).
Pleural fluid specimens were collected from 78 children aged 15 years and younger (median age, 3.9 years) with pleural empyema in a prospective 4-year study conducted from January 2001 to December 2004.
Overall, 60 of the 78 cases of empyema (77%) were microbiologically confirmed either by culture or by 16S rDNA PCR, and 40 (51%) were found to have pneumococcal origins. Conventional microbiologic culture identified pneumococcal strains in 23 of these 40 cases (58%). A total of 20 of these 23 cases also tested positive for S. pneumoniae using the 16S rDNA PCR and pneumolysin PCR techniques.
The diagnosis of S. pneumoniae was obtained by 16S rDNA PCR alone in 17 of the 40 cases (43%), all of whom had received antibiotics prior to pleural fluid aspiration.
No bacterial association with empyema could be found either by culture or PCR in 18 patients (23%), 16 of whom had received antibiotics prior to testing. Although the molecular tests are not a substitute for standard cultures, they can provide rapid results that allow clinicians to quickly adapt antibiotic therapy, they said.
Erythromycin Resistance in S. pyogenes
Macrolide prescriptions within 1 year of throat culture were significant predictors of erythromycin-resistant Streptococcus pyogenes in a study of 1,225 children, reported Dr. Carlo Gagliotti of the Agenzia Sanitaria Regionale Emilia-Romagna in Bologna, Italy, and his associates.
The study included children aged 0–14 years who had at least one throat swab culture that was positive for S. pyogenes during 2003 (CID 2006;42:1153–6).
Overall, the average prevalence of erythromycin resistance was 25%. Among children who were given azithromycin within 1 month of culture, 2–3 months of culture, and 4–12 months of culture, the prevalence of erythromycin resistance was 67%, 44%, and 23%, respectively.
Among children who were given macrolides other than azithromycin at the same intervals, the prevalence of erythromycin resistance was 41%, 38%, and 20%, respectively. The long half-life of azithromycin may have contributed to the significant difference between azithromycin and other macrolides, the investigators noted.
By contrast, the resistance rate was only 21% among the 818 children who had not received a macrolide within 1 year of their throat swabs. Overall, the odds ratios of erythromycin resistance during the 3 months prior to throat swab cultures were 5.0 for children who were given azithromycin and 2.2 for children who were given other macrolides, compared with children who did not receive macrolides.
Multiple Vaccines Pose Minimal Risk
The measles, mumps, rubella, and varicella vaccine can be given concomitantly with other childhood vaccines, reported Dr. Henry Shinefield of the University of California, San Francisco, and his colleagues.
The researchers conducted an open, multicenter trial in which 1,779 healthy children aged 11–16 months were randomized into three groups. Group 1 received the measles, mumps, rubella, and varicella vaccine (MMRV), the combined Haemophilus influenzae type b conjugate-hepatitis B vaccine (HH), and the combined diphtheria-tetanus-acellular pertussis vaccine (DTaP) at the same visit. Group 2 received the MMRV at the initial visit, followed by HH and DTaP 42 days later. Group 3 received separate MMR and varicella vaccines at the initial visit, followed by HH and DTaP 42 days later.
Overall, the antibody response rates and geometric mean antibody titers to measles, mumps, rubella, and varicella were similar whether MMRV was given at the same time as the other vaccines or 42 days earlier. When MMRV was given at the same time as HH and DTaP, the antibody response rates for measles, mumps, rubella, and varicella were 97.8%, 95.4%, 98.6%, and 89.7%—higher than the previously established acceptability criteria.
Children who received all the vaccines at once were significantly more likely to report pain or tenderness at the injection site, compared with the other groups.
Dr. Shinefield has received an honorarium for preparing informational material about the MMRV vaccine ProQuad, and is a member of the Merck Advisory Committee on Varicella and ProQuad.
Molecular Diagnosis in Empyema
Molecular diagnosis improved detection of bacteria in 28% of children with pleural empyema and in 43% of those with empyema resulting from Streptococcus pneumoniae, reported Dr. Alban Le Monnier of the Assistance Publique-Hôpitaux de Paris, France, and his colleagues.
The molecular diagnostic techniques of broad-range 16S ribosomal DNA (rDNA) polymerase chain reaction (PCR) and pneumococcal antigen detection have been validated for urine and cerebrospinal fluid samples, but had not been validated for pleural fluid (CID 2006;42:1135–40).
Pleural fluid specimens were collected from 78 children with pleural empyema aged 15 years and younger (median age 3.9 years) in a prospective 4-year study from January 2001 to December 2004.
Overall, 60 of the 78 cases of empyema (77%) were microbiologically confirmed either by culture or by 16S rDNA PCR, and 40 (51%) were found to have pneumococcal origins. Conventional microbiologic culture identified pneumococcal strains in 23 of these 40 cases (58%). A total of 20 of these 23 cases also tested positive for S. pneumoniae using the 16S rDNA PCR and pneumolysin PCR techniques.
The diagnosis of S. pneumoniae was obtained by 16S rDNA PCR alone in 17 of the 40 cases (43%), all of whom had received antibiotics prior to pleural fluid aspiration.
No bacterial association with empyema could be found either by culture or PCR in 18 patients (23%), 16 of whom had received antibiotics prior to testing. Although the molecular tests are not a substitute for standard cultures, they can provide rapid results that allow clinicians to quickly adapt antibiotic therapy, they said.
Escitalopram Eases Depression in Adolescents
Escitalopram failed to significantly improve the symptoms of depression in children aged 6–11 years, but it did appear to improve symptoms in children aged 12–17 years, wrote Dr. Karen Dineen Wagner of the University of Texas, Galveston, and her colleagues.
The study included 264 children and adolescents aged 6–17 years who had been diagnosed with major depressive disorder. The Children's Depression Rating Scale-Revised (CDRS-R) served as the primary outcome measure (J. Am. Acad. Child Adolesc. Psychiatry 2006;45:280–8).
The patients received either placebo or 10 mg/day of escitalopram (Lexapro) for the first 4 weeks, with the option to increase the dosage to 20 mg/day for the next 4 weeks, depending on each patient's clinical response and tolerability.
Overall, average changes in CDRS-R scores from baseline were not significantly different between the 102 escitalopram patients and the 115 placebo patients who completed the study (−21.9 vs. −20.2). However, a subsequent analysis that adjusted for age group revealed significant improvement in CDRS-R scores from baseline among the 77 patients aged 12–17 years who took escitalopram, compared with the 80 patients of the same age who took placebo, based on observed cases (−22.3 vs. −17.8).
In addition, adolescents in the escitalopram group showed significant improvements in symptoms based on several secondary outcome measures, including the Clinical Global Impressions-Severity scale.
Headaches and abdominal pain were the only reported adverse events that occurred in more than 10% of patients in either group, and the discontinuation rate in both groups was 1.5%. The study was supported by Forest Laboratories, one of many companies from which Dr. Wagner has received research support.