Shorter Length of Stay May Be Better for Some PE Patients
Discharging patients with low-risk pulmonary embolism (PE) sooner not only saves money, but it could be saving their lives, according to a study of 6,746 VHA patients with PE.
Of those patients, 1,918 were low-risk, and of those, 688 had a short length of stay (LOS) of 2 days or less. While rates of adverse events associated with PE (recurrent venous thromboembolism, major bleeding, and death) were similar, patients with a short LOS had fewer hospital-acquired complications (1.5% vs 13.3%) and bacterial pneumonias (5.9% vs 11.7%). Patients in the long-LOS cohort had more pharmacy visits per patient (12.2 vs 9.4) and more surgeries for placement of inferior vena cava filters.
The researchers note that PE is associated with a “substantial burden” of health care utilization and associated costs. The annual cost per patient for an initial episode of PE ranges from $13,000 to $31,000; with recurrent episodes, the cost can be $11,014 to $14,722 per year. In this study, inpatient costs for short LOS were about half those for long LOS ($2,164 vs $5,100). Total costs were $9,056 for short LOS vs $12,544 for long LOS.
But they also note that since patients with low-risk PE can be identified using validated risk stratification tools, an opportunity exists to select patients who can be safely treated without a traditional hospital admission. The researchers cite estimates that up to 50% of patients with PE can be treated safely as outpatients. Although this is common practice in Europe, U.S. physicians have been less willing to adopt the strategy, they add.
Risk stratification, the researchers conclude, is “of utmost importance”: Reducing the LOS among low-risk PE patients may substantially reduce the disease’s clinical and economic burden.
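As a quick sanity check, the cost differences quoted above can be recomputed in a few lines. This is an illustrative sketch using only the numbers reported in this article, not data from the study itself:

```python
# Per-patient costs reported in the article (short vs long length of stay).
short_inpatient, long_inpatient = 2_164, 5_100
short_total, long_total = 9_056, 12_544

# Inpatient costs for a short stay were roughly half those for a long stay.
inpatient_ratio = short_inpatient / long_inpatient
print(f"inpatient cost ratio: {inpatient_ratio:.2f}")  # ~0.42

# Implied per-patient savings from a short stay.
print(f"inpatient savings: ${long_inpatient - short_inpatient:,}")  # $2,936
print(f"total savings:     ${long_total - short_total:,}")          # $3,488
```

At roughly $3,500 saved per patient in total costs, the economic case tracks the clinical one described above.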
‘Untangling’ DNA Damage
“Imagine your DNA is a giant ball of yarn,” says Matthew Schellenberg, PhD. That is the metaphor he uses to help describe the findings of a study he conducted with other researchers from the NIH. They discovered how 2 proteins work together to “untangle” DNA damage known as a DNA-protein crosslink (DPC).
When DNA becomes tangled inside cells, organisms use a protein called topoisomerase 2 (TOP2) to straighten things out by cutting and “retying” individual threads. To do that, TOP2 first conceals the cut DNA ends within its core, which allows it to then retie, or rejoin, them. However, cancer drugs or environmental chemicals sometimes can block this retying ability, so TOP2 remains stuck, forming a stable TOP2-DPC and leading to an accumulation of severed DNA that kills cells.
Scott Williams, PhD, deputy chief of the Genome Integrity and Structural Biology Laboratory at the National Institute of Environmental Health Sciences, headed the team that identified ZATT as a new contributor to the process of removing DPCs. He uses another metaphor, likening the TOP2-DPCs to “ticking time bombs for cells.” The molecular charges are armed, Williams says, by TOP2’s interaction with environmental toxicants, chemical metabolites, tobacco exposures, or DNA damage caused by ultraviolet light.
While cancer drugs induce formation of TOP2-DPCs to treat cancer, TOP2-DPC lesions also can cause rearrangement of an organism’s genome that leads to cancer. If they are not removed, they trigger cell death. That led Williams and the research team to find out how DPCs are located and broken down. In his metaphor, the protein ZATT “is like a bomb-sniffing dog.” When it locates the target, it sounds an alarm to mobilize the recruitment of TOP2, which “cuts the red wire to disarm these threats.”
Schellenberg says, “We’ve discovered how we defend against this potent means of killing.” The knowledge may help researchers make drugs that kill cancer cells more effective.
The ‘Virtual Radiology Resident’—Coming to a Computer Near You
Researchers around the world may be able to teach computers how to better detect and diagnose disease, thanks to > 100,000 chest x-ray images and corresponding data recently released by the NIH Clinical Center.
Reading and diagnosing chest x-rays requires careful observation as well as knowledge of anatomy, physiology, and pathology. Combined with the need to consider all common thoracic diseases, that makes it hard to automate a consistent technique for reading images, the NIH says. With the free dataset, the hope is that academic and research institutions will be able to teach their computers to read and process enormous numbers of scans, confirm radiologists’ results, and potentially identify anything that may have been overlooked.
The NIH says in addition to being a “virtual radiology resident,” advanced computer technology has other potential benefits: For instance, it could identify slow changes occurring over the course of multiple chest x-rays that might otherwise be overlooked. The technology also would be useful in poor countries that lack radiologists. And in the future, the “resident” might be taught to read more complex images, such as CT and MRI.
The dataset, compiled from scans from > 30,000 patients, including many with advanced lung disease, was scrubbed of private information before release. The images are available via Box at https://nihcc.app.box.com/v/ChestXray-NIHCC.
Monitoring HIV and Kidney Disease in Aging Asians
More people who are HIV-1 positive are living longer and better managing comorbidities—especially those associated with kidney disease. This is critical because the medications patients with HIV must take can be nephrotoxic with long-term use. Researchers from the AIDS Clinical Center, National Center for Global Health and Medicine, in Tokyo, Japan, note that Asian patients may be at higher risk because of their generally lower body weight and differences in metabolism compared with whites and blacks.
Few studies in Asia had assessed the prevalence of, and factors associated with, chronic kidney disease (CKD) and end-stage renal disease (ESRD) in patients with HIV-1, the researchers say, so they conducted what they believe is the first: a cross-sectional study of 1,990 patients.
One-third of the patients were aged ≥ 50 years. Nearly all (94%) had an HIV load of < 50 copies/mL. The median time from diagnosis to study enrollment was 9.1 years; the median duration of antiretroviral therapy (ART) was 7.35 years. Of the study patients, 256 (13%) had CKD and 9 (0.5%) had ESRD. It is noteworthy, the researchers say, that 5 of those 9 developed ESRD long after the diagnosis of HIV infection and that the ages of the ESRD patients ranged from the 30s to the 60s.
The prevalence of CKD rose from 18.6% among those aged 50 to 59 years to 47% among those aged > 70 years. Heavier body weight, diabetes, hypertension, and longer duration of ART also were associated with CKD. Duration of exposure to tenofovir disoproxil fumarate (TDF), however, was not associated with CKD. Of all the patients, 61% had a history of TDF use. At the time of the study, 774 patients were taking TDF: 69 in the group with CKD and 705 in the group without CKD.
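The patient counts above are internally consistent, which a few lines of arithmetic can confirm. This sketch uses only the numbers reported here:

```python
# Counts reported for the cross-sectional cohort (n = 1,990).
n_total = 1_990
n_ckd, n_esrd = 256, 9
tdf_with_ckd, tdf_without_ckd = 69, 705

# Reported prevalences round to the percentages quoted in the text.
print(round(100 * n_ckd / n_total))      # 13  (% with CKD)
print(round(100 * n_esrd / n_total, 1))  # 0.5 (% with ESRD)

# The two TDF subgroups sum to the 774 patients on TDF at study time.
print(tdf_with_ckd + tdf_without_ckd)    # 774
```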
Tenofovir disoproxil fumarate nephrotoxicity had been well publicized by 2016, when the data were collected for this study, the researchers note, which is why they used “duration of TDF exposure” in their logistic regression model. Tenofovir disoproxil fumarate may have been discontinued early on for the patients at risk for CKD in their study cohort. However, they note that while TDF will be replaced with its prodrug tenofovir alafenamide, other antiretroviral drugs that inhibit excretion of creatinine in the renal proximal tubules and increase serum creatinine value, such as dolutegravir, cobicistat, rilpivirine, raltegravir, and ritonavir, will still be widely used.
Source:
Nishijima T, Kawasaki Y, Mutoh Y, et al. Sci Rep. 2017;7:14565. doi:10.1038/s41598-017-15214-x
Study Will Compare Mammography Screening Methods
Which method is better for breast cancer screening: 3-D mammography or 2-D mammography? Researchers from the ECOG-ACRIN Cancer Research Group and the National Cancer Institute are hoping to find out, with the Tomosynthesis Mammographic Imaging Screening Trial (TMIST).
It has been decades since the last large-scale randomized trial of mammography, points out Worta McCaskill-Stevens, MD, director of the NCI Community Oncology Research Program. In the meantime, mammography technology has evolved, from “conventional” 2-D mammography to tomosynthesis, also known as 3-D mammography.
However, although 3-D mammography is more likely to detect more findings that require follow-up, it is also likely to lead to more procedures and treatments. “If a newer screening technology does not reduce the numbers of advanced, life-threatening cancers, then are we really improving screening for breast cancer?” said Etta Pisano, MD, ECOG-ACRIN study chair.
Researchers plan to enroll 165,000 participants aged 45 to 74 years who already are scheduled for routine mammograms. They will follow all participants for breast cancer status, treatment, and outcomes until at least 2025. About 100 mammography clinics are expected to take part.
The Return of Scarlet Fever?
Is scarlet fever—a disease long in decline—reemerging as a health threat? China, the United Kingdom, and Hong Kong have seen upsurges in scarlet fever cases in the past few years.
Hong Kong has seen a more than 10-fold increase over the previous incidence rate. In a study of 7,266 patients aged ≤ 14 years (3,304 with laboratory-confirmed diagnoses), researchers from the University of Hong Kong found a “sharp peak” in 2011: 1,438 cases were reported, exceeding the total of 1,117 cases in the previous 6 years. Since then, the annual number of reported cases has remained at a “relatively high level,” the researchers say, with an average of 14.5 cases per 10,000 children during 2012-2015.
The elevated pattern was more apparent in children aged ≤ 5 years. In that age group, annual incidence averaged 3.3 per 10,000 during 2005-2010, then jumped dramatically to 23.9 per 10,000 in 2011. It dropped slightly to 18.1 per 10,000 in 2012-2015.
The cause is unclear, the researchers say. They cite 1 report that suggests toxin acquisition and multidrug resistance may have contributed. School is probably a major transmission site. Incidence was higher among younger children entering school and during school days. The researchers say boys were more at risk than girls, possibly because they have more physical interactions or poorer personal hygiene. Thus, school-based control measures—especially for boys aged 3 to 5 years—could be “particularly important.”
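The under-5 rates quoted above imply a roughly 7-fold jump at the 2011 peak; a short, purely illustrative calculation makes the scale explicit:

```python
# Annual incidence per 10,000 children aged <= 5 years, as reported.
incidence = {"2005-2010": 3.3, "2011": 23.9, "2012-2015": 18.1}

# Fold-change of the 2011 peak over the 2005-2010 baseline.
peak_fold = incidence["2011"] / incidence["2005-2010"]
print(f"2011 peak was {peak_fold:.1f}x the 2005-2010 baseline")  # ~7.2x

# Even the later plateau stayed well above baseline.
plateau_fold = incidence["2012-2015"] / incidence["2005-2010"]
print(f"2012-2015 average was {plateau_fold:.1f}x baseline")     # ~5.5x
```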
NIH Researchers Find Lymph Drainage in Brain
In 1816, an Italian anatomist reported finding lymphatic vessels on the surface of the brain—but the information went nowhere. However, 200 years later, NIH researchers believe they have confirmed that report with evidence that the human brain may drain some waste out through the body’s lymphatic “sewer system.”
Two animal studies in 2015 had shown evidence of a lymphatic system in the brain. Building on that, the researchers used magnetic resonance imaging to scan the brains of 5 healthy volunteers who had been injected with gadobutrol, a contrast dye typically used to visualize brain blood vessels. The dye molecules are small enough to leak out of blood vessels in the dura but too big to pass through the blood-brain barrier.
At first, the researchers say, the dura lit up brightly, but no lymphatic vessels were visible. When they tuned the scanner differently, the blood vessels “disappeared,” and they saw that the dura also had smaller but almost equally bright spots and lines—possibly lymph vessels. The researchers’ results suggested that the dye leaked from the blood vessels and flowed through the dura into neighboring lymphatic vessels. “We literally watched people’s brains drain fluid into these vessels,” said Daniel Reich, MD, PhD, senior author of the study.
The researchers tested the findings by doing another round of scans, using a dye made of larger molecules. This time they saw blood vessels but no lymph vessels, no matter how the scanner was tuned.
The researchers also found evidence for blood and lymph vessels in autopsied human brain tissue. “For years, we knew how fluid entered the brain. Now we may finally see that, like other organs in the body, brain fluid can drain out through the lymphatic system,” said Dr. Reich.
“These results could fundamentally change the way we think about how the brain and immune system interrelate,” said Walter Koroshetz, MD, the director of the National Institute of Neurological Disorders and Stroke.
Online Help for People With Alcohol Use Disorder
In any given year, < 10% of people diagnosed with alcohol use disorder receive treatment, and many do not receive the type of care that best fits their needs. Two possible reasons: they may not know where to turn for help, or they may be unaware that treatment options exist beyond a mutual help group or a long-term residential rehabilitation facility.
The National Institute on Alcohol Abuse and Alcoholism (NIAAA) has developed a new online tool that may help. The “comprehensive, yet easy-to-use” Alcohol Treatment Navigator was developed “to help address the alcohol ‘treatment gap,’” said NIAAA Director George Koob, PhD. It’s based on decades of scientific research into clinical interventions and health services with input from patients, providers, and researchers.
The Navigator includes an overview of alcohol use disorder and a description of professionally led treatment options. It also gives step-by-step instructions for searching online directories of treatment providers, 10 questions to ask a provider, and signs of quality to listen for. A downloadable tool kit helps organize and simplify the search process.
Does Anyone Really Understand Nutrition Labels?
In 1990, nutrition labeling—that handy chart that gives us the information we need to make healthy choices—was added to nearly all packaged foods. But according to researchers from the FDA, Tufts University, and the National Cancer Institute, many people lack the health literacy to understand the information and use it as intended.
The researchers analyzed data on 3,185 U.S. adults from the Health Information National Trends Survey, conducted in 2013. Participants were asked to view an ice-cream nutrition label and answer 4 questions that tested their ability to apply basic arithmetic and understanding of percentages to interpret the label. They also reported their intake of sugar-sweetened soft drinks, fruits, and vegetables.
About one-quarter of the participants could not determine the calorie content of the full ice-cream container; 42% could not estimate the effect on daily calorie intake of forgoing 1 serving; 41% could not calculate the percentage daily value of calories in a single serving; and 21% could not estimate the number of servings equal to 60 g of carbohydrates.
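The 4 survey questions all reduce to simple label arithmetic. The sketch below works through each one using a hypothetical ice-cream label (the values are illustrative; the survey’s actual label values were not reported here):

```python
# Hypothetical label values (not the survey's actual label)
servings_per_container = 4
calories_per_serving = 250
carbs_per_serving_g = 30
reference_daily_calories = 2000  # standard reference intake on U.S. labels

# Q1: calories in the full container
total_calories = calories_per_serving * servings_per_container

# Q2: daily calories avoided by forgoing 1 serving
calories_saved = calories_per_serving

# Q3: percentage daily value of calories in a single serving
pct_daily_value = calories_per_serving / reference_daily_calories * 100

# Q4: number of servings that add up to 60 g of carbohydrates
servings_for_60g_carbs = 60 / carbs_per_serving_g

print(total_calories, calories_saved, pct_daily_value, servings_for_60g_carbs)
```

With these illustrative numbers, the answers are 1,000 calories per container, 250 calories saved, 12.5% of the daily value, and 2 servings for 60 g of carbohydrates.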
Higher scores of label understanding were associated with consuming more vegetables and fewer sugar-sweetened drinks. After adjusting for demographic factors, only the link with soft drinks remained significant.
Across all educational levels, people had the most trouble with the questions about health recommendations and daily value. As in other studies, low educational attainment was associated with poor understanding of nutrition labels. More than one-third of participants with less than a high school diploma could not correctly answer any of the questions. Less than 9% could answer all 4 correctly. However, only 54% of participants with a 4-year college degree could answer all the questions correctly.
One obvious way to improve things, the researchers suggest, is to make the nutrition label easier to use. They note that the FDA tried to do this in 2016, in addition to reflecting current nutrition science and public health research. For instance, certain label elements, like calories and serving size, are now larger and in a bold font. Serving sizes have been updated to more accurately reflect the amount of food and drink people usually consume. To help consumers better understand serving size, 2 columns are used for foods that can be eaten in 1 or multiple sittings, such as a bag of potato chips, so people will better grasp how many calories they consume in 1 sitting.
Still, understanding nutrition labels is not the same as using the nutrition information for selecting food, the researchers point out. Participants who answered all 4 questions correctly might not necessarily use the labels when buying food.
Rural Communities Have High Rates of Suicide
More than half a million people died by suicide between 2001 and 2015, according to the CDC. Rural counties consistently had higher rates than metropolitan areas. “While we’ve seen many causes of death come down in recent years, suicide rates have increased more than 20% from 2001 to 2015,” said Brenda Fitzgerald, MD, CDC director. “And this is especially concerning in rural areas.”
Related: Suicide Federal Health Data Trends
Suicide rates in rural counties were 17.32 per 100,000 people, compared with 14.86 in small-to-medium metropolitan counties and 11.92 in large metropolitan counties. Rates were highest among non-Hispanic American Indians/Alaska Natives.
The researchers note that, at some points, different negative factors had more impact. For instance, rural communities were harder hit by housing foreclosures, poverty, and unemployment due to the recession. However, the researchers also point out that suicide rates were on the rise before the recession began.
“The trends in suicide rates…are magnified in rural areas,” said James Mercy, PhD, director of CDC’s Division of Violence Prevention. “This report underscores the need for suicide prevention strategies that are specifically tailored for these communities.” To that end, the CDC recently released a compilation of evidence-based strategies that have the greatest prevention potential. The set includes examples of programs that can be customized to fit the cultural needs of different groups. In North Dakota, for instance, the program Sources of Strength was developed for tribal communities to promote connectedness between youth and adults.
Related: Improving Veteran Engagement With Mental Health Care
The Health Resources and Services Administration (HRSA) also has developed activities to address suicide in rural areas, including epidemiologic studies, research, telemedicine, and programs addressing primary health care providers.
https://www.cdc.gov/violenceprevention/pub/technical-packages.html