Mild cognitive impairment risk slashed by 19% in SPRINT MIND

Article Type
Changed
Fri, 01/18/2019 - 17:50

– Lowering systolic blood pressure to a target of 120 mm Hg or lower in people with cardiovascular risk factors reduced the risk of mild cognitive impairment by 19% and probable all-cause dementia by 17% relative to those who achieved a less intensive target of less than 140 mm Hg

Dr. Jeff D. Williamson

Drug class didn’t matter. Cheap generics were just as effective as expensive name brands. It equally benefited men and women, whites, blacks, and Hispanics. And keeping systolic blood pressure at 120 mm Hg or lower prevented MCI just as well in those older than 75 as it did for younger subjects.

The stunning announcement came during a press briefing at the Alzheimer’s Association International Conference, as Jeff D. Williamson, MD, unveiled the results of the 4-year SPRINT MIND study. Strict blood pressure control for 3.2 years, with a systolic target of 120 mm Hg or lower, reduced the incidence of mild cognitive impairment with a magnitude of benefit that no amyloid-targeting investigational drug has ever approached.

“I think we can say this is the first disease-modifying strategy to reduce the risk of MCI,” Dr. Williamson said at the briefing. And although the primary endpoint – the 17% relative risk reduction for probable all-cause dementia – didn’t meet statistical significance, “It’s comforting to see that the benefit went in the same direction and was of the same magnitude. Three years of treatment and 3.2 years of follow-up absolutely reduced the risk.”

Brain imaging underscored the clinical importance of this finding and showed its physiologic pathway. People in the strict BP arm had 18% fewer white matter hyperintensities after 4 years of follow-up.

The news is an incredible step forward for a field that has stumbled repeatedly, clinicians agreed. Generic antihypertensives can be very inexpensive, are available almost globally, and confer a host of other benefits, not only for cardiovascular health but for kidney health as well, said Dr. Williamson, chief of geriatric medicine at Wake Forest University, Winston-Salem, N.C.

“Hypertension is a highly prevalent condition, with 60%-70% having it. The 19% overall risk reduction for MCI will have a huge impact,” he said.

Maria Carrillo, PhD, chief scientific officer of the Alzheimer’s Association, was somewhat more guarded, but still very enthusiastic.

“I think the most we can say right now is we are able to reduce risk,” she said in an interview. “But the reality is that reducing the risk of MCI by 19% will have a huge impact on dementia overall. And slowing down the disease progress is a disease modification, versus developing symptoms. So, if that is the definition we are using, then I would say yes, it is disease modifying,” for dementias arising from cerebrovascular pathology.

SPRINT MIND was a substudy of the Systolic Blood Pressure Intervention Trial (SPRINT), which compared two strategies for managing hypertension in older adults: an intensive strategy with a target of less than 120 mm Hg, and standard care, with a target of less than 140 mm Hg. SPRINT showed that more intensive blood pressure control produced a 30% reduction in the primary composite endpoint of cardiovascular events, stroke, and cardiovascular death. The intensive arm was so successful that SPRINT helped inform the 2017 American Heart Association and American College of Cardiology high blood pressure clinical guidelines.

The SPRINT MIND substudy looked at whether intensive management had any effect on probable all-cause dementia or MCI, as well as imaging evidence of changes in white matter lesions and brain volume.

It comprised 9,361 SPRINT subjects who were 50 years or older (mean 68; 28% at least 75) and had at least one cardiovascular risk factor. Nearly a third (30%) were black, and 10% Hispanic. The primary outcome was incident probable dementia. Secondary outcomes were MCI and a composite of MCI and/or probable dementia.

In SPRINT, physicians could choose any appropriate antihypertensive regimen, but they were encouraged to use drugs with the strongest evidence of cardiovascular benefit: thiazide-type diuretics as first line, followed by loop diuretics and beta-adrenergic blockers. About 90% of the drugs used during the study were generics.

Subjects were seen monthly for the first 3 months, during which medications were adjusted to achieve the target, and then every 3 months after that. Medications could be adjusted monthly to keep on target.

At 1 year, the mean systolic blood pressure was 121.4 mm Hg in the intensive-treatment group and 136.2 mm Hg in the standard treatment group. Treatment was stopped in August 2015 due to the observed cardiovascular disease benefit, after a median follow-up of 3.26 years, but cognitive assessment continued until the end of June (N Engl J Med. 2015 Nov 26;373:2103-16).

The SPRINT MIND study did not meet its primary endpoint. Adjudicated cases of probable all-cause dementia developed in 175 subjects in the standard care group and 147 in the intensive treatment group; the 17% risk reduction was not statistically significant (P = .10).

However, it did hit both secondary endpoints. Adjudicated cases of MCI developed in 348 subjects in the standard treatment group and 285 in the intensive treatment group: a statistically significant 19% risk reduction (P = .01). The combined secondary endpoint of MCI and/or probable dementia showed a significant 15% risk reduction (P = .02), with 463 cases in the standard care group and 398 in the intensive treatment group.
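For readers who want to sanity-check the reported effect sizes, the raw case counts roughly reproduce the risk reductions. This sketch assumes the two arms were of approximately equal size (individual arm sizes aren't given in the article), and the published figures are adjudicated hazard ratios, so small differences from these crude numbers are expected:

```python
# Crude check of the reported relative risk reductions from raw case counts.
# Assumption: the two arms each held about half of the 9,361 subjects.
# Published figures are hazard ratios, so exact agreement is not expected.

n_per_arm = 9361 / 2  # approximate; actual arm sizes not stated in the article

def rel_risk_reduction(cases_standard, cases_intensive):
    """1 - (risk in intensive arm / risk in standard arm)."""
    risk_std = cases_standard / n_per_arm
    risk_int = cases_intensive / n_per_arm
    return 1 - risk_int / risk_std

print(f"MCI:      {rel_risk_reduction(348, 285):.1%}")  # article reports 19%
print(f"Dementia: {rel_risk_reduction(175, 147):.1%}")  # article reports 17%
print(f"Combined: {rel_risk_reduction(463, 398):.1%}")  # article reports 15%
```

Note that with equal arm sizes the denominators cancel, so the crude reduction is just 1 minus the ratio of case counts.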

The imaging study comprised 454 subjects who had brain MRI at baseline and 4 years after randomization. There was no change in total brain volume, said Ilya Nasrallah, MD, of the University of Pennsylvania. But those in the intensively managed group had 18% lower white matter lesion load than those in the standard care group (P = .004).

White matter lesions often point to small vessel disease, which is conclusively linked to vascular dementia and may also be linked to Alzheimer’s disease. Most AD patients, in fact, have a mixed dementia that often includes a vascular component, Dr. Carrillo said.

SPRINT MIND didn’t follow subjects past 4 years, and didn’t include any follow-up for amyloid or Alzheimer’s diagnosis. But preventing MCI is no trivial thing, according to David Knopman, MD, who moderated the session.

“There’s nothing that is benign about MCI,” said Dr. Knopman of the Mayo Clinic, Rochester, Minn. “It’s the first sign of overt cognitive dysfunction, and although the rate at which MCI progresses to dementia is slow, the appearance of it is just as important as the appearance of more severe dementia. To be able to see an effect in 3.2 years is quite remarkable. I think it is going to change clinical practice for people in primary care, and the benefits at the population level are going to be substantial.”

Dr. Williamson drove this point home in a later interview, suggesting that physicians may want to think about how the SPRINT MIND results might apply to younger patients with hypertension, even those without other cardiovascular risk factors.

“I can’t say as a scientist that we have evidence to do that, yet. But as a physician, and for my own self and my own patients, I will adhere to the guidelines we have and keep blood pressure at less than 130 mm Hg, and certainly start treating people in their 50s, and probably in their 40s.”

This article was updated 7/31/18.

[email protected]

SOURCE: Williamson et al. AAIC 2018, DT-0202; Nasrallah et al. AAIC 2018, DT-03-03.


Article Source

AT AAIC 2018

Vitals


Key clinical point: Keeping systolic blood pressure at less than 120 mm Hg reduced the risk of MCI and all-cause dementia more effectively than keeping it less than 140 mm Hg.

Major finding: After 3.2 years of treatment, there was a 19% lower risk of MCI in the intensively managed group relative to the standard of care group.

Study details: SPRINT MIND comprised more than 9,000 subjects treated for 3.2 years.

Disclosures: The study was funded by the National Institutes of Health. Neither presenter had any relevant financial disclosures.

Source: Williamson et al. AAIC 2018 DT-0202


Early-onset atopic dermatitis linked to elevated risk for seasonal allergies and asthma


Progression through the “atopic march” varies by age of atopic dermatitis (AD) onset and is more pronounced among patients aged 2 years and younger, results from a large, retrospective cohort study demonstrated.

Dr. Joy Wan

“The atopic march is characterized by a progression from atopic dermatitis, usually early in childhood, to subsequent development of allergic rhinitis and asthma,” lead study author Joy Wan, MD, said at the annual meeting of the Society for Pediatric Dermatology. “It is thought that the skin acts as the site of primary sensitization through a defective epithelial barrier, which then allows for allergic sensitization to occur in the airways. It is estimated that 30%-60% of AD patients go on to develop asthma and/or allergic rhinitis. However, not all patients complete the so-called atopic march, and this variation in the risk of asthma and allergic rhinitis among AD patients is not very well understood. Better ways to risk stratify these patients are needed.”

One possible explanation for this variation in the risk of atopy in AD patients could be the timing of their dermatitis onset. “We know that atopic dermatitis begins in infancy, but it can start at any age,” said Dr. Wan, who is a fellow in the section of pediatric dermatology at the Children’s Hospital of Philadelphia. “There has been a distinction between early-onset versus late-onset AD. Some past studies have also suggested that there is an increased risk of asthma and allergic rhinitis in children who have early-onset AD before the age of 1 or 2. This suggests that perhaps the model of the atopic march varies between early- and late-onset AD. However, past studies have had several limitations. They’ve often had short durations of follow-up, they’ve only examined narrow ranges of age of onset for AD, and most of them have been designed to primarily evaluate other exposures and outcomes, rather than looking at the timing of AD onset itself.”

For the current study, Dr. Wan and her associates set out to examine the risk of seasonal allergies and asthma among children with AD with respect to the age of AD onset. They used data from the Pediatric Eczema Elective Registry (PEER), an ongoing, prospective U.S. cohort of more than 7,700 children with physician-confirmed AD (JAMA Dermatol. 2014 Jun;150:593-600). All registry participants had used pimecrolimus cream in the past, but children with lymphoproliferative disease were excluded from the registry, as were those with malignancy or those who required the use of systemic immunosuppression.

The researchers evaluated 3,966 subjects in PEER with at least 3 years of follow-up. The exposure of interest was age of AD onset, and they divided patients into three broad age categories: early onset (age 2 years or younger), mid onset (3-7 years), and late onset (8-17 years). Primary outcomes were prevalent seasonal allergies and asthma at the time of registry enrollment, and incident seasonal allergies and asthma during follow-up, assessed via patient surveys every 3 years.


The study population included high proportions of white and black children, and there was a slight predominance of females. The median age at PEER enrollment increased with advancing age of AD onset (5.2 years in the early-onset group vs. 8.2 years in the mid-onset group and 13.1 years in the late-onset group), while the duration of follow-up was fairly similar across the three groups (a median of about 8.3 months). Family history of AD was common across all three groups, while patients in the late-onset group tended to have better control of their AD, compared with their younger counterparts.

At baseline, the prevalence of seasonal allergies was highest among the early-onset group at 74.6%, compared with 69.9% among the mid-onset group and 70.1% among the late-onset group. After adjusting for sex, race, and age at registry enrollment, the relative risk for prevalent seasonal allergies was 9% lower in the mid-onset group (0.91) and 18% lower in the late-onset group (0.82), compared with those in the early-onset group. Next, Dr. Wan and her associates calculated the incidence of seasonal allergies among 1,054 patients who did not have allergies at baseline. The cumulative incidence was highest among the early-onset group (56.1%), followed by the mid-onset group (46.8%), and the late-onset group (30.6%). On adjusted analysis, the relative risk for seasonal allergies among patients who had no allergies at baseline was 18% lower in the mid-onset group (0.82) and 36% lower in the late-onset group (0.64), compared with those in the early-onset group.

In the analysis of asthma risk by age of AD onset, prevalence was highest among patients in the early-onset group at 51.5%, compared with 44.7% among the mid-onset age group and 43% among the late-onset age group. On adjusted analysis, the relative risk for asthma was 15% lower in the mid-onset group (0.85) and 29% lower in the late-onset group (0.71), compared with those in the early-onset group. Meanwhile, the cumulative incidence of asthma among patients without asthma at baseline was also highest in the early-onset group (39.2%), compared with 31.9% in the mid-onset group and 29.9% in the late-onset group.


On adjusted analysis, the relative risk for asthma among this subset of patients was 4% lower in the mid-onset group (0.96) and 8% lower in the late-onset group (0.92), compared with those in the early-onset group, a difference that was not statistically significant. “One possible explanation for this is that asthma tends to develop soon after AD does, and the rates of developing asthma later on, as detected by our study, are nondifferential,” Dr. Wan said. “Another possibility is that the impact of early-onset versus late-onset AD is just different for asthma than it is for seasonal allergies.”
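A note on the arithmetic for readers parsing these statistics: the “X% lower risk” figures quoted throughout are simply 1 minus the adjusted relative risk, rounded to the nearest percent. A minimal sketch, using the RRs reported above:

```python
# The "X% lower risk" figures in the text are 1 minus the adjusted
# relative risk (RR); e.g., RR = 0.85 corresponds to 15% lower risk.

def pct_lower(rr):
    """Convert a relative risk into a rounded 'percent lower' figure."""
    return round((1 - rr) * 100)

# Adjusted RRs for the mid- and late-onset groups vs. the early-onset reference:
print(pct_lower(0.85), pct_lower(0.71))  # asthma prevalence: 15% and 29% lower
print(pct_lower(0.96), pct_lower(0.92))  # asthma incidence: 4% and 8% lower
print(pct_lower(0.82), pct_lower(0.64))  # incident allergies: 18% and 36% lower
```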

She acknowledged certain limitations of the study, including the risk of misclassification bias and limitations in recall with self-reported data, and the fact that the findings may not be generalizable to all patients with AD.

“Future studies with longer follow-up and studies of adult-onset AD will help extend our findings,” she concluded. “Nevertheless, our findings may inform how we risk stratify patients for AD treatment or atopic march prevention efforts in the future.”

PEER is funded by a grant from Valeant Pharmaceuticals, but Valeant had no role in this study. Dr. Wan reported having no financial disclosures. The study won an award at the meeting for best research presented by a dermatology resident or fellow.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

 

Progression through the “atopic march” varies by age of atopic dermatitis (AD) onset, and is more pronounced among patients aged two years and younger, results from a large, retrospective cohort study demonstrated.

Dr. Joy Wan

“The atopic march is characterized by a progression from atopic dermatitis, usually early in childhood, to subsequent development of allergic rhinitis and asthma, lead study author Joy Wan, MD, said at the annual meeting of the Society for Pediatric Dermatology. “It is thought that the skin acts as the site of primary sensitization through a defective epithelial barrier, which then allows for allergic sensitization to occur in the airways. It is estimated that 30%-60% of AD patients go on to develop asthma and/or allergic rhinitis. However, not all patients complete the so-called atopic march, and this variation in the risk of asthma and allergic rhinitis among AD patients is not very well understood. Better ways to risk stratify these patients are needed.”

One possible explanation for this variation in the risk of atopy in AD patients could be the timing of their dermatitis onset. “We know that atopic dermatitis begins in infancy, but it can start at any age,” said Dr. Wan, who is a fellow in the section of pediatric dermatology at the Children’s Hospital of Philadelphia. “There has been a distinction between early-onset versus late-onset AD. Some past studies have also suggested that there is an increased risk of asthma and allergic rhinitis in children who have early-onset AD before the age of 1 or 2. This suggests that perhaps the model of the atopic march varies between early- and late-onset AD. However, past studies have had several limitations. They’ve often had short durations of follow-up, they’ve only examined narrow ranges of age of onset for AD, and most of them have been designed to primarily evaluate other exposures and outcomes, rather than looking at the timing of AD onset itself.”

For the current study, Dr. Wan and her associates set out to examine the risk of seasonal allergies and asthma among children with AD with respect to the age of AD onset. They used data from the Pediatric Eczema Elective Registry (PEER), an ongoing, prospective U.S. cohort of more than 7,700 children with physician-confirmed AD (JAMA Dermatol. 2014 Jun;150:593-600). All registry participants had used pimecrolimus cream in the past, but children with lymphoproliferative disease were excluded from the registry, as were those with malignancy or those who required the use of systemic immunosuppression.

The researchers evaluated 3,966 subjects in PEER with at least 3 years of follow-up. The exposure of interest was age of AD onset, and they divided patients into three broad age categories: early onset (age 2 years or younger), mid onset (3-7 years), and late onset (8-17 years). Primary outcomes were prevalent seasonal allergies and asthma at the time of registry enrollment, and incident seasonal allergies and asthma during follow-up, assessed via patient surveys every 3 years.

The study population included high proportions of white and black children, and there was a slight predominance of females. The median age at PEER enrollment increased with advancing age of AD onset (5.2 years in the early-onset group vs. 8.2 years in the mid-onset group and 13.1 years in the late-onset group), while the duration of follow-up was fairly similar across the three groups (a median of about 8.3 years). Family history of AD was common across all three groups, while patients in the late-onset group tended to have better control of their AD, compared with their younger counterparts.

At baseline, the prevalence of seasonal allergies was highest among the early-onset group at 74.6%, compared with 69.9% among the mid-onset group and 70.1% among the late-onset group. After adjusting for sex, race, and age at registry enrollment, the relative risk for prevalent seasonal allergies was 9% lower in the mid-onset group (0.91) and 18% lower in the late-onset group (0.82), compared with those in the early-onset group. Next, Dr. Wan and her associates calculated the incidence of seasonal allergies among 1,054 patients who did not have allergies at baseline. The cumulative incidence was highest among the early-onset group (56.1%), followed by the mid-onset group (46.8%), and the late-onset group (30.6%). On adjusted analysis, the relative risk for seasonal allergies among patients who had no allergies at baseline was 18% lower in the mid-onset group (0.82) and 36% lower in the late-onset group (0.64), compared with those in the early-onset group.
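As a rough illustration of the arithmetic behind these comparisons (not the study’s adjusted model, which controlled for sex, race, and age at enrollment), an unadjusted relative risk is simply the prevalence in one group divided by the prevalence in the reference group. The sketch below uses only the baseline prevalences quoted above; note how the unadjusted ratios differ from the adjusted values (0.91 and 0.82) the investigators reported.

```python
def relative_risk(p_group: float, p_reference: float) -> float:
    """Unadjusted relative risk: prevalence (as a proportion) in one
    group divided by prevalence in the reference group."""
    return p_group / p_reference

# Prevalent seasonal allergies, early-onset group as the reference:
rr_mid = relative_risk(0.699, 0.746)   # mid-onset vs. early-onset
rr_late = relative_risk(0.701, 0.746)  # late-onset vs. early-onset
print(round(rr_mid, 2), round(rr_late, 2))
```

Because the unadjusted ratios are both close to 0.94, the lower adjusted estimates indicate that the covariates (particularly age at enrollment, which differed sharply across onset groups) accounted for part of the apparent difference.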

In the analysis of asthma risk by age of AD onset, prevalence was highest among patients in the early-onset group at 51.5%, compared with 44.7% among the mid-onset age group and 43% among the late-onset age group. On adjusted analysis, the relative risk for asthma was 15% lower in the mid-onset group (0.85) and 29% lower in the late-onset group (0.71), compared with those in the early-onset group. Meanwhile, the cumulative incidence of asthma among patients without asthma at baseline was also highest in the early-onset group (39.2%), compared with 31.9% in the mid-onset group and 29.9% in the late-onset group.


On adjusted analysis, the relative risk for asthma among this subset of patients was 4% lower in the mid-onset group (0.96) and 8% lower in the late-onset group (0.92), compared with those in the early-onset group, differences that were not statistically significant. “One possible explanation for this is that asthma tends to develop soon after AD does, and the rates of developing asthma later on, as detected by our study, are nondifferential,” Dr. Wan said. “Another possibility is that the impact of early-onset versus late-onset AD is just different for asthma than it is for seasonal allergies.”

She acknowledged certain limitations of the study, including the risk of misclassification bias and limitations in recall with self-reported data, and the fact that the findings may not be generalizable to all patients with AD.

“Future studies with longer follow-up and studies of adult-onset AD will help extend our findings,” she concluded. “Nevertheless, our findings may inform how we risk stratify patients for AD treatment or atopic march prevention efforts in the future.”

PEER is funded by a grant from Valeant Pharmaceuticals, but Valeant had no role in this study. Dr. Wan reported having no financial disclosures. The study won an award at the meeting for best research presented by a dermatology resident or fellow.

 


AT SPD 2018


Enhanced recovery initiative improved bariatric length of stay

Article Type
Changed
Wed, 01/02/2019 - 10:12

– Adopting a 28-point enhanced recovery protocol for bariatric surgery significantly reduced length of stay without significant effects on complications or readmissions, according to interim results of a large, nationwide surgical quality initiative.

Dr. Stacy A. Brethauer

Thirty-six centers participated in this pilot initiative, making it one of the largest national projects focused on enhanced recovery to date, according to Stacy A. Brethauer, MD, FACS, cochair of the Quality and Data Committee of the Metabolic and Bariatric Surgery Accreditation Quality Improvement Program (MBSAQIP).

The initiative, known as Employing New Enhanced Recovery Goals for Bariatric Surgery (ENERGY), was developed in light of “huge gaps in literature and knowledge” about what best practices of enhanced recovery should look like for bariatric surgery, Dr. Brethauer said in a podium presentation at the American College of Surgeons Quality and Safety Conference.

“Bariatric surgery is very pathway driven, but the pathway can be very cumbersome and very antiquated if you don’t keep it up to date and evidence based,” said Dr. Brethauer, associate professor of surgery at the Cleveland Clinic.

Invitations to join in the ENERGY pilot were targeted to the 80 or so MBSAQIP-accredited centers in the top decile of programs for length of stay. “That’s the needle that we want to move,” Dr. Brethauer said.

ENERGY includes interventions in the preoperative, perioperative, and postoperative setting for each patient who undergoes a primary band, lap sleeve, or lap bypass procedure.

The 36 participating centers were asked to document 28 discrete process measures, starting with “did the patient stop smoking before surgery?” and ending with “did the patient have a follow-up clinic appointment scheduled?” Each one was entered by a trained clinical reviewer. The program included monthly audits for each participating center.

Data collection started on July 1, 2017, and continued to June 30, 2018, following a 6-month run-up period to allow centers to incorporate the measures.

The interim analysis presented included 4,700 patients who underwent procedures in the first 6 months of the data collection period. Nearly 60% (2,790 patients) had a laparoscopic sleeve gastrectomy, while about 40% (1,896 patients) underwent laparoscopic gastric bypass, and 0.1% (6 patients) had a band procedure.

Average length of stay was 1.76 days in the first 6 months of the pilot, down from 2.24 days in 2016 for those same participating centers (P less than .001), Dr. Brethauer reported.

Similarly, the rate of extended length of stay was 4.4% in the first 6 months of the pilot, down from 8.2% in 2016. Extended length of stay decreased with increasing adherence to the protocol, Dr. Brethauer and his colleagues found in their analysis.

Those length-of-stay reductions were accomplished with no increase in bleeding rates, all-cause reoperation rates, or readmissions. “We’re not doing this at the expense of other complications,” Dr. Brethauer said in a comment on the results.

Adherence to the 28 ENERGY measures increased from 26% in the first month of the pilot to 80.2% in March 2018, the latest month included in the interim analysis.

Opioid-sparing pain management strategies are incorporated into ENERGY. Over the first 6 months of the pilot, the average proportion of patients receiving no opioids postoperatively was 26.8%.

The ultimate goal of ENERGY is a large-scale rollout of enhanced recovery strategies, according to Dr. Brethauer.

ENERGY is the second national quality improvement project of the MBSAQIP. In the first, known as Decreasing Readmissions through Opportunities Provided (DROP), 128 U.S. hospitals implemented a set of standard processes organized into preoperative, inpatient, and postoperative care bundles. Results of a yearlong study of the DROP intervention demonstrated a significant reduction in 30-day all-cause hospital readmissions following sleeve gastrectomy.

“If you look at what’s happened in our specialty, and all the changes and all the work that’s been done, it’s really quite impressive,” Dr. Brethauer told attendees at the meeting. “It’s something that we’re very proud of.”

Dr. Brethauer reported disclosures related to Medtronic and Ethicon outside of the scope of this presentation.




REPORTING FROM ACSQSC 2018

Vitals

Key clinical point: An evidence-based enhanced recovery protocol reduced length of stay for bariatric surgery patients.

Major finding: Average length of stay was 1.76 days in the first 6 months of the pilot, down from 2.24 days in 2016 for those same participating centers.

Study details: Data on 36 bariatric surgery centers and 4,700 patients who underwent procedures in the first 6 months of the data collection period.

Disclosures: Dr. Brethauer reported disclosures related to Medtronic and Ethicon outside of the scope of this presentation.


HIV infection linked with doubled stroke rate

Article Type
Changed
Tue, 07/21/2020 - 14:18

– Patients infected with HIV had a roughly doubled rate of stroke compared with the general population, based on analysis of data collected from about 23 million people in Taiwan – including nearly 6,000 with HIV infection.

Mitchel L. Zoler/MDedge News

The combined rate of ischemic and hemorrhagic stroke was especially elevated during the first 7 years after HIV diagnosis, with a nearly 47-fold increased incidence rate during the first year following diagnosis compared with the age- and sex-adjusted general population, and a nearly fourfold increased rate during years 1-7, both statistically significant differences, Hui-Lin Lin, MD, reported in a poster at the 22nd International AIDS Conference.



Once the period studied reached 8 or more years out from the time of HIV diagnosis, infected patients showed no statistically significant difference in their stroke rate compared with the general population. For the entire follow-up, the stroke rate was 94% higher among HIV-infected patients compared with the general population, a statistically significant difference.

HIV-infected patients who had a stroke also developed it at a much younger age than did the general population. The median age of the infected patients with a stroke was 49 years, compared with a median age of 70 years in the general population.

These findings “highlight the importance” of both screening for stroke risk factors and then starting management steps to minimize these risk factors “immediately and aggressively” after HIV infection is first identified, said Dr. Lin, a researcher at Lin Shin Hospital in Taichung, Taiwan.

Dr. Lin and an associate used data collected by the national health system of Taiwan for about 23 million residents who received care during 1998-2005. They identified 5,961 patients first diagnosed with an HIV infection during this period who had no history of stroke, and 22,307,214 people managed in the health system during the same time with no diagnosed HIV infection and no stroke history. The researchers then followed the stroke incidence among these people through 2011.

Following their HIV diagnosis, those with an infection had 59 ischemic strokes, 29 hemorrhagic strokes, and 15 strokes of undetermined type during an average follow-up of about 8 years. The researchers compared these incidence rates with the general population using standardized incidence ratios (SIR) that adjusted for age, sex, and duration of follow-up. The median age of those diagnosed with an HIV infection was 32 years, while the median age of the 22 million uninfected residents was 34 years.
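A standardized incidence ratio is the number of observed events divided by the number expected if the cohort had experienced the reference population’s stratum-specific rates. The sketch below shows the mechanics with entirely made-up stratum rates and person-years; none of these numbers come from the Taiwanese data.

```python
# Hypothetical age strata for an HIV cohort:
# (person-years of follow-up, reference-population stroke rate per 100,000 PY)
strata = [
    (20_000, 30.0),   # aged <45 years
    (8_000, 120.0),   # aged 45-65 years
    (2_000, 400.0),   # aged >65 years
]
observed = 41  # hypothetical stroke count observed in the cohort

# Expected count: sum over strata of person-years times the reference rate.
expected = sum(py * rate / 100_000 for py, rate in strata)
sir = observed / expected
print(f"expected={expected:.1f}, SIR={sir:.2f}")
```

Stratifying the expected count this way is what makes the ratio “standardized”: each age (or sex) stratum of the cohort is compared against the matching stratum of the general population, so a young cohort is not penalized for simply being young.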

The SIR analysis showed the excess rate of strokes was particularly significant among HIV-infected patients aged 65 or younger at the time of their infection diagnosis, and was more elevated in women than in men. Among infected men aged younger than 45 years, the SIR for any stroke was more than twice as high as in similarly aged men in the general population, and more than five times as high among infected women compared with similarly aged women in the general population. Among people infected with HIV when they were aged 45-65 years, the SIR for any stroke was 59% higher among men and three times as high among women. Both men and women who were first diagnosed with an HIV infection when they were aged older than 65 years did not have a significantly different SIR for stroke than the general population.

The researchers did not report which antiretroviral therapy the HIV-infected patients received, their immune status, or how the level of infection control in infected patients related to their stroke rate.


SOURCE: Lin H-L et al. AIDS 2018, Abstract TUPDB0105.


– Patients infected with HIV had a roughly doubled rate of stroke compared with the general population, based on analysis of data collected from about 23 million people in Taiwan – including nearly 6,000 with HIV infection.

Mitchel L. Zoler/MDedge News


– Patients infected with HIV had a roughly doubled rate of stroke compared with the general population, based on analysis of data collected from about 23 million people in Taiwan – including nearly 6,000 with HIV infection.

Mitchel L. Zoler/MDedge News

The combined rate of ischemic and hemorrhagic stroke was especially elevated during the first 7 years after HIV diagnosis: a nearly 47-fold increased incidence rate during the first year following diagnosis and a nearly fourfold increased rate during years 1-7, compared with the age- and sex-adjusted general population, both statistically significant differences, Hui-Lin Lin, MD, reported in a poster at the 22nd International AIDS Conference.



At 8 or more years after HIV diagnosis, infected patients no longer showed a statistically significant difference in stroke rate compared with the general population. Over the entire follow-up, the stroke rate was 94% higher among HIV-infected patients than in the general population, a statistically significant difference.

HIV-infected patients who had a stroke also developed it at a much younger age than did the general population. The median age of the infected patients with a stroke was 49 years, compared with a median age of 70 years in the general population.

These findings “highlight the importance” of both screening for stroke risk factors and then starting management steps to minimize these risk factors “immediately and aggressively” after HIV infection is first identified, said Dr. Lin, a researcher at Lin Shin Hospital in Taichung, Taiwan.

Dr. Lin and an associate used data collected by the national health system of Taiwan for about 23 million residents who received care during 1998-2005. They identified 5,961 patients first diagnosed with an HIV infection during this period who had no history of stroke, and 22,307,214 people managed in the health system during the same time with no diagnosed HIV infection and no stroke history. The researchers then followed the stroke incidence among these people through 2011.

Following their HIV diagnosis, those with an infection had 59 ischemic strokes, 29 hemorrhagic strokes, and 15 strokes of undetermined type during an average follow-up of about 8 years. The researchers compared these incidence rates with the general population using standardized incidence ratios (SIR) that adjusted for age, sex, and duration of follow-up. The median age of those diagnosed with an HIV infection was 32 years, while the median age of the 22 million uninfected residents was 34 years.
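
A standardized incidence ratio is simply observed cases divided by the cases expected if the cohort had experienced the reference population's stratum-specific rates over its person-time. A minimal sketch of that calculation (the stratum rates and person-years below are invented for illustration; only the observed stroke count of 103, the sum of the 59 ischemic, 29 hemorrhagic, and 15 undetermined strokes, comes from the study):

```python
# Standardized incidence ratio (SIR): observed cases / expected cases,
# where expected cases apply reference-population rates to the cohort's
# person-time in each age/sex stratum. All strata below are hypothetical;
# only the shape of the calculation follows the method described.

def expected_cases(strata):
    """strata: list of (person_years, reference_rate_per_100k)."""
    return sum(py * rate / 100_000 for py, rate in strata)

# Hypothetical age/sex strata for an HIV cohort followed ~8 years:
strata = [
    (30_000, 25.0),   # men <45: 30,000 person-years, 25 strokes/100k/yr
    (12_000, 120.0),  # men 45-65
    (8_000, 15.0),    # women <45
    (3_000, 90.0),    # women 45-65
]

observed = 103  # 59 ischemic + 29 hemorrhagic + 15 undetermined (from the study)
expected = expected_cases(strata)
print(f"expected={expected:.1f}, SIR={observed / expected:.2f}")
```

An SIR above 1 indicates more events than the age- and sex-matched general population would predict; the study's SIR analysis additionally adjusted for duration of follow-up.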

The SIR analysis showed that the excess stroke rate was concentrated among HIV-infected patients aged 65 years or younger at the time of their infection diagnosis, and was more pronounced in women than in men. Among infected men younger than 45 years, the SIR for any stroke was more than double that of similarly aged men in the general population; among infected women younger than 45 years, it was more than five times as high as in similarly aged women. Among people first diagnosed with HIV at ages 45-65 years, the SIR for any stroke was 59% higher in men and three times as high in women. Men and women first diagnosed with an HIV infection after age 65 had no significantly different SIR for stroke than the general population.

The researchers did not report which antiretroviral therapy the HIV-infected patients received, their immune status, or how the level of infection control related to their stroke rate.


SOURCE: Lin H-L et al. AIDS 2018, Abstract TUPDB0105.


REPORTING FROM AIDS 2018

Vitals

Key clinical point: Strokes occurred significantly more often in HIV-infected patients than in the general population.

Major finding: The stroke incidence was 94% higher among HIV-infected patients than in the general population.

Study details: A review of health insurance records from about 23 million people in Taiwan during 1998-2011.

Disclosures: Dr. Lin had no disclosures.

Source: Lin H-L et al. AIDS 2018, Abstract TUPDB0105.


Less is more: Nanotechnology enhances antifungal’s efficacy


The use of nanotechnology significantly reduced the amount of efinaconazole needed to effectively treat nail fungus in a study that pitted nitric oxide–releasing nanoparticles combined with the antifungal against reference strains of Trichophyton rubrum.

Efinaconazole has demonstrated effectiveness as a topical treatment for T. rubrum, but treatment can be expensive, with a single 4-mL bottle costing $691 at a major chain pharmacy, wrote Caroline B. Costa-Orlandi, PhD, of Universidade Estadual Paulista, Sao Paulo, Brazil, and her colleagues.

In a study published in the Journal of Drugs in Dermatology, an international research team evaluated topical efinaconazole and topical terbinafine, each combined with previously characterized nitric oxide–releasing nanoparticles (NO-np), in a checkerboard design against two reference strains of T. rubrum, ATCC MYA-4438 and ATCC 28189. NO-np was combined with 10% efinaconazole or with terbinafine.

The combination of NO-np and efinaconazole reduced the minimum inhibitory concentration (MIC) of efinaconazole 16-fold against ATCC MYA-4438 and 4-fold against ATCC 28189, compared with efinaconazole alone. With NO-np plus terbinafine, MICs against ATCC 28189 and ATCC MYA-4438 were reduced fourfold and twofold, respectively, compared with terbinafine alone. These data follow recently published findings, cited by the authors, showing that NO-np is superior to topical terbinafine 1% cream in clearing infection in a mouse model of deep dermal dermatophytosis, suggesting that the combination may be even more effective (Nanomedicine. 2017 Oct;13[7]:2267-70).
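
In checkerboard assays, potentiation is typically summarized as the fold-reduction in MIC: the MIC of the drug alone divided by its MIC in the combination. A quick arithmetic sketch (the absolute MIC values below are hypothetical; only the 16-fold and 4-fold reductions match those reported in the article):

```python
# Fold-reduction in minimum inhibitory concentration (MIC) when a drug
# is combined with a potentiator such as NO-releasing nanoparticles.
# The MIC values are invented for illustration; the resulting 16x and 4x
# fold-reductions mirror the article's reported figures.

def fold_reduction(mic_alone, mic_combined):
    return mic_alone / mic_combined

# Hypothetical efinaconazole MICs (ug/mL) against the two strains:
print(fold_reduction(0.064, 0.004))  # vs ATCC MYA-4438 -> 16.0
print(fold_reduction(0.064, 0.016))  # vs ATCC 28189    -> 4.0
```

The larger the fold-reduction, the less drug is needed for the same inhibitory effect, which is the basis for the cost argument the authors make.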

“What we found was that we could impart the same antifungal activity at the highest concentrations tested of either alone by combining them at a fraction of these concentrations,” corresponding author Adam Friedman, MD, professor of dermatology, George Washington University, Washington, said in a press release issued by the university. The impact of this combination, “which we visualized using electron microscopy as compared to either product alone, highlighted their synergistic damaging effects at concentrations that would be completely safe to human cells,” he added.

Other benefits of NO-np include low cost, safety, ease of use, reduced likelihood for the development of antimicrobial resistance, and proven efficacy against other dermatophyte infections, the researchers noted.

The findings support the potential value of further research to evaluate nanoparticles combined with topical antifungals in a clinical setting, they said.

Dr. Costa-Orlandi had no financial conflicts to disclose. Authors Adam Friedman, MD, and Joel Friedman, MD, are coinventors of the nitric oxide–releasing nanoparticles used in the study. Dr. Adam Friedman is on the advisory board of Dermatology News.
 

SOURCE: Costa-Orlandi C et al. J Drugs Dermatol. 2018;17(7):717-20.




FROM JOURNAL OF DRUGS IN DERMATOLOGY

Vitals

Key clinical point: Adding nanoparticles to antifungal medication improved the drug’s effectiveness and reduced the amount needed.

Major finding: Efinaconazole combined with nitric oxide–releasing nanoparticles reduced the antifungal’s minimum inhibitory concentration 16-fold, compared with the antifungal alone against T. rubrum reference strains.

Study details: The data come from an in vitro analysis of nanoparticle-enhanced efinaconazole or terbinafine against T. rubrum.

Disclosures: Dr. Costa-Orlandi had no financial conflicts to disclose. Coauthors Dr. Adam Friedman and Dr. Joel Friedman are coinventors of the nitric oxide–releasing nanoparticles used in the study.

Source: Costa-Orlandi C et al. J Drugs Dermatol. 2018;17(7):717-20.


Real-time microarrays can simultaneously detect HCV and HIV-1, -2 infections


The use of TaqMan Array Card (TAC) microarrays has been extended to permit simultaneous detection of HIV-1, HIV-2, and five hepatitis viruses from a small amount of extracted nucleic acid, according to a study by Timothy C. Granade, MD, and his colleagues at the Centers for Disease Control and Prevention, Atlanta.


This is particularly important for dealing with HIV-infected individuals, because HIV-1 and HIV-2 require different treatment interventions, and approximately one-third of HIV-infected patients have been found to be coinfected with hepatitis C or hepatitis B, according to the study report, published in the Journal of Virological Methods (J Virol Methods. 2018 Sep;259:60-5).

HIV-1-positive plasma samples representing a variety of subtypes, as well as whole blood specimens, were confirmed as HIV-1 infected serologically or by nucleic acid amplification methods. HIV-2 whole blood and plasma specimens were also obtained.

TAC cards contained one positive control, one negative control, three HIV-1 replicates, and two HIV-2 replicates. In addition, the five common hepatitis viruses (A-E) were each replicated three times on each card. The cards were used to test the RNA isolates obtained from the various samples.

Ninety-five of the 104 known HIV-1-positive specimens tested positive using TAC, and 23 of 26 HIV-2-seeded specimens were detected; no cross-reactivity was seen between HIV-1-positive and HIV-2-positive specimens.
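
Those counts translate directly into per-target detection rates (detected specimens divided by known positives). A quick check of the arithmetic, using only the numbers reported in the study:

```python
# Per-target detection rate of the TAC assay, from the reported counts:
# 95 of 104 HIV-1-positive and 23 of 26 HIV-2-seeded specimens detected.

def detection_rate(detected, total):
    return detected / total

print(f"HIV-1: {detection_rate(95, 104):.1%}")  # -> HIV-1: 91.3%
print(f"HIV-2: {detection_rate(23, 26):.1%}")   # -> HIV-2: 88.5%
```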

Eighteen of the HIV-1-positive specimens were also reactive in triplicate for HCV; three of the HIV-1-positive specimens were reactive to HBV and one specimen was reactive to HIV-1, HBV, and HCV.

“The TAC assay could be invaluable in large-scale screening environments or in surveying local outbreaks such as the recent HIV cluster found in Indiana. Many of these individuals were later determined to be infected with hepatitis C. The use of TAC could shorten the time to identifying and confirming such cases and permit the detection of multiple blood-borne infections in a single test. Application of TAC technology to general population surveillance could identify problem areas for both HIV prevention and intervention efforts in a variety of global environs,” the researchers concluded.

The authors were employed by the Centers for Disease Control and Prevention, Atlanta, which funded the study.




FROM THE JOURNAL OF VIROLOGICAL METHODS


Federal Health Care Data Trends: Respiratory Disorders


Asthma and chronic obstructive pulmonary disease (COPD), the latter comprising a group of chronic, slowly progressive respiratory disorders that includes emphysema and chronic bronchitis, are prevalent in the active-duty and veteran populations. Although chronic and manageable, asthma, COPD, and other respiratory diseases represent a significant disease burden. Women tend to develop COPD at younger ages and have more exacerbations, yet receive fewer inhaler medications and less appropriate therapy. Many respiratory diseases not only carry a risk of mortality; evidence also suggests an increased risk of developing lung cancer.




Fed Pract. 2018 July;35(5):S30-S31

Treatments, disease affect spermatogonia in boys


Image from Dreamstime
Male germinal epithelium showing spermatogonia, spermatocytes, spermatids, and spermatozoa

Alkylating agents, hydroxyurea (HU), and certain non-malignant diseases can significantly deplete spermatogonial cell counts in young boys, according to research published in Human Reproduction.

Boys who received alkylating agents to treat cancer had significantly lower spermatogonial cell counts than control subjects or boys with malignant or nonmalignant diseases treated with non-alkylating agents.

Five of 6 sickle cell disease (SCD) patients treated with HU had a totally depleted spermatogonial pool, and the remaining patient had a low spermatogonial cell count.

Five boys with non-malignant diseases who were not exposed to chemotherapy had significantly lower spermatogonial cell counts than controls.

“Our findings of a dramatic decrease in germ cell numbers in boys treated with alkylating agents and in sickle cell disease patients treated with hydroxyurea suggest that storing frozen testicular tissue from these boys should be performed before these treatments are initiated,” said study author Cecilia Petersen, MD, PhD, of Karolinska Institutet and University Hospital in Stockholm, Sweden.

“This needs to be communicated to physicians as well as patients and their parents or carers. However, until sperm that are able to fertilize eggs are produced from stored testicular tissue, we cannot confirm that germ cell quantity might determine the success of transplantation of the tissue in adulthood. Further research on this is needed to establish a realistic fertility preservation technique.”

Dr Petersen and her colleagues also noted that preserving testicular tissue may not be a viable option for boys who have low spermatogonial cell counts prior to treatment.

Patients and controls

For this study, the researchers analyzed testicular tissue from 32 boys facing treatments that carried a high risk of infertility—testicular irradiation, chemotherapy, or radiotherapy in advance of stem cell transplant.

Twenty boys had the tissue taken after initial chemotherapy, and 12 had it taken before starting any treatment [1].

Eight patients had received chemotherapy with non-alkylating agents, 6 (all with malignancies) had received alkylating agents, and 6 (all with SCD) had received HU.

Diseases included acute lymphoblastic leukemia (n=6), SCD (n=6), acute myeloid leukemia (n=3), thalassemia major (n=3), neuroblastoma (n=2), juvenile myelomonocytic leukemia (n=2), myelodysplastic syndromes (n=2), primary immunodeficiency (n=2), Wilms tumor (n=1), adrenoleukodystrophy (n=1), hepatoblastoma (n=1), primitive neuroectodermal tumor (n=1), severe aplastic anemia (n=1), and Fanconi anemia (n=1).

The researchers compared samples from these 32 patients to 14 healthy testicular tissue samples stored in the biobank at the Karolinska University Hospital.

For both sample types, the team counted the number of spermatogonial cells found in a cross-section of seminiferous tubules.

“We could compare the number of spermatogonia with those found in the healthy boys as a way to estimate the effect of medical treatment or the disease itself on the future fertility of a patient,” explained study author Jan-Bernd Stukenborg, PhD, of Karolinska Institutet and University Hospital.

Impact of treatment

There was no significant difference in the mean quantity of spermatogonia per transverse tubular cross-section (S/T) between patients exposed to non-alkylating agents (1.7 ± 1.0, n=8) and biobank controls (4.1 ± 4.6, n=14).

However, samples from patients who received alkylating agents had a significantly lower mean S/T value (0.2 ± 0.3, n=6) than samples from patients treated with non-alkylating agents (P=0.003) and biobank controls (P<0.001).

“We found that the numbers of germ cells present in the cross-sections of the seminiferous tubules were significantly depleted and close to 0 in patients treated with alkylating agents,” Dr Stukenborg said.

Samples from the SCD patients also had a significantly lower mean S/T value (0.3 ± 0.6, n=6) than biobank controls (P=0.003).

Dr Stukenborg noted that the germ cell pool was totally depleted in 5 of the boys with SCD, and the pool was “very low” in the sixth SCD patient.

“This was not seen in patients who had not started treatment or were treated with non-alkylating agents or in the biobank tissues,” Dr Stukenborg said [2].

He and his colleagues noted that it is possible for germ cells to recover to normal levels after treatment that is highly toxic to the testes, but high doses of alkylating agents and radiotherapy to the testicles are strongly associated with permanent or long-term infertility.

“The first group of boys who received bone marrow transplants are now reaching their thirties,” said study author Kirsi Jahnukainen, MD, PhD, of Helsinki University Central Hospital in Finland.

“Recent data suggest they may have a high chance of their sperm production recovering, even if they received high-dose alkylating therapies, so long as they had no testicular irradiation.”

Impact of disease

The researchers also found evidence to suggest that, for some boys, their disease may have affected spermatogonial cell counts before any treatment began.

Five patients with non-malignant disease who had not been exposed to chemotherapy (3 with thalassemia major, 1 with Fanconi anemia, and 1 with primary immunodeficiency) had a significantly lower mean S/T value (0.4 ± 0.5) than controls (P=0.006).

“Among patients who had not been treated previously with chemotherapy, there were several boys with a low number of germ cells for their age,” Dr Jahnukainen said.

“This suggests that some non-malignant diseases that require bone marrow transplants may affect the fertility of young boys even before exposure to therapy that is toxic for the testes.”

The researchers noted that a limitation of this study was that biobank samples had no detailed information regarding previous medical treatments and testicular volumes.

1. Testicular tissue is taken from patients under general anesthesia. The surgeon removes approximately 20% of the tissue from the testicular capsule in one of the testicles. For this study, a third of the tissue was taken to the Karolinska Institutet for analysis.

2. A recent meta-analysis showed that normal testicular tissue samples of newborns contain approximately 2.5 germ cells per tubular cross-section. This number decreases to approximately 1.2 within the first 3 years of age, followed by an increase up to 2.6 germ cells per tubular cross-section at 6 to 7 years, reaching a plateau until the age of 11. At the onset of puberty, an increase of up to 7 spermatogonia per tubular cross-section could be observed.




Fitness trackers help monitor cancer patients

Article Type
Changed
Wed, 07/25/2018 - 00:02

Photo from Cedars-Sinai
Fitness trackers

A small study suggests fitness trackers can be used to assess the quality of life and daily functioning of cancer patients during treatment.

Results indicated that objective data collected from these wearable activity monitors can supplement current assessments of health status and physical function.

This is important because current assessments are limited by their subjectivity and potential for bias, according to Gillian Gresham, PhD, of Cedars-Sinai Medical Center in Los Angeles, California.

Dr Gresham and her colleagues conducted this study and reported the results in npj Digital Medicine.

“One of the challenges in treating patients with advanced cancer is obtaining ongoing, timely, objective data about their physical status during therapy,” said study author Andrew Hendifar, MD, of Cedars-Sinai.

“After all, patients typically spend most of their time at home or work, not in a clinic, and their health statuses change from day to day.”

With this in mind, the researchers studied 37 patients undergoing treatment for advanced cancer at Cedars-Sinai.

The patients wore wrist-mounted fitness trackers throughout the study except when showering or swimming. These devices log the wearer’s step counts, stairs climbed, calories, heart rate, and sleep.

Sets of activity data were collected for 3 consecutive visits during treatment. After the final clinical visit, patients were followed for 6 months to gather additional clinical and survival outcomes.

The researchers compared data from the trackers with patients’ assessments of their own symptoms, including pain, fatigue, and sleep quality, as collected from a National Institutes of Health questionnaire.

These data sets were also compared with Eastern Cooperative Oncology Group Performance Status (ECOG-PS) and Karnofsky Performance Status (KPS) scores.

Results

Patients had a median age of 62 (range, 34-81), about 54% were male, and most (73%) had pancreatic cancer. On average, the patients walked 3700 steps (1.7 miles) per day, climbed 3 flights of stairs per day, and slept 8 hours per night.

The researchers found that activity metrics correlated with ECOG-PS and KPS scores. As scores increased, daily steps and flights of stairs decreased.

The team said the largest correlation coefficients (r) were observed between average steps and increasing ECOG-PS (r=0.63, P<0.01) and KPS (r=0.69, P<0.01) scores.

Patient-reported outcomes also correlated with activity metrics. Average steps were significantly (P<0.05 for all) associated with physical functioning (r=0.57), pain (r=-0.46), and fatigue (r=-0.53). There were significant associations for distance walked and stairs climbed as well.
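A correlation like those reported above can be sketched in a few lines. The step counts and fatigue scores below are invented, and Pearson's r is used purely as an example (the article does not specify the correlation method):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical daily averages: steps vs. a fatigue score where higher
# means worse fatigue, so we expect a negative correlation.
steps = [2000, 3500, 5000, 6500, 8000]
fatigue = [8, 7, 5, 4, 2]

print(round(pearson_r(steps, fatigue), 2))
```

With these made-up values the result is strongly negative, mirroring the direction (though not the magnitude) of the fatigue correlation reported in the study.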

Finally, the researchers observed an association between activity and grade 3/4 adverse events, hospitalizations, and survival.

An increase of 1000 steps per day, on average, was associated with significantly lower odds of hospitalization (odds ratio: 0.21, 95% CI 0.56, 0.79) and grade 3/4 adverse events (odds ratio: 0.34, 95% CI 0.13, 0.94) as well as increased survival (hazard ratio: 0.48, 95% CI 0.28, 0.83).
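Under a logistic model, a per-1000-step odds ratio compounds multiplicatively with each additional 1000 steps. This sketch only illustrates that arithmetic, using the article's hospitalization figure:

```python
def odds_multiplier(extra_steps, or_per_1000=0.21):
    """Multiplicative change in the odds of the outcome for
    `extra_steps` more steps per day, given an odds ratio per
    1000-step increase (default: the article's hospitalization OR)."""
    return or_per_1000 ** (extra_steps / 1000)

print(round(odds_multiplier(1000), 2))  # 0.21
print(round(odds_multiplier(2000), 4))  # 0.0441
```

That is, under this model, two extra thousand steps per day multiplies the odds by 0.21 twice; this is an interpretive illustration, not an analysis the study performed.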

“Data gathered through advancements in technology has the potential to help physicians measure the impact of a particular treatment on a patient’s daily functioning,” Dr Gresham said. “Furthermore, continuous activity monitoring may help predict and monitor treatment complications and allow for more timely and appropriate interventions.”

As a next step, the researchers plan to study long-term use of activity monitors in a larger, more diverse group of advanced cancer patients and correlate that data with clinical and self-reported outcomes.

“Our hope is that findings from future studies with wearable activity monitors could lead to development of individualized treatment and exercise plans that may result in increased treatment tolerability and improved survival outcomes for patients,” Dr Hendifar said.




NIH aims to improve access to cloud computing

Article Type
Changed
Wed, 07/25/2018 - 00:01

Photo by Darren Baker
Researcher at a computer

The National Institutes of Health (NIH) is attempting to improve biomedical researchers’ access to cloud computing.

With its new STRIDES* initiative, the NIH intends to establish partnerships with commercial cloud service providers (CSPs) to reduce economic and technological barriers to accessing and computing on large biomedical data sets.

The CSPs will work with the NIH and its funded researchers to develop and test new ways to make large data sets and associated computational tools available to wider audiences.

The NIH’s initial efforts with the STRIDES initiative will focus on making NIH high-value data sets more accessible through the cloud, leveraging partnerships with CSPs to take advantage of data-related innovations such as machine learning and artificial intelligence, and experimenting with new ways to optimize technology-intensive research.

The goals of the STRIDES initiative are to:

  • Support researchers’ transition to conducting biomedical research using commercial cloud technologies through cost-effective storage and computing arrangements with CSPs
  • Provide NIH researchers access to and training on new and emerging cloud-based tools and services
  • Facilitate researchers’ access to and use of high-value NIH research data that are currently stored on, or will be moved into, cloud environments
  • Enable the formation of an interconnected ecosystem that breaks down silos related to generating, analyzing, and sharing research data.

The NIH has already partnered with Google Cloud for the STRIDES initiative, but the agency hopes to create partnerships with other CSPs as well.

“NIH is in a unique position to bring together academic and innovation industry partners to create a biomedical data ecosystem that maximizes the use of NIH-supported biomedical research data for the greatest benefit to human health,” said NIH Principal Deputy Director Lawrence A. Tabak, DDS, PhD.

The NIH says its agreement with Google Cloud creates a cost-efficient framework for NIH researchers, as well as researchers receiving NIH support, to make use of Google Cloud’s storage, computing, and machine learning technologies.

The partnership will also enable the creation of training programs on the Google Cloud platform for researchers at NIH-funded institutions, and it will involve collaboration with NIH’s Data Commons Pilot, a group of projects testing new tools and methods for working with and sharing data in the cloud.

“Through our partnership with NIH, we are bringing the power of data and the cloud to the biomedical research community globally,” said Gregory Moore, MD, PhD, vice-president of healthcare at Google Cloud.

“Together, we are making it easier for scientists and physicians to access and garner insights from NIH-funded data sets with appropriate privacy protections, which will ultimately accelerate biomedical research progress toward finding treatments and cures for the most devastating diseases of our time.”

A central tenet of STRIDES is that data made available through these partnerships will incorporate standards endorsed by the biomedical research community to make data findable, accessible, interoperable, and reusable.

*Science and Technology Research Infrastructure for Discovery, Experimentation, and Sustainability

Publications
Topics

Photo by Darren Baker
Researcher at a computer

The National Institutes of Health (NIH) is attempting to improve biomedical researchers’ access to cloud computing.

With its new STRIDES* initiative, the NIH intends to establish partnerships with commercial cloud service providers (CSPs) to reduce economic and technological barriers to accessing and computing on large biomedical data sets.

The CSPs will work with the NIH and its funded researchers to develop and test new ways to make large data sets and associated computational tools available to wider audiences.

The NIH’s initial efforts with the STRIDES initiative will focus on making NIH high-value data sets more accessible through the cloud, leveraging partnerships with CSPs to take advantage of data-related innovations such as machine learning and artificial intelligence, and experimenting with new ways to optimize technology-intensive research.

The goals of the STRIDES initiative are to:

  • Support researchers’ transition to conducting biomedical research using commercial cloud technologies through cost-effective storage and computing arrangements with CSPs
  • Provide NIH researchers access to and training on new and emerging cloud-based tools and services
  • Facilitate researchers’ access to and use of high-value NIH research data that are currently stored on, or will be moved into, cloud environments
  • Enable the formation of an interconnected ecosystem that breaks down silos related to generating, analyzing, and sharing research data.

The NIH has already partnered with Google Cloud for the STRIDES initiative, but the agency hopes to create partnerships with other CSPs as well.

“NIH is in a unique position to bring together academic and innovation industry partners to create a biomedical data ecosystem that maximizes the use of NIH-supported biomedical research data for the greatest benefit to human health,” said NIH Principal Deputy Director Lawrence A. Tabak, DDS, PhD.

The NIH says its agreement with Google Cloud creates a cost-efficient framework for NIH researchers, as well as researchers receiving NIH support, to make use of Google Cloud’s storage, computing, and machine learning technologies.

The partnership will also enable training programs for researchers at NIH-funded institutions on how to use the Google Cloud platform, and it will involve collaboration with NIH’s Data Commons Pilot—a group of projects testing new tools and methods for working with and sharing data in the cloud.

“Through our partnership with NIH, we are bringing the power of data and the cloud to the biomedical research community globally,” said Gregory Moore, MD, PhD, vice president of healthcare at Google Cloud.

“Together, we are making it easier for scientists and physicians to access and garner insights from NIH-funded data sets with appropriate privacy protections, which will ultimately accelerate biomedical research progress toward finding treatments and cures for the most devastating diseases of our time.”

A central tenet of STRIDES is that data made available through these partnerships will incorporate standards endorsed by the biomedical research community to make data findable, accessible, interoperable, and reusable (the FAIR principles).
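As a rough illustration of what FAIR-aligned metadata can mean in practice, the sketch below checks a dataset record for the field each principle implies: a persistent identifier (findable), a resolvable access URL (accessible), a community-standard format (interoperable), and an explicit license (reusable). The record structure and field names here are hypothetical, chosen for illustration—they are not an NIH, STRIDES, or Google Cloud schema.

```python
# Illustrative sketch: checking a dataset metadata record for FAIR-aligned
# fields. The field names below are hypothetical, not an official schema.

REQUIRED_FIELDS = {
    "identifier": "findable: a persistent, globally unique ID (e.g., a DOI)",
    "access_url": "accessible: a resolvable location for the data",
    "format": "interoperable: a community-standard data format",
    "license": "reusable: explicit terms of reuse",
}

def missing_fair_fields(record: dict) -> list[str]:
    """Return the FAIR-aligned fields that are absent or empty in a record."""
    return [field for field in REQUIRED_FIELDS if not record.get(field)]

example_record = {
    "identifier": "doi:10.0000/example-dataset",   # hypothetical DOI
    "access_url": "gs://example-bucket/dataset/",  # hypothetical cloud path
    "format": "CSV",
    "license": "CC-BY-4.0",
}

print(missing_fair_fields(example_record))     # []
print(missing_fair_fields({"format": "CSV"}))  # ['identifier', 'access_url', 'license']
```

In a real data ecosystem these checks would run against richer, community-endorsed metadata standards rather than a flat dictionary, but the principle is the same: each FAIR property maps to concrete, machine-checkable metadata.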

*Science and Technology Research Infrastructure for Discovery, Experimentation, and Sustainability
