One-time AMH level predicts rapid perimenopausal bone loss

BOSTON – Anti-Müllerian hormone levels strongly predict the rate of perimenopausal loss of bone mineral density and might help identify women who need early intervention to prevent future osteoporotic fractures, according to data from a review of 474 perimenopausal women that was presented at the annual meeting of the Endocrine Society.

The team matched anti-Müllerian hormone (AMH) levels and bone mineral density (BMD) measurements taken 2-4 years before the final menstrual period to BMD measurements taken 3 years later. The women were part of the Study of Women’s Health Across the Nation (SWAN), an ongoing multicenter study of women during their middle years.

When perimenopausal AMH “goes below 250 pg/mL, you are beginning to lose bone, and, when it goes below 200 pg/mL, you are losing bone fast, so that’s when you might want to intervene.” The finding “opens up the possibility of identifying women who are going to lose the most bone mass during the transition and targeting them before they have lost a substantial amount,” said lead investigator Dr. Arun Karlamangla of the department of geriatrics at the University of California, Los Angeles.

BMD loss is normal during menopause, but rates of decline vary among women. AMH, a product of ovarian granulosa cells, is commonly used in fertility clinics to gauge ovarian reserve; AMH levels also decline during menopause in a fairly stable fashion, he explained.

The women in SWAN were 42-52 years old at baseline with an intact uterus, at least one ovary, and no use of exogenous hormones. Blood was drawn during the early follicular phase of the menstrual cycle.

The median rate of BMD decline was 1.26% per year in the lumbar spine and 1.03% per year in the femoral neck. The median AMH was 49 pg/mL but varied widely.

After adjustment for age, body mass index, smoking, race, and study site, each 75% (fourfold) decrement in AMH level was associated with a 0.15% per year faster decline in spine BMD and a 0.13% per year faster decline in femoral neck BMD. Each fourfold decrement was also associated with an 18% increase in the odds of faster-than-median decline in spine BMD and a 17% increase in the odds of faster-than-median decline in femoral neck BMD. The fast losers lost more than 2% of their BMD per year in both the lumbar spine and the femoral neck.

The results were unchanged after adjustment for follicle-stimulating hormone (FSH) and estrogen levels, “so AMH provides information that cannot be obtained from estrogen and FSH,” Dr. Karlamangla said.

He cautioned that the technique needs further development and validation before it’s ready for the clinic. The team used the PicoAMH test from Ansh Labs in Webster, Tex.

The investigators had no disclosures. Ansh provided the assays for free. SWAN is funded by the National Institutes of Health.

[email protected]

A tremendous benefit

The current recommendation is to start bone mineral density screening in women at age 65 years. All of us who see patients in the menopause years worry that we are missing someone with faster than normal bone loss. Fast losers are critical to identify because if we wait until they are 65 years old, it’s too late. A clinical test such as this to identify fast losers for earlier BMD measurement would be a tremendous benefit.

Dr. Cynthia Stuenkel is a clinical professor of endocrinology at the University of California, San Diego. She moderated the presentation and was not involved in the research.

Article Source

AT ENDO 2016

Vitals

Key clinical point: Anti-Müllerian hormone levels strongly predict the rate of perimenopausal bone mineral density loss and might help identify women who need early intervention to prevent future osteoporotic fractures, according to a review of 474 perimenopausal women that was presented at the Endocrine Society annual meeting.

Major finding: Adjusted for age, body mass index, smoking, race, and study site, the team found that for each 75% (or fourfold) decrement in AMH level, there was a 0.15% per year faster decline in lumbar spine BMD and 0.13% per year faster decline in femoral neck BMD.

Data source: Review of 474 perimenopausal women in the Study of Women’s Health Across the Nation.

Disclosures: The investigators had no disclosures. Ansh Labs provided the assays for free. SWAN is funded by the National Institutes of Health.

FDA approves first drug to treat hallucinations in Parkinson’s psychosis

On April 29, pimavanserin became the first drug to receive Food and Drug Administration approval for the treatment of hallucinations and delusions associated with Parkinson’s disease psychosis.

The drug, to be marketed under the brand name Nuplazid by Acadia Pharmaceuticals, was shown in a 6-week clinical trial of 199 participants to be superior to placebo in decreasing the frequency and/or severity of hallucinations and delusions without worsening the primary motor symptoms of Parkinson’s disease.

The most common side effects reported from the clinical trial included peripheral edema, nausea, and confused state of mind.

“Nuplazid represents an important treatment for people with Parkinson’s disease who experience these symptoms,” said Dr. Mitchell Mathis, director of the Division of Psychiatry Products in the FDA’s Center for Drug Evaluation and Research.

The FDA gave pimavanserin a Boxed Warning in relation to the general increased risk of death associated with the use of atypical antipsychotic drugs to treat older people with dementia-related psychosis. No other atypical antipsychotic drug has been approved to treat patients with dementia-related psychosis.

Read the agency’s full statement on the FDA website.

[email protected]

The Elongated Dermatofibroma: A New Dermoscopic Variant?

To the Editor:

Dermatofibroma is a common cutaneous lesion that most frequently affects young or middle-aged adults, especially women.1 Clinically, it appears as a firm, pink or brown nodule. It may be painful or show a tendency for scarring. The pathognomonic feature of dermatofibroma, regarded as a fibrohistiocytic tumor, is the so-called button sign caused by skin depression following pressure. We present a unique case of elongated dermatofibroma with a linear, white, scarlike patch with a brownish pigmented network and globules.

A 40-year-old woman presented with a linear elongated lesion of 10 years’ duration localized to the right side of the infrascapular region. The lesion initially was a small brownish plaque; there was no history of trauma or scratching. Over the next 10 years, the lesion slowly progressed, finally becoming a linear, atrophic, brownish plaque 2.5 cm long (Figure 1). The button sign was positive. On dermoscopy, the central elongated white patch was visualized not as a typical round patch but as a scarlike white line (Figure 2A) surrounded by a brownish network that was especially pronounced in the distal parts of the lesion. In the upper part of the lesion, multiple marginally disseminated, dark brown dots were present. Brownish globules within the linear white patch also were observed in the lower central part. Figure 2B presents the dermoscopic picture of this linear variant of dermatofibroma. For cosmetic reasons, the patient underwent total surgical excision of the lesion. Histopathology revealed distinct characteristics of dermatofibroma (Figures 3A and 3B).

Figure 1. Macroscopic view of a linear white-brown plaque extending along the Blaschko line in the infrascapular region.

Figure 2. Dermoscopy of the elongated dermatofibroma revealed a linear scarlike structure in the upper part (A). Brownish globules within the linear white patch area also were observed in the lower central part of the lesion on dermoscopy (B).

Figure 3. Histopathology revealed dermatofibroma (A and B) (both H&E, original magnifications ×40 and ×100). A storiform pattern of spindled and bland fibroblasts and histiocytelike cells in the mid dermis and subcutaneous tissue was seen with infiltrative margins but sparing of the epidermis. Spindle cells had scant cytoplasm and thin elongated nuclei with pointed ends. Nuclei almost touched each other, unlike smooth muscle lesions.

The most common features of dermatofibromas seen in polarized and nonpolarized dermoscopy are central white scarlike patches, brown globulelike structures, vascular structures, and a peripheral fine pigmented network.2 Kilinc Karaarslan et al3 described atypical dermatofibromas with linear irregular crypts, which were seen in 26.9% of all studied cases. These irregular crypts were mainly medium in size (10 lesions), with only 2 lesions being tiny and regularly distributed. Only one lesion had atypical clinical and dermoscopic features, occurring as an atrophic plaque with multiple small scarlike areas and a peripherally distributed pigment network.3 Based on this typology, we believe our patient represents a case of elongated dermatofibroma that could be an atrophic variant of dermatofibroma. This form would not appear as a small scarlike area with pigment network in a somewhat patchy distribution3 but as a scarlike linear cord with a bipolar pigment network. Zaballos et al1 described 10 dermoscopic patterns of dermatofibroma (N=412); the most common was a central white patch and peripheral pigment network in approximately 35% of cases. A white scarlike patch was observed in 57.0% of dermatofibromas in 4 variants: (1) a solitary structure located in the center; (2) multiple white scarlike patches; (3) white scarlike patch extending throughout the lesion or irregularly distributed; and (4) white network (central, total, or irregular).1 Agero et al2 first described the new feature as a central white patch characterized by shiny white streaks. The most frequent dermoscopic pattern associated with dermatofibromas is the central white scarlike patch and peripheral delicate pigment network.1,4 Arpaia et al4 observed that dermoscopic patterns may correspond to distinct sequential stages of the formation of dermatofibroma. The linear character we described may be related to a variant of scarring keloid dermatofibroma.5

References
  1. Zaballos P, Puig S, Llambrich A, et al. Dermoscopy of dermatofibromas: a prospective morphological study of 412 cases. Arch Dermatol. 2008;144:75-83.
  2. Agero AL, Taliercio S, Dusza SW, et al. Conventional and polarized dermoscopy features of dermatofibroma. Arch Dermatol. 2006;142:1431-1437.
  3. Kilinc Karaarslan I, Gencoglan G, Akalin T, et al. Different dermoscopic faces of dermatofibromas. J Am Acad Dermatol. 2007;57:401-406.
  4. Arpaia N, Cassano N, Vena GA. Dermoscopic patterns of dermatofibroma. Dermatol Surg. 2005;31:1336-1339.
  5. Kuo TT, Hu S, Chan HL. Keloidal dermatofibroma: report of 10 cases of a new variant. Am J Surg Pathol. 1998;22:564-568.
Author and Disclosure Information

Dr. Kaminska-Winciorek is from the Department of Bone Marrow Transplantation and Onco-Hematology, The Maria Sklodowska-Curie Memorial Cancer Center and Institute of Oncology Gliwice Branch, Poland. Dr. Antosz is from Regional Specialist Hospital, Tychy, Poland. Dr. Spiewak is from the Department of Experimental Dermatology and Cosmetology, Faculty of Pharmacy, Jagiellonian University Medical College, Krakow, Poland.

The authors report no conflict of interest.

Correspondence: Grazyna Kaminska-Winciorek, MD, PhD, The Department of Bone Marrow Transplantation and Onco-Hematology, The Maria Sklodowska-Curie Memorial Cancer Center and Institute of Oncology Gliwice Branch, 44-101 Gliwice, Wybrzeze Armii Krajowej 15, Poland ([email protected]).

Issue
Cutis - 97(5)
Page Number
E1-E3
To the Editor:

Dermatofibroma is a common cutaneous lesion that most frequently affects young or middle-aged adults, especially women.1 Clinically, it appears as a firm, pink or brown nodule. It may be painful or show a tendency for scarring. The pathognomonic feature of dermatofibroma, regarded as a fibrohistiocytic tumor, is the so-called button sign caused by skin depression following pressure. We present a unique case of elongated dermatofibroma with a linear, white, scarlike patch with a brownish pigmented network and globules.

A 40-year-old woman presented with a linear elongated lesion localized to the right side of the infrascapular region of 10 years’ duration. The lesion initially was a small brownish plaque. There was no history of trauma or scratching. Over the next 10 years, the lesion slowly progressed, finally becoming a linear, atrophic, brownish plaque that was 2.5-cm long (Figure 1). The button sign was positive. On dermoscopy the central, elongated, white patch was visualized not as a typical round patch but as a scarlike white line (Figure 2A) surrounded by a brownish network that was especially pronounced in the distal parts of the lesion. In the upper part of the lesion, multiple marginally disseminated, dark brown dots were present. Brownish globules within the linear white patch also were observed in the lower central part. Figure 2B presents a dermoscopic picture of the linear variant of dermatofibroma. For cosmetic reasons, the patient underwent total surgical excision of the lesion. Histopathology revealed distinct characteristics of dermatofibroma (Figures 3A and 3B).

Figure 1. Macroscopic view of a linear white-brown plaqueextending along the Blaschko line in the infrascapular region.

Figure 2. Dermoscopy of the elongated dermatofibroma revealed a linear scarlike structure in the upper part (A). Brownish globules within the linear white patch area also were observed in the lower central part of the lesion on dermoscopy (B).

Figure 3. Histopathology revealed dermatofibroma (A and B)(both H&E, original magnifications ×40 and ×100). A storiform pattern of spindled and bland fibroblasts and histiocytelike cells in the mid dermis and subcutaneous tissue were seen with infiltrative margins but sparing the epidermis. Spindle cells had scant cytoplasm and thin elongated nuclei with pointed ends. Nuclei almost touched each other, unlike smooth muscle lesions.

 

 

The most common features of dermatofibromas seen in polarized and nonpolarized dermoscopy are central white scarlike patches, brown globulelike structures, vascular structures, and a peripheral fine pigmented network.2 Kilinc Karaarslan et al3 described atypical dermatofibromas with linear irregular crypts, which were seen in 26.9% of all studied cases. These irregular crypts were mainly medium in size (10 lesions), with only 2 lesions being tiny and regularly distributed. Only one lesion had atypical clinical and dermoscopic features occurring as an atrophic plaque with multiple small scarlike areas and peripherally distributed pigment network.3 Based on this typology, we believe our patient represents a case of elongated dermatofibroma that could be an atrophic variant of dermatofibroma. This form would not appear as a small scarlike area with pigment network in a somewhat patchy distribution3 but as a scarlike linear chord with a bipolar pigment network. Zaballos et al1 described 10 dermoscopic patterns of dermatofibroma (N=412); the most common was a central white patch and peripheral pigment network in approximately 35% of cases. A white scarlike patch was observed in 57.0% of dermat-ofibromas in 4 variants: (1) a solitary structure located in the center; (2) multiple white scarlike patches; (3) white scarlike patch extending throughout the lesion or irregularly distributed; and (4) white network (central, total, or irregular).1 Agero et al2 first described the new feature as a central white patch characterized by shiny white streaks. The most frequent dermoscopic pattern associated with dermatofibromas is the central white scarlike patch and peripheral delicate pigment network.1,4 Arpaia et al4 observed that dermoscopic patterns may correspond to distinct sequential stages of the formation of dermatofibroma. The linear character we described may be related to a variant of scarring keloid dermatofibroma.5

To the Editor:

Dermatofibroma is a common cutaneous lesion that most frequently affects young or middle-aged adults, especially women.1 Clinically, it appears as a firm, pink or brown nodule. It may be painful or show a tendency for scarring. The pathognomonic feature of dermatofibroma, regarded as a fibrohistiocytic tumor, is the so-called button sign caused by skin depression following pressure. We present a unique case of elongated dermatofibroma with a linear, white, scarlike patch with a brownish pigmented network and globules.

A 40-year-old woman presented with a linear elongated lesion localized to the right side of the infrascapular region of 10 years’ duration. The lesion initially was a small brownish plaque. There was no history of trauma or scratching. Over the next 10 years, the lesion slowly progressed, finally becoming a linear, atrophic, brownish plaque that was 2.5-cm long (Figure 1). The button sign was positive. On dermoscopy the central, elongated, white patch was visualized not as a typical round patch but as a scarlike white line (Figure 2A) surrounded by a brownish network that was especially pronounced in the distal parts of the lesion. In the upper part of the lesion, multiple marginally disseminated, dark brown dots were present. Brownish globules within the linear white patch also were observed in the lower central part. Figure 2B presents a dermoscopic picture of the linear variant of dermatofibroma. For cosmetic reasons, the patient underwent total surgical excision of the lesion. Histopathology revealed distinct characteristics of dermatofibroma (Figures 3A and 3B).

Figure 1. Macroscopic view of a linear white-brown plaqueextending along the Blaschko line in the infrascapular region.

Figure 2. Dermoscopy of the elongated dermatofibroma revealed a linear scarlike structure in the upper part (A). Brownish globules within the linear white patch area also were observed in the lower central part of the lesion on dermoscopy (B).

Figure 3. Histopathology revealed dermatofibroma (A and B)(both H&E, original magnifications ×40 and ×100). A storiform pattern of spindled and bland fibroblasts and histiocytelike cells in the mid dermis and subcutaneous tissue were seen with infiltrative margins but sparing the epidermis. Spindle cells had scant cytoplasm and thin elongated nuclei with pointed ends. Nuclei almost touched each other, unlike smooth muscle lesions.

 

 

The most common features of dermatofibromas seen in polarized and nonpolarized dermoscopy are central white scarlike patches, brown globulelike structures, vascular structures, and a peripheral fine pigmented network.2 Kilinc Karaarslan et al3 described atypical dermatofibromas with linear irregular crypts, which were seen in 26.9% of all studied cases. These irregular crypts were mainly medium in size (10 lesions), with only 2 lesions being tiny and regularly distributed. Only one lesion had atypical clinical and dermoscopic features occurring as an atrophic plaque with multiple small scarlike areas and peripherally distributed pigment network.3 Based on this typology, we believe our patient represents a case of elongated dermatofibroma that could be an atrophic variant of dermatofibroma. This form would not appear as a small scarlike area with pigment network in a somewhat patchy distribution3 but as a scarlike linear chord with a bipolar pigment network. Zaballos et al1 described 10 dermoscopic patterns of dermatofibroma (N=412); the most common was a central white patch and peripheral pigment network in approximately 35% of cases. A white scarlike patch was observed in 57.0% of dermat-ofibromas in 4 variants: (1) a solitary structure located in the center; (2) multiple white scarlike patches; (3) white scarlike patch extending throughout the lesion or irregularly distributed; and (4) white network (central, total, or irregular).1 Agero et al2 first described the new feature as a central white patch characterized by shiny white streaks. The most frequent dermoscopic pattern associated with dermatofibromas is the central white scarlike patch and peripheral delicate pigment network.1,4 Arpaia et al4 observed that dermoscopic patterns may correspond to distinct sequential stages of the formation of dermatofibroma. The linear character we described may be related to a variant of scarring keloid dermatofibroma.5

References
  1. Zaballos P, Puig S, Llambrich A, et al. Dermoscopy of dermatofibromas: a prospective morphological study of 412 cases. Arch Dermatol. 2008;144:75-83.
  2. Agero AL, Taliercio S, Dusza SW, et al. Conventional and polarized dermoscopy features of dermatofibroma. Arch Dermatol. 2006;142:1431-1437.
  3. Kilinc Karaarslan I, Gencoglan G, Akalin T, et al. Different dermoscopic faces of dermatofibromas. J Am Acad Dermatol. 2007;57:401-406.
  4. Arpaia N, Cassano N, Vena GA. Dermoscopic patterns of dermatofibroma. Dermatol Surg. 2005;31:1336-1339.
  5. Kuo TT, Hu S, Chan HL. Keloidal dermatofibroma: report of 10 cases of a new variant. Am J Surg Pathol. 1998;22:564-568.
Issue
Cutis - 97(5)
Page Number
E1-E3
Display Headline
The Elongated Dermatofibroma: A New Dermoscopic Variant?
Legacy Keywords
dermatofibroma, linear variant, dermoscopy

Practice Points

  • The most common features of dermatofibromas are white scarlike patches, brown globulelike structures, vascular structures, and a peripheral fine pigmented network.
  • Dermoscopy may be used in the diagnostic workup of pigmented nonmelanocytic lesions. 

Caring for Children With Seizures Who Use Cannabinoids

Article Type
Changed
Thu, 12/15/2022 - 16:02
Display Headline
Caring for Children With Seizures Who Use Cannabinoids

As Colorado was among the first states to allow the medical use of marijuana, neurologists there have experience treating children with seizures who use cannabinoids. Their findings and recommendations regarding parent perceptions, administrative policies, and clinical practice may be useful to pediatric neurologists in other states.

At Marijuana and Cannabinoids: A Neuroscience Research Summit, convened by the NIH, Amy Brooks-Kayal, MD, Chief of Pediatric Neurology at the University of Colorado School of Medicine and Children’s Hospital Colorado in Aurora, described her facility’s experiences caring for this patient group.

Amy Brooks-Kayal, MD

Colorado has allowed the medical use of marijuana since November 2000, while other states more recently have legalized its use. Of the 107,798 patients in Colorado who hold a card that permits medical marijuana use, 349, or about 0.3%, are minors.

Seizures are a relatively rare reason for medical marijuana use. Dr. Brooks-Kayal said that she is not aware of any neurologists or pediatricians who prescribe cannabinoids for pediatric seizures. Any physician in Colorado who has a relationship with a patient can issue a card permitting marijuana use, and two physicians are needed to issue cards to minors.

To examine the use of medical marijuana in children with seizure disorders in Colorado, Craig Press, MD, PhD, then a pediatric neurology resident at Children’s Hospital Colorado, and his coauthors conducted an observational study of 75 pediatric patients with seizures who used medical marijuana. The study was published in the April 2015 issue of Epilepsy & Behavior. “We had no ability to determine what was in the substances given, other than parental report,” Dr. Brooks-Kayal said.

Parents’ Perception of Response

Overall, 33% of parents reported a greater than 50% reduction in seizures; this group was judged to be responders, with no significant difference in response rate by seizure type. A variety of cannabis products were used, including cannabidiol alone and cannabidiol with other oral cannabis extracts (OCEs). All produced similar response rates.

However, only 30 patients had pre- and post-cannabis EEGs. Of this group, none of the cannabis responders had an improvement in their EEGs after cannabis use, whereas three of the nonresponders showed EEG improvement. “The most interesting finding that we saw was that the response rate dramatically varied depending on whether the families had moved out of state,” Dr. Brooks-Kayal said. Families who had moved to Colorado from another state for treatment were three times more likely to report response to OCEs, compared with those families who were from Colorado (47% vs 22%; odds ratio, 3.16).

This result, she said, raised the possibility that “the degree of investment that the family had made in getting this therapy might be impacting the parents’ perception of response.”
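The reported odds ratio can be roughly sanity-checked from the two response rates. A minimal Python sketch, using the rounded percentages quoted above (the published value of 3.16 was presumably computed from the raw patient counts, so the reconstruction is approximate):

```python
def odds(p):
    """Convert a proportion to odds: p / (1 - p)."""
    return p / (1.0 - p)

# Rounded responder rates reported in the study
p_movers = 0.47      # families who moved to Colorado for treatment
p_colorado = 0.22    # families already living in Colorado

odds_ratio = odds(p_movers) / odds(p_colorado)
print(round(odds_ratio, 2))  # ~3.14, consistent with the reported 3.16
```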

Navigating State and Federal Policies

Since state and federal policies vary, it’s hard to know what to do when a family comes to you asking about cannabis for pediatric seizure control, Dr. Brooks-Kayal said.

She therefore outlined Children’s Hospital Colorado’s approach. There, “providers do not recommend use of cannabinoids for treatment of epilepsy outside of a clinical trial,” she said.

However, families are provided with the most current information about cannabinoids. This includes being frank about the current lack of evidence regarding efficacy and safety, as well as unknowns around dosing and drug interactions. She said providers also share concerns about what’s in artisanal marijuana products, since purity and consistency of content aren’t regulated.

It’s critical for families to feel comfortable disclosing whether their children with seizures are using cannabinoids, so providers can help track safety and efficacy. Disclosure may be more likely if you reinforce that you won’t stop caring for these children if they are on cannabinoids, Dr. Brooks-Kayal said. “We strongly encourage disclosure,” and it’s a standard part of intake at every appointment to ask about cannabinoids, she said.

When cannabinoids are being used, Dr. Brooks-Kayal recommends obtaining the following tests at baseline and monthly thereafter: complete blood count, liver function tests, basic metabolic panel, and trough antiseizure medication levels. Clobazam, N-desmethylclobazam, and valproic acid levels have all been seen to change with concomitant cannabinoid use, she said.

“We ask families not to change other medications,” Dr. Brooks-Kayal said. Her practice frequently sees status epilepticus when other medications are stopped and cannabinoids started, she said. “That is a huge risk.”

Tracking Efficacy

To help families and providers track efficacy when cannabinoids are used, Dr. Brooks-Kayal asks families to keep a seizure diary. She obtains a baseline EEG and another EEG about three months later. Since the EEG should capture seizure frequency, the length of the EEG is tailored to the patient’s seizure frequency. Dr. Brooks-Kayal often obtains 24-hour EEGs for her patients.

If it’s appropriate, families can enroll their children in an observational research study. Families can also consider participating in pharmaceutical double-blind, placebo-controlled trials. Other practical tips include standardizing the way neurologists care for children who use cannabinoids in their practice, and working in advance with hospital administrators and the inpatient pharmacy to address the use of these products for inpatients.

A 2014 Cochrane review concluded that “no reliable conclusions can be drawn at present regarding the efficacy of cannabinoids as a treatment for epilepsy,” Dr. Brooks-Kayal said. A systematic review by the American Academy of Neurology reached the same conclusion. The American Epilepsy Society, the American Academy of Pediatrics, and the American Medical Association do not recommend routine clinical use of cannabinoids for seizures, but call for additional research. “We need better data,” Dr. Brooks-Kayal said.

Kari Oakes

References

Suggested Reading
Press CA, Knupp KG, Chapman KE. Parental reporting of response to oral cannabis extracts for treatment of refractory epilepsy. Epilepsy Behav. 2015;45:49-52.
Gloss D, Vickrey B. Cannabinoids for epilepsy. Cochrane Database Syst Rev. 2014;3:CD009270.

Issue
Neurology Reviews - 24(5)
Page Number
31
Legacy Keywords
cannabinoids, seizures, medical marijuana, children, pediatric neurology, epilepsy, Amy Brooks-Kayal

Point/Counterpoint: Are we too quick to treat May-Thurner syndrome?

Article Type
Changed
Tue, 12/13/2016 - 10:27
Display Headline
Point/Counterpoint: Are we too quick to treat May-Thurner syndrome?

YES: New tech promotes treatment where none is needed.

BY SAMUEL P. MARTIN, MD

As science and technology continue to advance, we have the ability to treat more and more conditions with less invasive, better-tolerated procedures. In the realm of vascular disease, this has been evidenced by a veritable explosion in the endovascular treatment of arterial disease. With new technology, we have witnessed a tremendous relaxation of former standards in the pursuit of “quality of life.” Our new hammer is ever searching for a nail, resulting in the treatment of “anatomical” disease, as seen in endovascular stenting of renal artery stenosis.

Nowhere is this trend becoming more evident than in the treatment of May-Thurner anatomy.

Dr. Samuel P. Martin

Despite years of awareness, there is neither an accepted radiologic definition for May-Thurner syndrome, nor established diagnostic criteria. Fortunately, our ability to image has improved from biplanar venography, formerly the gold standard.

Because May-Thurner is a permanent process, the luminal diameter of the iliac vein should not change with patient positioning. Now, with the recent development of blood pool imaging using contrast agents such as gadofosveset trisodium, magnetic resonance venography (MRV) studies can be performed in the supine and prone positions on a single dose of contrast. This would seem to obviate the former limitations of biplanar venography, contrast CT, and traditional MRV, and would appear to provide an objective means of evaluating May-Thurner anatomy. However, in evaluations of patients with lower-limb venous disorders, left common iliac vein compression was found in 14%-32% of patients but May-Thurner syndrome in only 2%-5%, leading to the conclusion that left common iliac vein compression is necessary but not sufficient to cause the syndrome.

Thus, the point to be made: May-Thurner anatomy does not equal May-Thurner syndrome (Diagn Interv Radiol. 2013 Jan-Feb;19[1]:44-8).

Sadly, at the present time, there are no clear-cut guidelines.

With the advent of intravascular ultrasound (IVUS), we are seeing a large number of patients with the suspect anatomy undergoing treatment with balloon angioplasty and stents in the iliac system before adequate treatment of chronic venous insufficiency (CVI) in the extremities. What are the consequences? We have no data on primary or secondary patency of these stents (usually Wallstents). How often is anticoagulation necessary, and is it permanent? I hate to suggest an industry or monetary motivation, but we are even seeing advertising that promotes iliac angioplasty and stenting for May-Thurner syndrome in people who have already had treatment of their CVI (often with little or no swelling and minimal pain). We also have seen patients who have undergone the procedure and had to have secondary procedures and long-term anticoagulation. Worse, they never had the procedure adequately explained, including potential complications or the possibility of future problems, procedures, or permanent anticoagulation.

So, as we face a situation – May-Thurner anatomy – which exists in more than 20% of our population, it raises several questions that need to be answered as we marshal our ever-increasing health care expenditures. Can we clearly define indications for further investigation and possible intervention, realizing that the syndrome of increased pain, swelling, and risk of thrombosis only exists in 2%-3% of those with the anatomy?

As McDermott and associates have shown in gated MRV studies, conditions such as hydration and especially position can significantly affect anatomical findings. My feeling, based on 30-plus years of experience, is that treatment of the leg should take precedence, and only after this avenue has been exhausted should one progress to suprainguinal investigation, unless there is swelling of the entire leg. What are the long-term consequences of a Wallstent in the venous system, and are we “correcting” one risk by supplanting it with another – the long-term risk of stent thrombosis and subsequent interventions with long-term anticoagulation? There have been no reported cases of pulmonary emboli with May-Thurner, and it is thought that the “spur” (synechiae) has some protective properties. In contrast, a stent poses a definite theoretical risk of thrombosis, and even embolization.

Dr. Samuel P. Martin is a vascular surgeon in Orlando.

NO: Or rather, ‘maybe,’ by unethical practitioners.

BY ENRICO ASCHER, MD

Significant ipsilateral iliac vein stenosis or occlusion may have continued untoward effects in symptomatic patients, particularly those with advanced venous stasis changes, including venous ulcerations, skin discoloration, edema, and/or pain (CEAP class 3-6). Conversely, successful iliac vein stenting (IVS) has been shown to normalize venous outflow, enhance calf vein muscle pump function, improve venous claudication, decrease pain, ameliorate edema, and accelerate wound healing.

Additionally, IVS can be safely performed in an ambulatory/office setting under local anesthesia with minimal or no sedation. The technical success can exceed 95% and long-term patency rates are excellent. Indeed, IVS is much cheaper and more durable than arterial stenting for claudication.

Dr. Enrico Ascher

These advantages cannot and should not be used as an alternative to conservative therapy, which includes mild exercise, regular use of appropriately measured elastic stockings, and intermittent leg elevation whenever feasible. Moreover, venous ulcers should be treated with compressive bandages placed by well-trained providers. If all else fails, then one should consider the minimally invasive procedures available to treat this debilitating, progressive disease. Unfortunately, the conservative approach fails in a substantial number of patients.

It is possible that Dr. Martin is correct regarding advertisements for IVS in the presence of minimal symptoms. There is little one can do about this misleading information.

However, the physician who knowingly implants these stents in patients with no potential benefits or in those who did not have the risks, benefits, and alternatives explained should not be allowed to continue this practice. No longer can one remain silent when confronted with such horrendous unprofessional behavior.

Maybe the SVS should create a hotline that can be used anonymously to identify potential abusers who fraudulently expose their patients to potential harm. A letter from the SVS would then be sent to the “guilty” party as an alert. Of course, such a suggestion needs to be vetted by expert lawyers prior to implementation. It is only a suggestion. Others should come up with more suggestions to stop or minimize these unlawful practices.

I, too, have heard gossip and more gossip about this or that practitioner performing unnecessary procedures. These have included arterial and venous interventions. They were infrainguinal, suprainguinal, or both. Some were stents, some were vein ablations. Is an unnecessary IVS worse than an unnecessary great saphenous vein ablation? What if the patient is a candidate for multiple coronary bypasses and has only one good great saphenous vein? What if the patient needs a limb salvage vein bypass operation as the only solution to maintain limb viability? If someone put a gun to my head and asked me to choose between two unnecessary procedures, I might well opt for the IVS. I am a member of the Save the GSV club founded by Dr. Samson. One can argue that the ablated vein is gone forever; the stent may be salvaged if it occludes. All unnecessary procedures are just unnecessary.

I believe that Dr. Martin makes a point to exhaust all infrainguinal options prior to IVS. In fact, he does not advocate IVS at all in any circumstance. I respect his 3 decades of clinical experience, coupled with the fact that iliac vein narrowing is a fairly common finding in the general population. Nevertheless, the literature is filling with large and small series of patients highlighting the importance of IVS as a tool in our armamentarium against this chronic, debilitating disease that affects an important segment of the working population in this country and abroad. Although a small, prospective, randomized study from Brazil published in the Journal of Vascular Surgery showed the value of IVS in patients with advanced venous stasis (J Vasc Surg Venous Lymphat Disord. 2015;3:117-8), a larger one involving multiple centers will provide many needed answers.

Dr. Ascher is chief of vascular and endovascular surgery, NYU Lutheran Medical Center.

References

Author and Disclosure Information

Publications
Sections
Author and Disclosure Information

Author and Disclosure Information

YES: New tech promotes treatment where none is needed.

BY SAMUEL P. MARTIN, MD

As science and technology continue to advance, we have the ability to treat more and more conditions with less invasive, better-tolerated procedures. In the realm of vascular disease, this has been evidenced by a variable explosion in the endovascular treatment of arterial disease. With new technology, we have witnessed a tremendous relaxation of former standards in the pursuit of “quality of life.” Our new hammer is ever searching for a nail, resulting in the treatment of “anatomical” disease, such as seen in endovascular stenting of renal artery stenosis.

Nowhere is this trend becoming more evident than in the treatment of May-Thurner anatomy.

Dr. Samuel P. Martin

Despite years of awareness, there is neither an accepted radiologic definition for May-Thurner syndrome nor established diagnostic criteria. Fortunately, our ability to image has improved from biplanar venography, formerly the gold standard.

Because May-Thurner is a permanent process, the luminal diameter of the iliac vein should not change with patient positioning. Now, with the recent development of blood pool imaging using contrast agents such as gadofosveset trisodium, magnetic resonance venography (MRV) studies can be performed in the supine and prone positions on a single dose of contrast. This would seem to obviate the former limitations of biplanar venography, contrast CT, and traditional MRV, and would appear to provide an objective means of evaluating May-Thurner anatomy. However, in evaluations of patients with lower-limb venous disorders, left common iliac vein compression was found in 14%-32% of patients, but May-Thurner syndrome in only 2%-5%, leading to the conclusion that left common iliac vein compression is necessary but not sufficient to cause the syndrome.

Thus, the point to be made: May-Thurner anatomy does not equal May-Thurner syndrome (Diagn Interv Radiol. 2013 Jan-Feb;19[1]:44-8).

Sadly, at the present time, there are no clear-cut guidelines.

With the advent of intravascular ultrasound (IVUS), we are seeing a large number of patients with the suspect anatomy undergoing treatment with balloon angioplasty and stents in the iliac system before adequate treatment of chronic venous insufficiency (CVI) in the extremities. What are the consequences? We have no data on primary or secondary patency of these stents (usually Wallstents). How often is anticoagulation necessary, and is it permanent? I hate to suggest an industry or monetary motivation, but we are even seeing advertising for angioplasty and stenting of May-Thurner syndrome aimed at people who have already had treatment of their CVI (often with little or no swelling and minimal pain). We also have seen patients who have undergone the procedure and had to have secondary procedures and long-term anticoagulation. Worse, they never had the procedure adequately explained, including potential complications or the possibility of future problems, procedures, or permanent anticoagulation.

So we face a situation – May-Thurner anatomy – that exists in more than 20% of our population, and it raises several questions that need to be answered as we marshal our ever-increasing health care expenditures. Can we clearly define indications for further investigation and possible intervention, realizing that the syndrome of increased pain, swelling, and risk of thrombosis exists in only 2%-3% of those with the anatomy?

As McDermott and associates have shown in gated MRV studies, conditions such as hydration and especially position can significantly affect anatomical findings. My feeling, based on 30-plus years of experience, is that treatment of the leg should take precedence, and only after this avenue has been exhausted should one progress to suprainguinal investigation, unless there is swelling of the entire leg. What are the long-term consequences of a Wallstent in the venous system, and are we “correcting” one risk by supplanting it with another – the long-term risk of stent thrombosis and subsequent interventions with long-term anticoagulation? There have been no reported cases of pulmonary emboli with May-Thurner, and it is thought that the “spur” (synechiae) has some protective properties. In contrast, a stent poses a real, if theoretical, risk of thrombosis and even embolization.

Dr. Samuel P. Martin is a vascular surgeon in Orlando.

NO: Or rather, ‘maybe,’ by unethical practitioners.

BY ENRICO ASCHER, MD

Significant ipsilateral iliac vein stenosis or occlusion may have continued untoward effects in symptomatic patients, particularly those with advanced venous stasis changes including venous ulcerations, skin discoloration, edema, and/or pain (CEAP class 3-6). Conversely, successful iliac vein stenting (IVS) has been shown to normalize venous outflow, enhance calf muscle pump function, improve venous claudication, decrease pain, ameliorate edema, and accelerate wound healing.

Additionally, IVS can be safely performed in an ambulatory/office setting under local anesthesia with minimal or no sedation. Technical success rates can exceed 95%, and long-term patency rates are excellent. Indeed, IVS is much cheaper and more durable than arterial stenting for claudication.

Dr. Enrico Ascher

These advantages cannot and should not make IVS an alternative to conservative therapy, which includes mild exercise, regular use of appropriately measured elastic stockings, and intermittent leg elevation whenever feasible. Moreover, venous ulcers should be treated with compressive bandages placed by well-trained providers. Only if all else fails should one consider the minimally invasive procedures available to treat this debilitating, progressive disease. Unfortunately, the conservative approach fails in a substantial number of patients.

It is possible that Dr. Martin is correct regarding advertisements for IVS in the presence of minimal symptoms. There is little one can do about this misleading information.

However, the physician who knowingly implants these stents in patients with no potential benefits or in those who did not have the risks, benefits, and alternatives explained should not be allowed to continue this practice. No longer can one remain silent when confronted with such horrendous unprofessional behavior.

Maybe the SVS should create a hotline that anonymous complainants could use to identify potential abusers whose fraudulent practices expose their patients to harm. A letter from the SVS would then be sent to the “guilty” party as an alert. Of course, such a suggestion needs to be vetted by expert lawyers prior to implementation. It is only a suggestion; others should come up with more ideas to stop or minimize these unlawful practices.

I, too, have heard gossip and more gossip about this or that practitioner performing unnecessary procedures. These have included arterial and venous interventions; they were infrainguinal, suprainguinal, or both, and some were stents, some vein ablations. Is an unnecessary IVS worse than an unnecessary great saphenous vein ablation? What if the patient is a candidate for multiple coronary bypasses and has only one good great saphenous vein? What if the patient needs a limb salvage vein bypass operation as the only solution to maintain limb viability? If someone puts a gun to my head and asks me to choose between two unnecessary procedures, I may well opt for the IVS. I am a member of the Save the GSV club founded by Dr. Samson. One can argue that the ablated vein is gone forever, whereas the stent may be salvaged if it occludes. Still, all unnecessary procedures are just that: unnecessary.

I believe that Dr. Martin makes a fair point in exhausting all infrainguinal options prior to IVS; in fact, he does not advocate IVS in any circumstance. I respect his 3 decades of clinical experience, coupled with the fact that iliac vein narrowing is a fairly common finding in the general population. Nevertheless, the literature is filling with large and small series of patients highlighting IVS as an important tool in our armamentarium against this chronic, debilitating disease, which affects an important segment of the working population in this country and abroad. A small, prospective, randomized study from Brazil published in the Journal of Vascular Surgery showed the value of IVS in patients with advanced venous stasis (J Vasc Surg Venous Lymphat Disord. 2015;3:117-8), but a larger study involving multiple centers will provide many needed answers.

Dr. Ascher is chief of vascular and endovascular surgery, NYU Lutheran Medical Center.


New onset of tics


A tic is described by the DSM-5 as a sudden, rapid, recurrent, nonrhythmic movement or vocalization. Tics are a common occurrence in childhood and can range from mild to severe, transient to chronic, simple to complex. It is not uncommon for parents to ask pediatric care providers when and how to manage tics in children. Here, we present a case to illustrate just such an issue.

Case summary

Adam is an 8-year-old with a previous diagnosis of attention-deficit/hyperactivity disorder (ADHD) who is being seen for follow-up after being started on a stimulant 3 months ago because of declining performance in school and at home, despite adequate accommodations, parent education, and nonpharmacologic treatments. He has done well on a small dose of methylphenidate (0.5 mg/kg per day), but when asked about other symptoms, his mother, Mary, mentions that she has noticed Adam frequently clearing his throat. This began about 6 weeks ago, after he experienced allergic rhinitis for almost a week. Since that time, Mary has noticed that he clears his throat as frequently as once every 5 minutes.

Dr. Robert R. Althoff

The behavior was reported to occur in the classroom, but not nearly with the frequency experienced at home. If asked not to clear his throat, Adam can suppress it. None of his classmates have said anything or appear to have noticed. His parents have never noticed any tics previously. There is a family history of ADHD in his father. There is no other family history of neurodevelopmental disorders, including no obsessive-compulsive disorder (OCD), Tourette’s disorder, or other chronic tic disorders. There is nothing else of concern on physical or mental status examination. His mother has concerns that the stimulant medication may be inducing a tic and wonders about stopping it.

Case discussion

Adam has a mild, simple vocal tic. The vast majority of tics that develop in childhood will not last the 1 year required to make the diagnosis of a persistent (chronic) motor or vocal tic disorder, nor will both motor and vocal tics be present over the 1 year required to make the diagnosis of Tourette’s disorder. In the DSM-IV, tics lasting less than 1 year would have been given the diagnosis of transient tic disorder.

In the DSM-5, the diagnosis is now provisional tic disorder because there is no way to tell which tics will be transient and which will be persistent or chronic. Chronic tics occur with a prevalence of between 0.5% and 3%,1 with a male predominance, and are more common in children with ADHD and OCD. In addition, children with chronic tic disorders often have a higher incidence of learning problems and, perhaps, autism spectrum disorders. Simple motor and vocal tics (those involving a single muscle group) are more common than complex tics, in which coordinated movements are made. Despite the portrayal in the popular media, it is particularly rare to have complex tics that include copropraxia (an obscene gesture), coprolalia (an obscene utterance), echolalia (repeating another’s words), or echopraxia (repeating another’s actions).

Tics tend to have their onset in early school age, with the highest prevalence and severity between the ages of 9 and 12 years.2 When present, tics tend to be somewhat suppressed when the child is in school or when the child is engaged in a task. Furthermore, most tics, even when chronic, do not lead to impairment. When impairment does occur, it is often the result of social problems from teasing by peers. Most tics wax and wane over time, but eventually resolve without intervention.

In Adam’s case, there is no clear reason to begin treatment immediately. If one wanted to follow his tics, several parent- and clinician-rated measures are available. History taking should ensure that there are no other predisposing causes and no other psychiatric comorbidities. Induction of tics by the initiation of a stimulant might be considered, although recent data suggest that stimulants are less likely to induce or worsen tics in the course of treatment for ADHD than previously thought.3,4 If concerned, however, an alternative ADHD treatment such as an alpha-2 agonist could be considered. Education could be provided to the parents regarding the likelihood of resolution. Should the tics worsen in severity and/or become chronic, several behavioral interventions, including habit reversal training and the Comprehensive Behavioral Intervention for Tics, could be considered as first line.

Medications could be considered if the tics are moderate to severe and behavioral interventions are not sufficient to reduce impairment. The only Food and Drug Administration–approved agents are haloperidol and pimozide, although there is ample support for other agents, and practitioners are most likely to use alternatives, given the side-effect profiles of these typical antipsychotics. Co-occurring symptoms should be considered when thinking about medication. Alpha-2 agonists appear to be most effective in the context of ADHD, while second-generation antipsychotics appear to be more useful if OCD is comorbid. In general, though, in cases like Adam’s, taking a watchful-waiting approach will most often lead to symptom resolution.

References

1. Eur Child Adolesc Psychiatry. 2012 Jan;21(1):5-13.

2. J Am Acad Child Adolesc Psychiatry. 2013 Dec;52(12):1341-59.

3. J Am Acad Child Adolesc Psychiatry. 2015 Sep;54(9):728-36.

4. Cochrane Database Syst Rev. 2011 Apr 13;(4):CD007990.

Dr. Althoff is associate professor of psychiatry, psychology, and pediatrics at the University of Vermont, Burlington. He is director of the division of behavioral genetics and conducts research on the development of self-regulation in children. Email him at [email protected].

 

Publications
Topics
Sections

A tic is described by the DSM-5 as a sudden, rapid, recurrent, nonrhythmic movement or vocalization. Tics are a common occurrence in childhood and can range from mild to severe, transient to chronic, simple to complex. It is not uncommon for parents to ask pediatric care providers when and how to manage tics in children. Here, we present a case to illustrate just such an issue.

Case summary

Adam is an 8-year-old with a previous diagnosis of attention-deficit/hyperactivity disorder (ADHD) who is being seen for follow-up after being started on a stimulant 3 months ago because of declining performance in school and at home, despite adequate accommodations, parent education, and nonpharmacologic treatments. He has done well on a small dose of methylphenidate (0.5 mg/kg per day), but in the context of being asked about other symptoms, his mother, Mary, mentions that she has noticed that Adam is frequently clearing his throat. This began about 6 weeks ago after experiencing allergic rhinitis for almost a week. Since that time, Mary has noticed that he clears his throat as frequently as once every 5 minutes.

 

Dr. Robert R. Althoff

The behavior was reported to occur in the classroom, but not nearly with the frequency experienced at home. If asked to not clear his throat, Adam can suppress it. None of his classmates have said anything or appear to have noticed. His parents have never noticed any tics previously. There is a family history of ADHD in his father. There is no other family history of neurodevelopmental disorders, including no obsessive compulsive disorder (OCD), Tourette’s disorder, or other chronic tic disorders. There is nothing else of concern on physical or mental status examination. His mother has concerns that the stimulant medication may be inducing a tic and wonders about stopping it.

Case discussion

Adam has a mild simple vocal tic. The vast majority of tics that develop in childhood will not last the requisite 1 year required to make the diagnosis of a persistent (chronic) motor or vocal tic, nor will they occur with both vocal and motor tics over 1 year required to make the diagnosis of Tourette’s disorder. In the DSM-IV, tics lasting less than 1 year would have been given the diagnosis of transient tic disorder.

In the DSM-5, the diagnosis is now provisional tic disorder because there is no way to tell which tics will be transient and which will be persistent or chronic. Chronic tics occur with a prevalence of between 0.5% and 3%1, with a male predominance, and are more common in children with ADHD and OCD. In addition, children with chronic tic disorders often have higher incidence of learning problems and, perhaps, autism spectrum disorders. Simple motor and vocal tics (those involving a single muscle group) are more common than complex tics, in which coordinated movements are made. Despite the portrayal in the popular media, it is particularly rare to have complex tics that include copropraxia (an obscene gesture), coprolalia (an obscene movement), echolalia (repeating another’s words), or echopraxia (repeating another’s actions).

Tics tend to have their onset in early school age, with the highest prevalence and severity between the ages of 9 and 12 years.2 When present, tics tend to be somewhat suppressed when the child is in school or when the child is engaged in a task. Furthermore, most tics, even when chronic, do not lead to impairment. When impairment does occur, it is often the result of social problems from teasing by peers. Most tics wax and wane over time, but eventually resolve without intervention.

In the case of Adam, there is no clear reason to begin to treat immediately. If one wanted to follow his tics, there are several parent and clinic measures that are available. Taking a history of his case would include ensuring that there are no other predisposing causes and no other psychiatric comorbidities. Induction of tics by the initiation of a stimulant might be considered, although recent data suggest that stimulants are less likely to induce or worsen tics in the course of treatment for ADHD than previously thought.3,4 If concerned, however, alternative ADHD treatment such as alpha-2 agonist treatment could be considered. Education could be provided to the parents regarding the likelihood of resolution. Should the tics worsen in severity and/or become chronic, there are several behavioral interventions, including habit reversal training and the Comprehensive Behavioral Intervention for Tics, which could be considered as first line.

Medications could be considered if the tics are moderate to severe and behavioral interventions are not sufficient to reduce impairment. The only Food and Drug Administration–approved agents are haloperidol and pimozide, although there is ample support for other agents, and practitioners are most likely to use alternatives, given the side-effect profiles of these typical antipsychotics. Co-occurring symptoms should be considered when thinking about medication. Alpha-2 agonists appear to be most effective in the context of ADHD, while second-generation antipsychotics appear to be more useful if OCD is comorbid. In general, though, in cases like Adam’s, taking a watchful-waiting approach will most often lead to symptom resolution.

 

 

References

1. Eur Child Adolesc Psychiatry. 2012 Jan;21(1):5-13.

2. J Am Acad Child Adolesc Psychiatry. 2013 Dec;52(12):1341-59.

3. J Am Acad Child Adolesc Psychiatry. 2015 Sep;54(9):728-36.

4. Cochrane Database Syst Rev. 2011 Apr 13;(4):CD007990.

Dr. Althoff is associate professor of psychiatry, psychology, and pediatrics at the University of Vermont, Burlington. He is director of the division of behavioral genetics and conducts research on the development of self-regulation in children. Email him at [email protected].

 

A tic is described by the DSM-5 as a sudden, rapid, recurrent, nonrhythmic movement or vocalization. Tics are a common occurrence in childhood and can range from mild to severe, transient to chronic, simple to complex. It is not uncommon for parents to ask pediatric care providers when and how to manage tics in children. Here, we present a case to illustrate just such an issue.

Case summary

Adam is an 8-year-old with a previous diagnosis of attention-deficit/hyperactivity disorder (ADHD) who is being seen for follow-up after being started on a stimulant 3 months ago because of declining performance in school and at home, despite adequate accommodations, parent education, and nonpharmacologic treatments. He has done well on a small dose of methylphenidate (0.5 mg/kg per day), but in the context of being asked about other symptoms, his mother, Mary, mentions that she has noticed that Adam is frequently clearing his throat. This began about 6 weeks ago after experiencing allergic rhinitis for almost a week. Since that time, Mary has noticed that he clears his throat as frequently as once every 5 minutes.

 

Dr. Robert R. Althoff

The behavior was reported to occur in the classroom, but not nearly with the frequency experienced at home. If asked to not clear his throat, Adam can suppress it. None of his classmates have said anything or appear to have noticed. His parents have never noticed any tics previously. There is a family history of ADHD in his father. There is no other family history of neurodevelopmental disorders, including no obsessive compulsive disorder (OCD), Tourette’s disorder, or other chronic tic disorders. There is nothing else of concern on physical or mental status examination. His mother has concerns that the stimulant medication may be inducing a tic and wonders about stopping it.

Case discussion

Adam has a mild simple vocal tic. The vast majority of tics that develop in childhood will not last the requisite 1 year required to make the diagnosis of a persistent (chronic) motor or vocal tic, nor will they occur with both vocal and motor tics over 1 year required to make the diagnosis of Tourette’s disorder. In the DSM-IV, tics lasting less than 1 year would have been given the diagnosis of transient tic disorder.

In the DSM-5, the diagnosis is now provisional tic disorder because there is no way to tell which tics will be transient and which will be persistent or chronic. Chronic tics occur with a prevalence of between 0.5% and 3%1, with a male predominance, and are more common in children with ADHD and OCD. In addition, children with chronic tic disorders often have higher incidence of learning problems and, perhaps, autism spectrum disorders. Simple motor and vocal tics (those involving a single muscle group) are more common than complex tics, in which coordinated movements are made. Despite the portrayal in the popular media, it is particularly rare to have complex tics that include copropraxia (an obscene gesture), coprolalia (an obscene movement), echolalia (repeating another’s words), or echopraxia (repeating another’s actions).

Tics tend to have their onset in early school age, with the highest prevalence and severity between the ages of 9 and 12 years.2 When present, tics tend to be somewhat suppressed when the child is in school or when the child is engaged in a task. Furthermore, most tics, even when chronic, do not lead to impairment. When impairment does occur, it is often the result of social problems from teasing by peers. Most tics wax and wane over time, but eventually resolve without intervention.

In Adam’s case, there is no clear reason to begin treatment immediately. If one wanted to follow his tics, several parent-report and clinic measures are available. Taking a history would include ensuring that there are no other predisposing causes and no other psychiatric comorbidities. Induction of tics by the initiation of a stimulant might be considered, although recent data suggest that stimulants are less likely to induce or worsen tics in the course of treatment for ADHD than previously thought.3,4 If concerned, however, an alternative ADHD treatment such as an alpha-2 agonist could be considered. Education could be provided to the parents regarding the likelihood of resolution. Should the tics worsen in severity and/or become chronic, several behavioral interventions, including habit reversal training and the Comprehensive Behavioral Intervention for Tics, could be considered first line.

Medications could be considered if the tics are moderate to severe and behavioral interventions are not sufficient to reduce impairment. The only Food and Drug Administration–approved agents are haloperidol and pimozide, although there is ample support for other agents, and practitioners are most likely to use alternatives, given the side-effect profiles of these typical antipsychotics. Co-occurring symptoms should be considered when thinking about medication. Alpha-2 agonists appear to be most effective in the context of ADHD, while second-generation antipsychotics appear to be more useful if OCD is comorbid. In general, though, in cases like Adam’s, taking a watchful-waiting approach will most often lead to symptom resolution.

 

 

References

1. Eur Child Adolesc Psychiatry. 2012 Jan;21(1):5-13.

2. J Am Acad Child Adolesc Psychiatry. 2013 Dec;52(12):1341-59.

3. J Am Acad Child Adolesc Psychiatry. 2015 Sep;54(9):728-36.

4. Cochrane Database Syst Rev. 2011 Apr 13;(4):CD007990.

Dr. Althoff is associate professor of psychiatry, psychology, and pediatrics at the University of Vermont, Burlington. He is director of the division of behavioral genetics and conducts research on the development of self-regulation in children. Email him at [email protected].

 

New onset of tics

APA guideline stresses judicious antipsychotics in dementia


Antipsychotics should be used judiciously when patients with dementia develop agitation or psychosis, according to the American Psychiatric Association’s first practice guideline on this issue published May 1 in the American Journal of Psychiatry.

Most patients with dementia develop psychosis or agitation during their illness, and treatment, based on expert consensus, often has involved antipsychotics. Antipsychotics have been thought to minimize the risk of violence, reduce patient distress, improve the patient’s quality of life, and reduce the burden on caregivers. But recent clinical trial results suggest that the benefits in this patient population are small at best, while the harms – including increased mortality and accelerated cognitive decline – are significant, said Dr. Victor I. Reus, chair of the APA Practice Guideline Writing Group and his associates.

The group developed this guideline based on a systematic review of the evidence, including results of a survey of experts in the treatment of agitation or psychosis in people with dementia. Overall, the guideline’s 15 recommendations stress that these medications must be just one part of a comprehensive, patient-centered treatment plan that includes both drug and nondrug components, said Dr. Reus, also professor of psychiatry at the University of California, San Francisco, and his associates.

Among the new recommendations, the APA emphasizes that antipsychotics should be used only when agitation or psychosis are severe, dangerous, and/or cause the patient significant distress. The potential risks and benefits should be discussed with the patient (if feasible), family, and other caregivers.

If antipsychotic treatment is initiated, it should be started at low doses and titrated up to the minimum effective dose tolerated. Haloperidol should not be used as a first-line agent, and long-acting injectable antipsychotics should not be used unless indicated for a concomitant chronic psychotic disorder, Dr. Reus and his associates said (Am J Psychiatry. 2016;173:1-4. doi: 10.1176/appi.ajp.2015.173501).

If any side effects develop, the clinician should review all side effects, risks, and benefits, and discuss these with the patient, family, and caregivers, to determine whether tapering or discontinuing the drug is indicated.

Response to treatment should be assessed using a quantitative measure. If there is no clinically relevant response after a 4-week trial of an adequate dose of an antipsychotic medication, it should be tapered and withdrawn. If there is a positive response, eventual tapering of the drug should be discussed with the patient, family, and caregivers, including the potential risks of continued use of these agents.

Attempts to taper and withdraw the antipsychotic should commence within 4 months. Symptoms should be monitored at least monthly during drug tapering and for at least 4 months after treatment cessation to identify signs of recurrence of psychosis or agitation.

The full guideline is available online at http://psychiatryonline.org.


FROM THE AMERICAN JOURNAL OF PSYCHIATRY


Vitals

Key clinical point: Antipsychotics should be used only when agitation or psychosis are severe, dangerous, and/or cause significant distress for dementia patients.

Major finding: The guideline’s 15 recommendations stress that these medications must be just one part of a comprehensive, patient-centered treatment plan that includes drug and nondrug components.

Data source: A review of the evidence and compilation of a new clinical practice guideline with 15 recommendations.

Disclosures: This guideline was developed by the APA Practice Guideline Writing Group. No financial disclosures were provided.

New and Noteworthy Information—May 2016


Patients who have type 1 diabetes have an increased risk of developing epilepsy, according to a study published online ahead of print March 31 in Diabetologia. This study cohort included 2,568 patients with type 1 diabetes, each of whom was frequency-matched by sex, residence, and index year with 10 patients without type 1 diabetes. Cox proportional hazard regression analysis was conducted to estimate the effects of type 1 diabetes on epilepsy risk. After adjusting for potential confounders, the type 1 diabetes cohort was 2.84 times more likely to develop epilepsy than the control cohort. “Metabolic abnormalities of type 1 diabetes, such as hyperglycemia and hypoglycemia, may have a damaging effect on the central nervous system and be associated with significant long-term neurological sequelae,” said the authors.
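The study’s 2.84-fold estimate comes from an adjusted Cox proportional hazards model, but the simpler unadjusted analogue — an incidence rate ratio comparing events per person-year in the two cohorts — conveys the same basic arithmetic. The sketch below is illustrative only: the counts are hypothetical and are not taken from the study.

```python
import math

def rate_ratio(events_a, py_a, events_b, py_b):
    """Unadjusted incidence rate ratio (cohort A vs. cohort B)
    with a 95% confidence interval computed on the log scale."""
    ra = events_a / py_a                      # incidence rate, cohort A
    rb = events_b / py_b                      # incidence rate, cohort B
    irr = ra / rb
    se = math.sqrt(1 / events_a + 1 / events_b)   # SE of log(IRR)
    lo = math.exp(math.log(irr) - 1.96 * se)
    hi = math.exp(math.log(irr) + 1.96 * se)
    return irr, lo, hi

# Hypothetical counts, chosen only to illustrate the calculation:
irr, lo, hi = rate_ratio(events_a=60, py_a=20_000, events_b=210, py_b=200_000)
print(f"IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # → IRR = 2.86 (95% CI 2.14-3.81)
```

A full Cox model additionally adjusts for measured confounders (age, sex, comorbidities) rather than comparing crude rates; in practice that would be fit with a survival-analysis package such as Python’s lifelines or R’s survival rather than by hand.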

Antipsychotics are associated with a significantly increased mortality risk in patients with Parkinson’s disease, after adjusting for measurable confounders, according to a study published online ahead of print March 21 in JAMA Neurology. This retrospective matched-cohort study used data from the fiscal years of 1999 to 2010. The rates of 180-day mortality were compared in 7,877 patients initiating antipsychotic therapy and in 7,877 patients who did not initiate antipsychotic therapy. Antipsychotic use was associated with more than twice the hazard ratio of death, compared with nonuse. The hazard ratio was significantly higher for patients who used typical versus atypical antipsychotics. Among the atypical antipsychotics used, hazard ratios relative to nonuse of antipsychotics were, in descending order, 2.79 for olanzapine, 2.46 for risperidone, and 2.16 for quetiapine fumarate.

White matter tracts related to the regulation of sleep and wakefulness, as well as limbic, cognitive, and sensorimotor regions, are disrupted in the right hemisphere of patients with primary insomnia, according to a study published online ahead of print April 5 in Radiology. Investigators used tract-based spatial statistics to compare changes in diffusion parameters of white matter tracts from 23 patients with primary insomnia and 30 healthy controls. They evaluated how accurately these changes could distinguish patients with insomnia from healthy controls. Patients with primary insomnia had lower fractional anisotropy values, mainly in the right anterior limb of the internal capsule, the right posterior limb of the internal capsule, the right anterior corona radiata, the right superior corona radiata, the right superior longitudinal fasciculus, the body of the corpus callosum, and the right thalamus.

Among Chinese adults, a higher level of fruit consumption is associated with lower blood pressure and blood glucose levels and, largely independent of these and other factors, with significantly lower risks of major cardiovascular diseases, according to a study published April 7 in the New England Journal of Medicine. Between 2004 and 2008, researchers recruited 512,891 adults between ages 30 and 79 from 10 locations in China. In all, 5,173 deaths from cardiovascular disease, 2,551 incident major coronary events, 14,579 ischemic strokes, and 3,523 intracerebral hemorrhages were recorded among the 451,665 participants without a history of cardiovascular disease or antihypertensive treatments at baseline. The adjusted hazard ratios for daily consumption versus nonconsumption were 0.75 for ischemic stroke and 0.64 for hemorrhagic stroke.

A low level of leisure-time physical activity is independently associated with greater decline in cognitive performance over time, according to a study published online ahead of print March 23 in Neurology. Researchers assessed cognition in participants in the Northern Manhattan Study using a standard neuropsychologic examination. The neuropsychologic examination was repeated five years later and subcategorized using standardized z scores over validated domains. No or low levels of leisure-time physical activity were associated with worse executive function, semantic memory, and processing speed scores on the first neuropsychologic examination. The associations were slightly attenuated and not significant after adjusting for vascular risk factors. Cognitively unimpaired participants who reported no to low leisure-time physical activity versus moderate to high levels declined more over time in processing speed and episodic memory.

Participation in the “Sleep for Success” education program is associated with significant improvement in children’s sleep and academic performance, according to a study published in Sleep Medicine. Using a community-based participatory research approach, researchers composed a program of four modules that addressed the children, their family and community, the school staff, and decision-makers within the school setting. In all, 71 students participated in the evaluation of the program. The effectiveness of the program was evaluated using nonrandomized, controlled before-and-after study groups that were assessed at two time points. In the intervention group, true sleep was extended by 18.2 minutes per night, sleep efficiency improved by 2.3%, and sleep latency was shortened by 2.3 minutes. Report card grades also improved significantly in English and mathematics.

 

 

The combination of rosuvastatin, candesartan, and hydrochlorothiazide is associated with a significantly lower rate of cardiovascular events than dual placebo in people who do not have cardiovascular disease, according to a study published online ahead of print April 2 in the New England Journal of Medicine. Researchers randomly assigned 12,705 participants at intermediate risk who did not have cardiovascular disease to 10 mg/day of rosuvastatin or placebo, and to 16 mg/day of candesartan, plus 12.5 mg/day of hydrochlorothiazide or placebo. The decrease in the low-density lipoprotein cholesterol level was 33.7 mg/dL greater in the combined-therapy group than in the dual-placebo group, and the decrease in systolic blood pressure was 6.2 mm Hg greater with combined therapy than with dual placebo.

Right hemisphere white matter integrity is related to speech fluency measures in patients with chronic aphasia, according to a study published online ahead of print March 30 in Neurology. The study included 33 people with an average age of 58 who had a stroke on the left side of their brain. Fractional anisotropy values for the right middle temporal gyrus, precentral gyrus, and pars opercularis significantly predicted speech fluency, but fractional anisotropy values of the pars triangularis and superior parietal lobule did not. A multiple regression analysis showed that combining fractional anisotropy of the significant right hemisphere regions with the lesion load of the left arcuate fasciculus provided the best model for predicting speech fluency. Fractional anisotropy of corpus callosum fibers connecting left and right supplementary motor areas was also correlated with speech fluency.

Differences in white matter microstructure may partially account for the variance in functional outcomes among veterans with combat-related mild traumatic brain injury (mTBI), according to a study published online ahead of print March 29 in Radiology. From 2010 to 2013, an initial postdeployment evaluation, including clinical assessment and brain MRI with diffusion tensor imaging, was performed in combat veterans who sustained mTBI while deployed. Veterans who did and did not return to work were also compared for differences in clinical variables by using t and χ2 tests. After a mean follow-up of 1.4 years, 34 of 57 veterans had returned to work. Cumulative health care visits over time were inversely correlated with diffusion anisotropy of the splenium of the corpus callosum and adjacent parietal white matter.

Cognitive status and intelligibility may be associated with everyday communicative outcomes in Parkinson’s disease, according to a study published online ahead of print March 16 in the Journal of Parkinson’s Disease. Investigators searched five online databases in May 2015 and also conducted supplementary searches. In all, 4,816 records were identified through database searches, and 16 additional records were identified through supplementary searches. Forty-one articles were suitable for full-text screening, and 15 articles met the eligibility criteria. Ten studies assessed the role of cognitive status, and nine found that participants with greater cognitive impairment had greater everyday communication difficulties. Four studies assessed the role of intelligibility, and all found that participants with greater intelligibility impairment had greater everyday communication difficulties. Effects often were weak, however, and not consistent.

Common variants near FOXF2 are associated with increased stroke susceptibility, according to a study published online ahead of print April 7 in the Lancet Neurology. Researchers completed a genome-wide analysis of common genetic variants associated with incident stroke risk in 18 population-based cohorts consisting of 84,961 participants, of whom 4,348 had stroke. Investigators completed validation analyses for variants yielding a significant association with all stroke, ischemic stroke, cardioembolic ischemic stroke, or noncardioembolic ischemic stroke. Study authors replicated seven of eight known loci associated with risk for ischemic stroke and identified a novel locus at chromosome 6p25 associated with risk of all stroke. The rs12204590 stroke risk allele was also associated with increased white matter hyperintensity in adults without stroke. Young patients with segmental deletions of FOXF2 showed extensive white matter hyperintensity.

Young women with musculoskeletal pain complaints may have comorbid sleep problems that require treatment, according to a study published in the April issue of Pain. Researchers investigated the cross-sectional and longitudinal relationship between sleep problems and chronic pain, and musculoskeletal pain, headache, and abdominal pain severity in a general population of adults between ages 19 and 22. They studied whether relationships were moderated by sex and whether symptoms of anxiety and depression, fatigue, or physical inactivity mediated these effects. Follow-up data were collected in 1,753 participants. Sleep problems were associated with chronic pain, musculoskeletal pain, headache, and abdominal pain severity. They also predicted chronic pain and an increase in musculoskeletal pain severity at three years of follow-up. The effect was stronger in females than in males.

 

 

Kimberly Williams

Neurology Reviews. 24(5):6-7.


 

 

Kimberly Williams

Patients who have type 1 diabetes have an increased risk of developing epilepsy, according to a study published online ahead of print March 31 in Diabetologia. This study cohort included 2,568 patients with type 1 diabetes, each of whom was frequency-matched by sex, residence, and index year with 10 patients without type 1 diabetes. Cox proportional hazard regression analysis was conducted to estimate the effects of type 1 diabetes on epilepsy risk. After adjusting for potential confounders, the type 1 diabetes cohort was 2.84 times more likely to develop epilepsy than the control cohort. “Metabolic abnormalities of type 1 diabetes, such as hyperglycemia and hypoglycemia, may have a damaging effect on the central nervous system and be associated with significant long-term neurological sequelae,” said the authors.

Antipsychotics are associated with a significantly increased mortality risk in patients with Parkinson’s disease, after adjusting for measurable confounders, according to a study published online ahead of print March 21 in JAMA Neurology. This retrospective matched-cohort study used data from the fiscal years of 1999 to 2010. The rates of 180-day mortality were compared in 7,877 patients initiating antipsychotic therapy and in 7,877 patients who did not initiate antipsychotic therapy. Antipsychotic use was associated with more than twice the hazard of death, compared with nonuse. The hazard ratio was significantly higher for patients who used typical versus atypical antipsychotics. Among the atypical antipsychotics used, hazard ratios relative to nonuse of antipsychotics were, in descending order, 2.79 for olanzapine, 2.46 for risperidone, and 2.16 for quetiapine fumarate.
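Under the proportional-hazards assumption used in analyses like this one, a hazard ratio translates into an event probability via S1(t) = S0(t)^HR. The sketch below illustrates that relationship using the reported hazard ratios; the 10% baseline 180-day mortality is a hypothetical figure for illustration only, not a number from the study.

```python
def mortality_under_hr(baseline_mortality, hazard_ratio):
    """Translate a hazard ratio into an event probability.

    Under proportional hazards, survival scales as S1(t) = S0(t) ** HR.
    """
    baseline_survival = 1 - baseline_mortality
    return 1 - baseline_survival ** hazard_ratio

# Hypothetical 10% 180-day mortality without antipsychotics (illustrative only):
for drug, hr in [("olanzapine", 2.79), ("risperidone", 2.46), ("quetiapine", 2.16)]:
    print(drug, round(mortality_under_hr(0.10, hr), 3))
```

The point of the sketch is that the same hazard ratio implies a larger absolute risk difference when baseline mortality is higher, which is why absolute rates matter when interpreting these comparisons.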

White matter tracts related to regulation of sleep and wakefulness, and limbic cognitive and sensorimotor regions, are disrupted in the right brain in patients with primary insomnia, according to a study published online ahead of print April 5 in Radiology. Investigators used tract-based spatial statistics to compare changes in diffusion parameters of white matter tracts from 23 patients with primary insomnia and 30 healthy controls. They evaluated how accurately these changes could distinguish patients with insomnia from healthy controls. Patients with primary insomnia had lower fractional anisotropy values mainly in the right anterior limb of the internal capsule, the right posterior limb of the internal capsule, the right anterior corona radiata, the right superior corona radiata, the right superior longitudinal fasciculus, the body of the corpus callosum, and the right thalamus.

Among Chinese adults, a higher level of fruit consumption is associated with lower blood pressure and blood glucose levels and, largely independent of these and other factors, with significantly lower risks of major cardiovascular diseases, according to a study published April 7 in the New England Journal of Medicine. Between 2004 and 2008, researchers recruited 512,891 adults between ages 30 and 79 from 10 locations in China. In all, 5,173 deaths from cardiovascular disease, 2,551 incident major coronary events, 14,579 ischemic strokes, and 3,523 intracerebral hemorrhages were recorded among the 451,665 participants without a history of cardiovascular disease or antihypertensive treatments at baseline. The adjusted hazard ratios for daily consumption versus nonconsumption were 0.75 for ischemic stroke and 0.64 for hemorrhagic stroke.

A low level of leisure-time physical activity is independently associated with greater decline in cognitive performance over time, according to a study published online ahead of print March 23 in Neurology. Researchers assessed cognition in participants in the Northern Manhattan Study using a standard neuropsychologic examination. The neuropsychologic examination was repeated five years later and subcategorized using standardized z scores over validated domains. No or low levels of leisure-time physical activity were associated with worse executive function, semantic memory, and processing speed scores on the first neuropsychologic examination. The associations were slightly attenuated and not significant after adjusting for vascular risk factors. Cognitively unimpaired participants who reported no to low leisure-time physical activity versus moderate to high levels declined more over time in processing speed and episodic memory.

Participation in the “Sleep for Success” education program is associated with significant improvement in children’s sleep and academic performance, according to a study published in Sleep Medicine. Using a community-based participatory research approach, researchers composed a program of four modules that addressed the children, their family and community, the school staff, and decision-makers within the school setting. In all, 71 students participated in the evaluation of the program. The effectiveness of the program was evaluated using nonrandomized, controlled before-and-after study groups that were assessed at two time points. In the intervention group, true sleep was extended by 18.2 minutes per night, sleep efficiency improved by 2.3%, and sleep latency was shortened by 2.3 minutes. Report card grades also improved significantly in English and mathematics.

The combination of rosuvastatin, candesartan, and hydrochlorothiazide is associated with a significantly lower rate of cardiovascular events than dual placebo in people who do not have cardiovascular disease, according to a study published online ahead of print April 2 in the New England Journal of Medicine. Researchers randomly assigned 12,705 participants at intermediate risk who did not have cardiovascular disease to 10 mg/day of rosuvastatin or placebo, and to 16 mg/day of candesartan plus 12.5 mg/day of hydrochlorothiazide, or placebo. The decrease in the low-density lipoprotein cholesterol level was 33.7 mg/dL greater in the combined-therapy group than in the dual-placebo group, and the decrease in systolic blood pressure was 6.2 mm Hg greater with combined therapy than with dual placebo.

Right hemisphere white matter integrity is related to speech fluency measures in patients with chronic aphasia, according to a study published online ahead of print March 30 in Neurology. The study included 33 people with an average age of 58 who had a stroke on the left side of their brain. Fractional anisotropy values for the right middle temporal gyrus, precentral gyrus, and pars opercularis significantly predicted speech fluency, but fractional anisotropy values of the pars triangularis and superior parietal lobule did not. A multiple regression analysis showed that combining fractional anisotropy of the significant right hemisphere regions with the lesion load of the left arcuate fasciculus provided the best model for predicting speech fluency. Fractional anisotropy of corpus callosum fibers connecting left and right supplementary motor areas was also correlated with speech fluency.

Differences in white matter microstructure may partially account for the variance in functional outcomes among veterans with combat-related mild traumatic brain injury (mTBI), according to a study published online ahead of print March 29 in Radiology. From 2010 to 2013, an initial postdeployment evaluation, including clinical assessment and brain MRI with diffusion tensor imaging, was performed in combat veterans who sustained mTBI while deployed. Veterans who did and did not return to work were also compared for differences in clinical variables by using t and χ2 tests. After a mean follow-up of 1.4 years, 34 of 57 veterans had returned to work. Cumulative health care visits over time were inversely correlated with diffusion anisotropy of the splenium of the corpus callosum and adjacent parietal white matter.

Cognitive status and intelligibility may be associated with everyday communicative outcomes in Parkinson’s disease, according to a study published online ahead of print March 16 in the Journal of Parkinson’s Disease. Investigators searched five online databases in May 2015 and also conducted supplementary searches. In all, 4,816 records were identified through database searches, and 16 additional records were identified through supplementary searches. Forty-one articles were suitable for full-text screening, and 15 articles met the eligibility criteria. Ten studies assessed the role of cognitive status, and nine found that participants with greater cognitive impairment had greater everyday communication difficulties. Four studies assessed the role of intelligibility, and all found that participants with greater intelligibility impairment had greater everyday communication difficulties. Effects often were weak, however, and not consistent.

Common variants near FOXF2 are associated with increased stroke susceptibility, according to a study published online ahead of print April 7 in the Lancet Neurology. Researchers completed a genome-wide analysis of common genetic variants associated with incident stroke risk in 18 population-based cohorts consisting of 84,961 participants, of whom 4,348 had stroke. Investigators completed validation analyses for variants yielding a significant association with all stroke, ischemic stroke, cardioembolic ischemic stroke, or noncardioembolic ischemic stroke. Study authors replicated seven of eight known loci associated with risk for ischemic stroke and identified a novel locus at chromosome 6p25 associated with risk of all stroke. The rs12204590 stroke risk allele was also associated with increased white matter hyperintensity in adults without stroke. Young patients with segmental deletions of FOXF2 showed extensive white matter hyperintensity.

Young women with musculoskeletal pain complaints may have comorbid sleep problems that require treatment, according to a study published in the April issue of Pain. Researchers investigated the cross-sectional and longitudinal relationship between sleep problems and chronic pain, and musculoskeletal pain, headache, and abdominal pain severity in a general population of adults between ages 19 and 22. They studied whether relationships were moderated by sex and whether symptoms of anxiety and depression, fatigue, or physical inactivity mediated these effects. Follow-up data were collected in 1,753 participants. Sleep problems were associated with chronic pain, musculoskeletal pain, headache, and abdominal pain severity. They also predicted chronic pain and an increase in musculoskeletal pain severity at three years of follow-up. The effect was stronger in females than in males.

Kimberly Williams

References

Issue
Neurology Reviews - 24(5)
Page Number
6-7
Publications
Article Type
Display Headline
New and Noteworthy Information—May 2016
Legacy Keywords
Parkinson's disease, type 1 diabetes, insomnia, Sleep for Success, MRI, TBI
Sections
Article Source

PURLs Copyright

Inside the Article

Parathyroidectomy before kidney transplant may reduce complications

Article Type
Changed
Wed, 01/02/2019 - 09:34
Display Headline
Parathyroidectomy before kidney transplant may reduce complications

BALTIMORE – Performing a parathyroidectomy in kidney transplant patients before their transplant can reduce the risk of graft failure and provide other benefits, the findings of a retrospective study of 913 patients suggest.

Uremic hyperparathyroidism (UHPT) is common in patients with end-stage kidney disease, and elevated parathyroid hormone (PTH) levels have been linked with delayed graft function after kidney transplants, but current guidelines for PTH levels may not go far enough to reduce the risk of graft failure and other post–kidney transplant complications in people with elevated PTH before transplant, according to Dr. Glenda G. Callender of Yale University, New Haven, Conn., and her colleagues.

Dr. Glenda G. Callender

“Uremic hyperparathyroidism was associated with an increased risk of complications in the first year post kidney transplant,” Dr. Callender said at the annual meeting of the American Association of Endocrine Surgeons. “Pre–kidney transplant parathyroidectomy was associated with a decreased risk of post–kidney transplant graft failure. This implies that pre–kidney transplant reduction of PTH levels should be considered in patients with UHPT.”

The Yale researchers reviewed outcomes of 913 patients at their institution who had a kidney transplant from 2005 to 2014. They analyzed biochemical values before kidney transplant and at three intervals after transplant: 1 month, 6 months, and 1 year. Among the outcomes they evaluated were calcium and PTH levels, estimated glomerular filtration rate, complications, delayed graft function, and graft failure. The overall graft survival rate was 97.8% 1 year after kidney transplantation.

Overall, 49.4% of patients (451) had a diagnosis of UHPT before kidney transplant; 6.2% of all patients (57) had parathyroidectomy before kidney transplant and another 2% (18) had parathyroidectomy at some point after their kidney transplant operations. Median baseline PTH levels were higher in the UHPT patients: 206 pg/mL vs. 159 pg/mL for the non-UHPT group, Dr. Callender reported.

The researchers captured complete data on 37 of the 57 patients who had pretransplant parathyroidectomy. Twenty-four (65% of the group) had subtotal parathyroidectomy in which 3.5 glands were removed, and 12 (32%) had fewer than 3.5 glands removed. One patient had total parathyroidectomy, she said.

Among the patients with UHPT, the median pre–kidney transplant PTH was similar between the pretransplant parathyroidectomy and the no-parathyroidectomy groups: 218 pg/mL and 180 pg/mL, respectively, Dr. Callender said.

A pre–kidney transplant diagnosis of UHPT carried an odds ratio of 1.44 for complications in the first year after transplant surgery, but not necessarily a greater risk of delayed graft function or graft failure, she said. However, those relative risks changed with the degree of PTH elevation above normal. Patients with UHPT who had pretransplant parathyroidectomy had a lower risk of graft failure, with an odds ratio of 0.547.
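Because odds ratios are often misread as relative risks, the arithmetic behind them is worth making explicit. The sketch below converts a baseline probability to the probability implied by an odds ratio; the 20% baseline complication rate is a hypothetical figure for illustration, not a number reported in the study.

```python
def apply_odds_ratio(p_baseline, odds_ratio):
    """Return the probability implied by scaling baseline odds by an odds ratio."""
    odds = p_baseline / (1 - p_baseline)  # baseline odds
    new_odds = odds * odds_ratio          # scale odds by the ratio
    return new_odds / (1 + new_odds)      # convert odds back to a probability

# Hypothetical 20% baseline risk of first-year post-transplant complications:
print(round(apply_odds_ratio(0.20, 1.44), 3))   # UHPT diagnosis, OR 1.44
print(round(apply_odds_ratio(0.20, 0.547), 3))  # pretransplant parathyroidectomy, OR 0.547
```

Note that the smaller the baseline probability, the closer an odds ratio comes to the corresponding relative risk; at higher baseline rates the two diverge.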

Current Kidney Disease Improving Global Outcomes (KDIGO) guidelines recommend maintaining PTH levels in patients with UHPT before they have kidney transplant surgery at no more than nine times normal. To test the optimal PTH levels before kidney transplant, the researchers analyzed thresholds ranging from two to nine times the normal limit.

“A pre–kidney transplant [PTH] level greater than or equal to six times normal was associated with post-transplant graft failure but not with delayed graft function or complications in the first year post kidney transplant,” Dr. Callender said. “Although the thresholds at two and four times normal were statistically significant, there was a continued significant risk for graft failure above six times normal.”

This finding “suggests that perhaps the current KDIGO guideline of maintaining patient PTH at up to nine times normal is too liberal,” Dr. Callender said.

She acknowledged several limitations of the study: its retrospective nature, small sample size, and “many missing data points” because a wide variety of dialysis centers with varying documentation standards collected information.

“However,” Dr. Callender said, “we believe these findings support the design and implementation of a multi-institutional, prospective, randomized control trial to evaluate whether a change in management of patients with uremic hyperparathyroidism is warranted.”

Dr. Callender and her coauthors had no financial relationships to disclose.

References

Meeting/Event
Author and Disclosure Information

Publications
Topics


References

Publications
Topics
Article Type
Display Headline
Parathyroidectomy before kidney transplant may reduce complications
Article Source

AT AAES 2016

PURLs Copyright

Inside the Article

Vitals

Key clinical point: Parathyroidectomy reduces graft failure in individuals with uremic hyperparathyroidism (UHPT) who undergo kidney transplant.

Major finding: Pre–kidney transplant diagnosis of UHPT had an odds ratio of 1.44 of complications a year after transplant; patients who had parathyroidectomy before transplant had a reduced 0.547 odds ratio risk of graft failure.

Data source: Review of 913 patients who had kidney transplant from 2005 to 2014 at a single institution.

Disclosures: Dr. Callender and her coauthors reported having no financial disclosures.

Vitamin D supplementation cuts dust mite atopy

Article Type
Changed
Tue, 07/21/2020 - 14:18
Display Headline
Vitamin D supplementation cuts dust mite atopy

BALTIMORE – Three months of daily oral treatment with a relatively high but safe dosage of a vitamin D supplement in pregnant mothers during late gestation, followed by continued oral supplementation of their neonates during the first 6 months of life, led to a significant reduction in the prevalence of dust mite skin reactivity once those children reached 18 months of age, in a randomized, controlled trial of 259 mothers and infants.

And in a preliminary assessment that tallied the number of children who required primary care office visits for asthma through age 18 months, children who had received the highest vitamin D supplementation also showed a statistically significant reduction of these visits, compared with the placebo control children, Dr. Cameron C. Grant reported at the annual meeting of the Pediatric Academic Societies.

Dr. Cameron C. Grant

This suggestion that the vitamin D intervention could cut asthma development is not completely certain because the diagnosis of asthma in 18-month-old children is “very insecure,” noted Dr. Grant, a pediatrician at the University of Auckland, New Zealand, and at Starship Children’s Hospital, also in Auckland. In addition, a limitation of the observed effect on dust mite atopy on skin-test challenge was that this follow-up occurred in only 186 (72%) of the 259 infants who participated in the study.

The study’s premise was that vitamin D is an immune system modulator, and that New Zealand provides an excellent setting to test the hypothesis that normalized vitamin D levels can help prevent development of atopy and asthma because many of the country’s residents are vitamin D deficient due to their diet and sun avoidance to prevent skin cancers. Results from prior studies had shown that 57% of New Zealand neonates have inadequate levels of vitamin D at birth, defined as a serum level of 25-hydroxyvitamin D of less than 20 ng/mL (less than 50 nmol/L), Dr. Grant noted.

“I think this intervention will only work in populations that are vitamin D deficient,” Dr. Grant said in an interview. In his study, the average serum level of 25-hydroxyvitamin D among control neonates was 38 nmol/L (about 15 ng/mL). In contrast, neonates born to mothers who had received a daily, higher-dose vitamin D supplement during the third trimester had serum measures that were roughly twice that level.
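The report mixes the two common units for serum 25-hydroxyvitamin D, ng/mL and nmol/L. A minimal conversion sketch, using the standard factor of 2.496 nmol/L per ng/mL for 25-hydroxyvitamin D, reconciles the figures quoted above:

```python
# Standard conversion factor for serum 25-hydroxyvitamin D: 1 ng/mL = 2.496 nmol/L
NG_ML_TO_NMOL_L = 2.496

def ng_ml_to_nmol_l(ng_ml):
    """Convert a 25-hydroxyvitamin D level from ng/mL to nmol/L."""
    return ng_ml * NG_ML_TO_NMOL_L

def nmol_l_to_ng_ml(nmol_l):
    """Convert a 25-hydroxyvitamin D level from nmol/L to ng/mL."""
    return nmol_l / NG_ML_TO_NMOL_L

print(round(ng_ml_to_nmol_l(20), 1))  # deficiency cutoff: 20 ng/mL, about 50 nmol/L
print(round(nmol_l_to_ng_ml(38), 1))  # control neonates: 38 nmol/L, about 15 ng/mL
```

Both checks match the values in the text: the 20 ng/mL deficiency cutoff corresponds to roughly 50 nmol/L, and the control neonates’ 38 nmol/L corresponds to roughly 15 ng/mL.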

The study enrolled 260 pregnant women from the Auckland area with a single pregnancy at 26-30 weeks’ gestation; average gestational age at baseline was 27 weeks. Dr. Grant and his associates randomized the mothers to receive 1,000 IU of oral vitamin D daily, 2,000 IU of oral vitamin D daily, or placebo. The women delivered 259 infants. Infants born to women on the lower dosage supplement then received 400 IU of vitamin D daily for 6 months, those born to mothers on the higher level supplement received 800 IU of vitamin D daily for 6 months, and those born to mothers in the placebo group received placebo supplements daily for 6 months.

Both supplement regimens led to statistically significant increases in serum levels of 25-hydroxyvitamin D in maternal serum at 36 weeks’ gestation, in cord blood at delivery, in the neonates’ serum at ages 2 months and 4 months, and in infant serum in the higher dosage group at 6 months of age, compared with similar measures taken at all these time points in the placebo group.

In addition, the neonates in the higher dosage group had significantly higher serum levels at 2, 4, and 6 months, compared with the lower dosage group. When measured a final time at 18-month follow-up, a year after the end of vitamin D supplementation, average serum levels of 25-hydroxyvitamin D in an three subgroups of children were virtually identical and similar to maternal serum levels at baseline. Dr. Grant and his associates had previously reported these findings and also had documented the safety of both the low and high levels of vitamin D supplements for both mothers and their children (Pediatrics. 2014 Jan;133[1]:e143-53).

The new findings reported by Dr. Grant focused on clinical outcomes at 18 months. He and his colleagues ran skin-prick testing on 186 of the 259 (72%) children in the study (the remaining children weren’t available for this follow-up assessment). They tested three aeroallergens: cat, pollen, and house dust mite. They saw no significant differences in the prevalence of positive skin-prick reactions among the three study groups to cat and pollen, but prevalence levels of positive reactions to dust mite were 9% in the controls, 3% of children in the low-dosage group, and none in the high dosage group. The difference between the controls and high dosage groups was statistically significant; the difference between the controls and the low dosage group was not significant, Dr. Grant said. Additional testing of specific IgE responses to four different dust mite antigens showed statistically significant reductions in responses to each of the four antigens among the high dosage children, compared with the controls and with the low dosage children.

 

 

The researchers also tallied the number of acute, primary care office visits during the first 18 months of life among the children in each of the three subgroups for a variety of respiratory diagnoses. The three groups showed no significant differences in total number of office visits for most of these diagnoses, including colds, otitis media, croup, and bronchitis. However, about 12% of children in the control group had been seen in a primary care office for a diagnosis of asthma, compared with none of the children in the low dosage group and about 4% in the high-dosage group. The differences between the two intervention groups and the control group were statistically significant. Dr. Grant cautioned that this finding is very preliminary and that any conclusions about the impact of vitamin D supplements on asthma incidence must await studies with larger numbers of children who are followed to an older age.

Dr. Grant had no disclosures.

[email protected]

On Twitter @mitchelzoler

BALTIMORE – Daily oral vitamin D supplementation at a relatively high but safe dosage, given to mothers during the last 3 months of gestation and continued in their infants during the first 6 months of life, led to a significant reduction in the prevalence of dust mite skin reactivity by the time the children reached 18 months of age, in a randomized, controlled trial with 259 mothers and infants.

And in a preliminary assessment that tallied primary care office visits for asthma through age 18 months, children who had received the higher-dose vitamin D supplementation also showed a statistically significant reduction in these visits, compared with the placebo control children, Dr. Cameron C. Grant reported at the annual meeting of the Pediatric Academic Societies.

Dr. Cameron C. Grant

This suggestion that the vitamin D intervention could cut asthma development is not completely certain because the diagnosis of asthma in 18-month-old children is “very insecure,” noted Dr. Grant, a pediatrician at the University of Auckland, New Zealand, and at Starship Children’s Hospital, also in Auckland. In addition, a limitation of the observed effect on dust mite atopy was that the skin-test follow-up occurred in only 186 (72%) of the 259 infants who participated in the study.

The study’s premise was that vitamin D is an immune system modulator, and that New Zealand provides an excellent setting to test the hypothesis that normalized vitamin D levels can help prevent the development of atopy and asthma, because many of the country’s residents are vitamin D deficient due to their diet and to sun avoidance to prevent skin cancers. Results from prior studies had shown that 57% of New Zealand neonates have inadequate levels of vitamin D at birth, defined as a serum level of 25-hydroxyvitamin D of less than 20 ng/mL (less than 50 nmol/L), Dr. Grant noted.

“I think this intervention will only work in populations that are vitamin D deficient,” Dr. Grant said in an interview. In his study, the average serum level of 25-hydroxyvitamin D among control neonates was 38 nmol/L (about 15 ng/mL). In contrast, neonates born to mothers who had received a daily, higher-dose vitamin D supplement during the third trimester had serum measures that were roughly twice that level.
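The two units quoted in these paragraphs (nmol/L and ng/mL) are related by the standard conversion factor for 25-hydroxyvitamin D, 1 ng/mL = 2.496 nmol/L. A minimal helper reproduces the article's figures:

```python
# Convert 25-hydroxyvitamin D concentrations between nmol/L and ng/mL.
# 1 ng/mL = 2.496 nmol/L (the standard factor for 25-hydroxyvitamin D).
NMOL_PER_NG = 2.496

def nmol_to_ng(nmol_per_l: float) -> float:
    """Serum level in ng/mL given a value in nmol/L."""
    return nmol_per_l / NMOL_PER_NG

def ng_to_nmol(ng_per_ml: float) -> float:
    """Serum level in nmol/L given a value in ng/mL."""
    return ng_per_ml * NMOL_PER_NG

# Figures quoted in the article:
print(round(nmol_to_ng(38), 1))  # control neonates: ~15.2 ng/mL ("about 15")
print(round(nmol_to_ng(50), 1))  # deficiency cutoff: ~20.0 ng/mL
```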

The study enrolled 260 pregnant women from the Auckland area, each with a single pregnancy at 26-30 weeks’ gestation; average gestational age at baseline was 27 weeks. Dr. Grant and his associates randomized the mothers to receive 1,000 IU oral vitamin D daily, 2,000 IU oral vitamin D daily, or placebo. The women delivered 259 infants. Infants born to women on the lower-dosage supplement then received 400 IU vitamin D daily for 6 months, those born to mothers on the higher-dosage supplement received 800 IU vitamin D daily for 6 months, and those born to mothers in the placebo group received placebo supplements daily for 6 months.
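For reference, the three-arm maternal and infant dosing schedule just described can be sketched as a small lookup table (the arm labels and field names here are mine, not the trial's):

```python
# Daily vitamin D doses (IU) in the trial's three arms:
# maternal dosing during the third trimester, infant dosing for 6 months.
DOSING = {
    "placebo":   {"maternal_iu": 0,    "infant_iu": 0},
    "low_dose":  {"maternal_iu": 1000, "infant_iu": 400},
    "high_dose": {"maternal_iu": 2000, "infant_iu": 800},
}

def infant_dose(arm: str) -> int:
    """Daily infant dose (IU) during the first 6 months of life."""
    return DOSING[arm]["infant_iu"]
```

Note that in both active arms the infant dose is 40% of the maternal dose.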

Both supplement regimens led to statistically significant increases in serum levels of 25-hydroxyvitamin D in maternal serum at 36 weeks’ gestation, in cord blood at delivery, in the neonates’ serum at ages 2 months and 4 months, and in infant serum in the higher dosage group at 6 months of age, compared with similar measures taken at all these time points in the placebo group.

In addition, the neonates in the higher-dosage group had significantly higher serum levels at 2, 4, and 6 months, compared with the lower-dosage group. When measured a final time at the 18-month follow-up, a year after the end of vitamin D supplementation, average serum levels of 25-hydroxyvitamin D in all three subgroups of children were virtually identical and similar to maternal serum levels at baseline. Dr. Grant and his associates had previously reported these findings and also had documented the safety of both the low and high levels of vitamin D supplementation for both mothers and their children (Pediatrics. 2014 Jan;133[1]:e143-53).

The new findings reported by Dr. Grant focused on clinical outcomes at 18 months. He and his colleagues ran skin-prick testing on 186 of the 259 (72%) children in the study (the remaining children weren’t available for this follow-up assessment), using three aeroallergens: cat, pollen, and house dust mite. They saw no significant differences among the three study groups in the prevalence of positive skin-prick reactions to cat and pollen, but the prevalence of positive reactions to dust mite was 9% among controls, 3% in the low-dosage group, and zero in the high-dosage group. The difference between the control and high-dosage groups was statistically significant; the difference between the control and low-dosage groups was not, Dr. Grant said. Additional testing of specific IgE responses to four different dust mite antigens showed statistically significant reductions in responses to each of the four antigens among the high-dosage children, compared with both the controls and the low-dosage children.
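The article does not name the statistical test behind the dust mite comparison. As an illustration only, if the 186 tested children split roughly evenly across arms (about 62 each), the reported 9% vs. 0% prevalences would correspond to counts of about 6/62 controls vs. 0/62 high-dosage children, and a two-sided Fisher's exact test on those reconstructed counts does come out significant; both the counts and the choice of test are my assumptions:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table no more likely
    than the observed one (the common two-sided definition)."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def prob(x):  # P(X = x) under the hypergeometric null
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = prob(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# Reconstructed counts (assumed equal arm sizes of 62 tested children):
# 6 dust-mite-positive controls vs. 0 in the high-dosage group.
p = fisher_exact_two_sided(6, 56, 0, 62)
print(f"p = {p:.3f}")  # below the conventional 0.05 threshold
```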

The researchers also tallied the number of acute, primary care office visits for a variety of respiratory diagnoses during the first 18 months of life among the children in each of the three subgroups. The three groups showed no significant differences in the total number of office visits for most of these diagnoses, including colds, otitis media, croup, and bronchitis. However, about 12% of children in the control group had been seen in a primary care office for a diagnosis of asthma, compared with none of the children in the low-dosage group and about 4% in the high-dosage group. The differences between each of the two intervention groups and the control group were statistically significant. Dr. Grant cautioned that this finding is very preliminary and that any conclusions about the impact of vitamin D supplements on asthma incidence must await studies with larger numbers of children followed to an older age.

Dr. Grant had no disclosures.

[email protected]

On Twitter @mitchelzoler

Display Headline
Vitamin D supplementation cuts dust mite atopy
AT THE PAS ANNUAL MEETING


Vitals

Key clinical point: Maternal treatment to achieve adequate vitamin D levels during late gestation followed by neonatal vitamin D supplementation significantly cut dust mite atopy at 18 months of age, along with a suggestion of reduced asthma incidence.

Major finding: Dust mite reactivity at 18 months occurred in no children treated with higher vitamin D supplementation and in 9% of controls.

Data source: A randomized, controlled, single-center study with 260 pregnant women who delivered 259 infants.

Disclosures: Dr. Grant had no disclosures.