Jeff Evans has been editor of Rheumatology News/MDedge Rheumatology and the EULAR Congress News since 2013. He started at Frontline Medical Communications in 2001 and was a reporter for 8 years before serving as editor of Clinical Neurology News and World Neurology, and briefly as editor of GI & Hepatology News. He graduated cum laude from Cornell University (New York) with a BA in biological sciences, concentrating in neurobiology and behavior.
Small Reduction in Fractures, Falls With Enough Vitamin D
ARLINGTON, VA. — People who take sufficiently high doses of vitamin D supplements, or who already have adequate vitamin D levels, have a small but significantly reduced risk of certain fractures, falls, and low bone mineral density, according to an Agency for Healthcare Research and Quality report on the effect of vitamin D supplements on bone health outcomes.
Dr. Ann B. Cranney and her associates at the University of Ottawa Evidence-Based Practice Center extensively reviewed the literature on the effects of 25-hydroxyvitamin D (25[OH]D) concentration or vitamin D supplementation. She presented the results of meta-analyses of studies that met the eligibility criteria at a conference sponsored by the American Society for Bone and Mineral Research.
It was not possible to quantitatively summarize the results of 10 randomized controlled trials or 31 observational studies that examined the effect of 25(OH)D levels on bone health outcomes in postmenopausal women and older men, so Dr. Cranney and her colleagues categorized the evidence supporting the effect of the vitamin D metabolite as good, fair, or inconsistent. For serum 25(OH)D levels of at least 50–80 nmol/L, there was good evidence of an association with increased bone mineral density in the hip, fair evidence of an inverse association with the risk of hip fracture, and inconsistent evidence of an association with a reduction in falls and functional measures such as grip strength and body sway.
In 74 randomized controlled trials of supplementation with vitamin D3 or D2, the investigators found that 25(OH)D levels increased more with supplementation with vitamin D3 than with vitamin D2. Data from 16 randomized controlled trials provided enough information on 25(OH)D levels in control and treatment groups at baseline and at the end of the study to determine that supplementation with 700 IU/day or more of vitamin D3 was associated with a drop in serum parathyroid hormone levels.
The investigators also calculated from the trials that each 1 IU of vitamin D3 raises the serum 25(OH)D concentration by 0.016 nmol/L.
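As a rough illustration of how that conversion factor can be applied, the sketch below computes the serum rise one might expect from a given daily dose. This is a minimal back-of-the-envelope calculation assuming the pooled dose-response is linear, as the investigators' estimate implies; individual responses vary.

```python
# Sketch: expected rise in serum 25(OH)D from daily vitamin D3 supplementation,
# using the pooled estimate of 0.016 nmol/L per IU reported by the investigators.
# Assumes a linear dose-response; this is illustrative, not clinical guidance.

NMOL_PER_IU = 0.016  # nmol/L increase in serum 25(OH)D per IU of vitamin D3


def expected_25ohd_rise(daily_iu: float) -> float:
    """Return the expected rise in serum 25(OH)D (nmol/L) for a daily dose in IU."""
    return daily_iu * NMOL_PER_IU


# The 700 IU/day dose discussed in the trials:
print(expected_25ohd_rise(700))  # ≈ 11.2 nmol/L
```

On this linear assumption, 700 IU/day corresponds to a rise of roughly 11 nmol/L, which helps explain why higher doses were needed to move participants above thresholds such as 74 nmol/L.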
Trials that used supplements with either vitamin D3 or vitamin D2 did not show a significant effect on reducing the risk of fractures overall or of hip fractures in particular. Also, supplementation with vitamin D plus calcium or with vitamin D alone did not have a significant effect on the risk of nonvertebral fractures. But in eight trials, supplements of 700 IU/day or more of vitamin D3 significantly reduced the risk of nonvertebral fractures by 15%. This risk reduction was driven primarily by two trials of individuals in an institutional setting, who had a 22% reduction in the risk of nonvertebral fractures. Supplements of 700 IU/day or more of vitamin D3 also significantly lowered the risk of hip fractures; trials in an institutional setting, rather than in the community, factored strongly in the overall results, she noted.
The investigators also found that participants in vitamin D3 supplementation trials that achieved serum 25(OH)D concentrations of 74 nmol/L or higher had a significant 23% lower risk of nonvertebral fractures than did participants in trials that did not reach that level.
Vitamin D supplements did not reduce the risk of falls overall in 12 trials. But vitamin D supplements significantly lowered the risk of a fall by 11% in six trials in which falls were defined or independently ascertained, Dr. Cranney said.
The Agency for Healthcare Research and Quality requested the report on behalf of the National Institutes of Health Office of Dietary Supplements.
Genetic Variant Tied to Amyloid-β in Alzheimer's
Genetic variants of a protein involved in determining the fate of amyloid precursor protein are associated with an increased risk of developing Alzheimer's disease, reported Dr. Ekaterina Rogaeva of the University of Toronto and her associates.
The increased risk for the disease appears to be caused by certain haplotypes of the SORL1 gene that decrease the expression of the gene. As a result, more amyloid precursor protein follows a pathway in which excess amyloid-β peptide is produced in the brain—one of the central events in the pathogenesis of Alzheimer's disease (AD), according to the investigators.
Dr. Samuel E. Gandy, director of the Farber Institute for Neurosciences at Thomas Jefferson University, Philadelphia, said the study's results “fit well into the amyloid model for Alzheimer's, and that's certainly the one that's getting the most attention and most assessment clinically.”
Dr. Rogaeva and her colleagues found that several overlapping haplotypes in two different regions of the SORL1 gene increased the likelihood of developing late-onset familial Alzheimer's disease (FAD), based on results obtained from two cohorts of families with late-onset FAD and later replicated in a cohort of cases and controls in other studies.
“Taken together, our results suggest that genetic and possibly environmentally specified changes in SORL1 [protein] expression or function are causally linked to the pathogenesis [of Alzheimer's disease] and have a modest effect on risk for this disease,” the researchers reported (Nat. Genet. 2007 Jan. 14 [Epub doi:10.1038/ng1943]).
The initial “discovery cohort” comprised 124 northern European FAD families and 228 Caribbean Hispanic FAD families.
The “replication cohort” consisted of northern European individuals from a case-control study (178 cases with sporadic AD and 242 controls with self-identified white European ancestry), 276 white sibships from the Multi-Institutional Research in Alzheimer's Genetic Epidemiology (MIRAGE) study, 238 African-American sibships from the MIRAGE study, and Israeli-Arab individuals (111 with AD and 114 normal controls from the Wadi Ara population study).
The researchers confirmed the association between AD and the SORL1 gene by genotyping the single-nucleotide polymorphisms that were contained in the haplotypes and then analyzing them at an independent facility in three series of cases and controls of European ancestry from different Mayo Clinic centers (totaling 1,405 late-onset AD cases and 2,124 controls).
In genetic studies, particularly those involving Alzheimer's disease, there has “been an issue of one group making a report and then a number of other groups being unable to replicate [the results] across different ethnic groups,” Dr. Gandy said in an interview. “The good thing about this paper is that they've already tested several totally independent ethnic groups, so you can feel a bit more confident that this is true.”
SORL1 protein directly binds amyloid precursor protein and differentially regulates whether it sorts into a recycling pathway or into a pathway that generates amyloid-β.
Experiments that suppressed SORL1 protein expression—mimicking what is speculated to be the effects of AD-associated variants in the SORL1 gene—led to an overproduction of amyloid-β.
The actual disease-causing variants of the SORL1 gene are unlikely to be the single-nucleotide polymorphisms and haplotypes that were identified in the SORL1 gene's exons, the researchers noted. Instead, the pathogenic variants are likely located in sequences in the introns of the SORL1 gene and may “modulate the cell type-specific transcription or translation of the SORL1 gene in carriers of the Alzheimer's disease-associated haplotypes,” the investigators said. “This hypothesis would be supported by the recent observation of reduced expression of SORL1 protein in neurons but not glia of some individuals with sporadic Alzheimer's disease.”
One of the disease-associated haplotypes of the SORL1 gene was expressed in AD haplotype carriers at less than half the level seen in carriers of nondisease haplotypes. But univariate regression analyses showed that the disease variants of the SORL1 gene accounted for only about 14% of the variance in SORL1 protein expression seen in those individuals.
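The 14% figure is a variance-explained (R²) statistic from univariate regression. The toy sketch below shows what that quantity measures; the numbers are invented for illustration and are not the study's data.

```python
# Toy illustration of "variance explained" (R^2) in a univariate regression,
# the statistic behind the ~14% figure. The data points below are invented
# for clarity; they are NOT the study's measurements.


def r_squared(x, y):
    """Ordinary least-squares R^2 for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot


# Hypothetical predictor scores vs. hypothetical expression levels:
print(r_squared([0, 1, 2, 3], [1, 2, 2, 3]))  # ≈ 0.9
```

A value of 0.14 on this scale would mean the genotype accounts for 14% of the variability in expression, leaving the rest to other factors, as the authors note.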
“This latter result implies that other genetic and nongenetic factors can also modulate SORL1 [protein] expression and, perhaps, therefore, risk for Alzheimer's disease,” the researchers said.
Although variants of the SORL1 gene may not raise the risk of AD as much as the apolipoprotein E ϵ4 allele does, Dr. Gandy noted that the results point to a new target for drug therapy: treatments that raise SORL1 protein levels.
“We never know when we're going to encounter side effects, so it's good to have multiple possible targets,” he said.
PTH Response Could Explain Racial Differences in Osteo-Related Fractures
ARLINGTON, VA. — African Americans may have a lower rate of osteoporosis-related fractures than whites because of adaptations in calcium homeostasis, bone turnover and resorption, and response to parathyroid hormone, Dr. Felicia Cosman said at a conference sponsored by the American Society for Bone and Mineral Research.
It is “very surprising” that at all ages, black individuals have a lower rate of fractures and higher bone mineral density (BMD) than white individuals, even though blacks generally have higher rates of vitamin D deficiency or insufficiency, said Dr. Cosman, medical director of the clinical research center at Helen Hayes Hospital, West Haverstraw, N.Y.
Mean serum levels of 25-hydroxyvitamin D [25(OH)D] are generally lower in blacks than in whites at all ages and in both sexes. This is the result of reduced skin production of vitamin D and a lower dietary intake of vitamin D, Dr. Cosman said.
An alteration in the vitamin D-endocrine system in blacks was first proposed by Dr. Norman Bell; it was based on evidence that blacks have a greater prevalence of vitamin D deficiency and relative secondary hyperparathyroidism, lower levels of bone turnover, and increased urinary calcium retention as an adaptive means to maintain calcium homeostasis without sacrificing the skeleton (J. Clin. Invest. 1985;76:470–3).
In many studies, parathyroid hormone (PTH) levels are higher, on average, in blacks than in whites. The PTH levels found in blacks occur within the context of low calcium intake in addition to low 25(OH)D levels, which may be related to “real or perceived” lactose intolerance, Dr. Cosman said. As a consequence of high PTH levels, blacks have generally had higher 1,25-dihydroxyvitamin D [1,25(OH)2D] levels than whites.
“We would expect that with higher 1,25(OH)2D levels, you would see greater [dietary] calcium absorption in black individuals compared to whites,” but studies have reported inconsistent data, many of which have shown no significant interracial differences, she said.
One also would expect blacks to have higher bone turnover levels because of high PTH levels, but in general this has not been true, Dr. Cosman said. However, nearly all studies of the kidney have found that blacks have lower urinary calcium excretion than whites.
In addition, supplementation with 1,25(OH)2D has been shown to cause a significantly greater decrease in urinary calcium excretion in blacks than in whites. Markers of bone formation also increased more among blacks than among whites, whereas bone resorption indices showed no racial differences (Osteoporos. Int. 2000;11:271–7). In a separate study, administration of PTH also caused blacks to retain urinary calcium to a greater degree than whites, but it did not cause any racial differences in bone formation markers. After PTH administration, blacks also did not have as great an increase in bone resorption markers (J. Bone Miner. Res. 1997;12:958–66). This finding directly supports “the hypothesis that the black skeleton could be resistant to the acute bone resorptive effects of PTH,” she said.
Studies of histomorphometric differences in bone have shown significantly reduced bone formation rates and a longer total bone formation period in blacks, compared with whites. The results of those studies are consistent with evidence that blacks have a lower level of serum osteocalcin—which has been the most sensitive indicator of a racial difference in bone turnover levels—and that blacks respond more slowly to bone remodeling therapies.
“The bottom line message … for these measurements is that in a relative secondary hyperparathyroid state you really expect to see high [bone] turnover,” Dr. Cosman said. “We never see that. We see either the same or, in most cases, lower turnover in blacks.”
Mean serum levels of 25-hydroxyvitamin D [25(OH)D] are generally lower in blacks than in whites at all ages and in both sexes, the result of reduced skin production of vitamin D and a lower dietary intake of vitamin D, Dr. Cosman said.
An alteration in the vitamin D-endocrine system in blacks was first proposed by Dr. Norman Bell; it was based on evidence that blacks have a greater prevalence of vitamin D deficiency and relative secondary hyperparathyroidism, lower levels of bone turnover, and increased urinary calcium retention as an adaptive means to maintain calcium homeostasis without sacrificing the skeleton (J. Clin. Invest. 1985;76:470–3).
In many studies, parathyroid hormone (PTH) levels are higher, on average, in blacks than in whites. The PTH levels found in blacks occur within the context of low calcium intake in addition to low 25(OH)D levels, which may be related to “real or perceived” lactose intolerance, Dr. Cosman said. As a consequence of high PTH levels, blacks have generally been found to have higher 1,25-dihydroxyvitamin D [1,25(OH)2D] levels than whites.
“We would expect that with higher 1,25(OH)2D levels, you would see greater [dietary] calcium absorption in black individuals compared to whites,” but studies have reported inconsistent data, many of which have shown no significant interracial differences, she said.
One also would expect blacks to have higher bone turnover levels because of high PTH levels, but in general this has not been true, Dr. Cosman said. However, nearly all studies of the kidney have found that blacks have lower urinary calcium excretion than whites.
In addition, administration of 1,25(OH)2D has been shown to cause a significantly greater decrease in urinary calcium excretion in blacks than in whites. Markers of bone formation also increased more among blacks than among whites, whereas bone resorption indices showed no racial differences (Osteoporos. Int. 2000;11:271–7). In a separate study, administration of PTH also caused blacks to retain urinary calcium to a greater degree than it did in whites, but it did not cause any racial differences in bone formation markers. After receipt of PTH, blacks also did not have as great an increase in bone resorption markers (J. Bone Miner. Res. 1997;12:958–66). This finding directly supports “the hypothesis that the black skeleton could be resistant to the acute bone resorptive effects of PTH,” she said.
Studies of histomorphometric differences in bone have shown significantly reduced bone formation rates and a longer total bone formation period in blacks, compared with whites. The results of those studies are consistent with evidence that blacks have a lower level of serum osteocalcin—which has been the most sensitive indicator of a racial difference in bone turnover levels—and that blacks respond more slowly to bone remodeling therapies.
“The bottom line message … for these measurements is that in a relative secondary hyperparathyroid state you really expect to see high [bone] turnover,” Dr. Cosman said. “We never see that. We see either the same or, in most cases, lower turnover in blacks.”
Twin Study Data Show Heritability of Knee OA
WASHINGTON — Nearly half of the effects that contribute to knee osteoarthritis can be explained through heritable traits, Guangju Zhai, Ph.D., reported at the annual meeting of the American College of Rheumatology.
In a group of 114 monozygotic and 195 dizygotic twin pairs (all white females) from the Twins UK registry, heritability accounted for 49% of the total variance of joint space narrowing in the knee and for 47% of osteophytes. The results did not change substantially after adjustments were made for age and body mass index, said Dr. Zhai of St. Thomas' Hospital, London. At baseline and at a follow-up of about 7 years, anteroposterior radiographs of the patients' knees were obtained with the knees in extension and the patients bearing weight. A full lower-limb x-ray was obtained at the follow-up visit.
About 20% of the patients had joint space narrowing and osteophytes at baseline. About 30% had progression of joint space narrowing or osteophytes at follow-up.
Genetic effects explained 65% of the total variance in knee alignment. Statistical analyses showed that the heritability estimate for knee alignment remained the same after adjustments were made for the presence of knee osteoarthritis, suggesting that knee alignment and knee osteoarthritis are unlikely to share common genetic control.
The heritability of the progression of joint space narrowing or osteophytes appeared to be stronger than heritability was for the mere presence of either sign. About 80% of the variance in the progression of joint space narrowing was accounted for by heritable traits, while osteophyte progression was 62% heritable. Adjustments for age and BMI did not change the heritability of joint space narrowing progression, but decreased the heritability of osteophyte progression to 50%, Dr. Zhai said.
Many Atherosclerotic Risk Factors May Go Untreated in Lupus
WASHINGTON — Awareness of the increased risk of premature atherosclerosis in lupus patients may be rising, but even experts in lupus treatment are inadequately treating patients with known risk factors for the condition, Dr. Murray B. Urowitz reported at the annual meeting of the American College of Rheumatology.
Dr. Urowitz, director of the Centre of Prognosis Studies in the Rheumatic Diseases at Toronto Western Hospital, and his colleagues in the Systemic Lupus International Collaborating Clinics (SLICC), a group of 30 investigators located at 27 centers around the world, conducted an analysis of 935 SLE patients who had been enrolled in the multicenter registry within 15 months of diagnosis during 2000–2006. Follow-up data at 3 years were available for 278 patients. These patients had a mean SLE disease activity index-2k (SLEDAI-2k) score of 5.49 at enrollment and an adjusted mean SLEDAI-2k over 3 years of 4.94.
Of 101 patients who had hypercholesterolemia at enrollment, only 25 (25%) received treatment for the condition. After 3 years, 167 patients had ever had hypercholesterolemia, but only 63 (38%) had received treatment.
In comparison, the percentage of hypertensive patients who received treatment increased from enrollment (87 of 109 [80%]) to the 3-year follow-up (144 of 162 [89%]) even though the prevalence of hypertension increased.
Other risk factors for coronary artery disease increased in prevalence during the 3 years, including the percentage of patients who currently or ever had smoked (from 14% to 19% and from 37% to 42%, respectively), the percentage of patients who reported a family history of coronary artery disease (from 18% to 25%), as well as the percentage of those with diabetes mellitus or those who had become postmenopausal.
Risk factors relating to body composition also increased during follow-up, such as the percentage of patients with a body mass index in the overweight or obese range (from 31% to 46%), a waist:hip ratio greater than 0.8 (from 32% to 55%), and low physical activity (from 37% to 55%). Since enrollment, more patients had taken corticosteroids (from 71% to 79%), antimalarials (from 60% to 77%), or immunosuppressives (from 38% to 59%).
“All risk factors increased in prevalence over 3 years, so you're not off the hook when they start [treatment],” he said.
Predictors of Atherosclerosis Progression Identified in SLE
WASHINGTON — Atherosclerosis is more likely to progress in systemic lupus erythematosus patients when lupus is diagnosed at older ages, when it has existed for a long duration, and when high homocysteine levels are present, according to new research presented at the annual meeting of the American College of Rheumatology.
Other research at the meeting suggested that the risk that women with SLE will develop carotid artery plaque may be determined by the presence of proinflammatory high-density lipoprotein (piHDL) cholesterol.
“We really don't know the rate and determinants of progression of carotid plaque in lupus,” said Dr. Mary J. Roman of Cornell University, New York.
She and her colleagues used serial ultrasound scans of the distal carotid artery and clinical assessments to evaluate the progression of atherosclerosis in 159 patients in the Hospital for Special Surgery's SLE registry.
After an average follow-up of 34 months, 28% of the patients had either developed first-time atherosclerotic plaque in the carotid since their baseline assessment or showed an increase in existing plaque since baseline. That is equivalent to progression of atherosclerosis in about 10% of SLE patients per year, Dr. Roman said.
“We may use this observed rate of atherosclerosis progression in assessing the efficacy in future intervention trials,” she said.
Patients without plaque progression were significantly younger than those with progressive plaque buildup, both at baseline (mean age, 36 years vs. 50 years) and at diagnosis (mean, 21 years vs. 36 years), and had lower serum homocysteine levels at baseline. Patients without progression of atherosclerosis also tended to have received more aggressive treatment of their disease than those who experienced progression.
For each 10-year increase in either age at diagnosis or disease duration, patients were about three times more likely to have plaque progression than no plaque or stable plaque. Progression of atherosclerosis occurred in 56% of patients in the highest tertile of baseline serum homocysteine levels (7.9 mcmol/L or greater).
“Other than older age, traditional risk factors were not associated with progression of atherosclerosis,” Dr. Roman noted.
In a separate presentation, Dr. Maureen McMahon of the department of rheumatology at the University of California at Los Angeles reported preliminary results from an ongoing study that suggest the development of carotid artery plaque in women with SLE is associated with the presence of piHDL.
After conducting B-mode ultrasound screening of the carotid artery and taking blood samples of women with SLE and healthy control women, Dr. McMahon and her colleagues found that 42 of 95 (44%) women with SLE had piHDL, compared with 3 of 52 (6%) age-matched control women. Significantly more SLE patients with carotid plaque had piHDL than did SLE patients without plaque (93% vs. 38%). But there was no significant difference in the presence of piHDL between control patients with and without plaque.
Oxidized low-density lipoprotein (LDL) cholesterol directly and indirectly promotes the production of inflammatory cytokines, the migration of monocytes into the subendothelial space of vessels, and the formation of macrophages that take up the oxidized LDL cholesterol and become foam cells that build an atherosclerotic plaque. Normal HDL cholesterol helps to reduce the effect of oxidized LDL cholesterol by promoting cholesterol efflux from cells and by inhibiting the oxidation of LDL cholesterol. During periods of acute inflammation, HDL cholesterol may become proinflammatory and unable to perform its usual protective function, Dr. McMahon explained.
SLE patients with piHDL were 25 times more likely than control patients to have plaque after controlling for the traditional cardiovascular risk factors of hypertension, elevated LDL cholesterol, age, body mass index, diabetes, and high-sensitivity C-reactive protein.
“Measurement of piHDL may be one tool to identify [SLE] patients at risk for the development of atherosclerosis,” Dr. McMahon concluded.
The SLE patients had a mean age of about 43 years and were not selected for a history of cardiovascular disease. None of the patients had taken statins within 6 months before entry into the study.
Specific Changes Distinguish Atypical, Incomplete Kawasaki's
WASHINGTON — Atypical and incomplete Kawasaki disease may be distinguished from other common childhood febrile illnesses by characteristic changes in the extremities, mucosa, and conjunctiva, as well as by characteristic blood laboratory values, Dr. Fernanda Falcini reported at a poster session of the annual meeting of the American College of Rheumatology.
In a chart review of 1,499 children discharged from the hospital with a diagnosis of Kawasaki disease (KD), 225 (15%) did not fulfill the Centers for Disease Control and Prevention's case definition criteria of KD. The CDC identifies KD patients as those having four of the following five clinical signs: rash, cervical lymphadenopathy of at least 1.5 cm in diameter, bilateral conjunctival injection, oral mucosal changes, and peripheral extremity changes.
Of those 225 patients, 172 had incomplete KD (median age 21 months) and 53 had atypical disease (median age 50 months), according to Dr. Falcini of the rheumatology unit in the department of pediatrics at the University of Florence (Italy).
Patients with incomplete KD did not meet all of the CDC case definition criteria, whereas atypical KD referred to patients who also had a clinical feature that generally is not seen in KD.
Lip and oral redness, skin extremity changes, and nonexudative conjunctivitis occurred significantly more often among children with incomplete or atypical KD than in 55 children who had other febrile illnesses that mimic KD. These other illnesses were cytomegalovirus (in 21 children), adenovirus (16), systemic juvenile idiopathic arthritis (12), Epstein-Barr virus (5), and staphylococcal scalded skin syndrome (1).
The erythrocyte sedimentation rate and total platelet count of children with incomplete and atypical KD also were significantly higher than in children with other KD-mimicking illnesses. But children with febrile diseases other than KD were significantly more likely to have lymphadenopathy than were those with incomplete or atypical KD.
Coronary artery abnormalities, including dilatation and aneurysms, were detected only in patients with incomplete KD (in 47) or atypical KD (in 15), reported Dr. Falcini.
Treat Atherosclerosis Risks in Lupus Patients : Many risk factors were present within 1 year of diagnosis and increased in prevalence over 3 years.
WASHINGTON — Awareness of the increased risk of atherosclerosis in patients who have lupus may be rising, but even experts in lupus treatment are not adequately treating patients who have known risk factors for the condition, Dr. Murray B. Urowitz reported at the annual meeting of the American College of Rheumatology.
Premature atherosclerosis in patients with systemic lupus erythematosus (SLE) may develop as a result of a combination of disease- and therapy-related factors, classic coronary artery disease risk factors, and genetic factors. Many of these factors are present within the first year after diagnosis of SLE, said Dr. Urowitz, director of the Centre of Prognosis Studies in the Rheumatic Diseases at Toronto Western Hospital.
He and his colleagues in the Systemic Lupus International Collaborating Clinics (SLICC), a group of 30 investigators located at 27 centers around the world, conducted an analysis of 935 SLE patients who had been enrolled in the multicenter registry within 15 months of diagnosis during 2000–2006.
Follow-up data at 3 years were available for 278 patients. These patients had a mean SLE Disease Activity Index 2000 (SLEDAI-2K) score of 5.49 at enrollment and an adjusted mean SLEDAI-2K over 3 years of 4.94.
Of 101 patients who had hypercholesterolemia at enrollment, 25 received treatment for the condition. After 3 years, 167 patients had ever had hypercholesterolemia, but only 63 (38%) had received treatment.
“For some unknown reason … there is some reluctance to begin therapy with cholesterol-lowering medications in our patients,” Dr. Urowitz said at the meeting.
“We can no longer say that we are busy looking at the initial treatment of patients with lupus. This is now 3 years into the illness,” he said. These findings are coming from “the 'august' SLICC group who call themselves 'lupologists.'”
In comparison, the percentage of hypertensive patients who received treatment increased from enrollment (87 of 109 [80%]) to the 3-year follow-up (144 of 162 [89%]) even though the prevalence of hypertension increased.
Other risk factors for coronary artery disease increased in prevalence during the 3 years: the percentage of patients who currently smoked or had ever smoked (from 14% to 19% and from 37% to 42%, respectively), the percentage who reported a family history of coronary artery disease (from 18% to 25%), and the percentages who had diabetes mellitus or had become postmenopausal.
Risk factors relating to body composition also increased during follow-up, such as the percentage of patients with a body mass index in the overweight or obese range (from 31% to 46%), a waist-to-hip ratio greater than 0.8 (from 32% to 55%), and low physical activity (from 37% to 55%). Since enrollment, more of the patients had taken corticosteroids (from 71% to 79%), antimalarials (from 60% to 77%), or immunosuppressives (from 38% to 59%).
“All risk factors increased in prevalence over 3 years, so you're not off the hook when they start [treatment]; this doesn't tell the whole story. You must continue to follow up patients,” said Dr. Urowitz, professor of medicine at the University of Toronto.
Research Into Seizure Prediction Devices Advances : No adverse events have been reported in the two ongoing phase III implanted device studies.
CHICAGO — Ongoing clinical trials for two implanted devices designed to interrupt or predict seizures herald an area of clinical research that has quickly gained ground during the last 5 years, Dr. Brian Litt said at the annual meeting of the American Neurological Association.
Research into seizure prediction, most of which has occurred in the past 15 years, has been “very controversial,” mostly because of people getting too excited about findings very early on, said Dr. Litt of the departments of neurology and bioengineering at the University of Pennsylvania, Philadelphia.
Early studies were plagued by overreliance on abstract functions rather than on clinical physiological parameters, and they lacked statistical rigor. In addition, their databases were biased toward seizures because much of the data came from inpatients who had many seizures during their hospital stays.
“Those data are not what it's like to live with epilepsy; you might have one seizure a month, you might have four a month. But clearly the preponderance of the data is interictal,” he said.
A data set heavily enriched with seizures makes it much more likely that attempts to predict seizures at broad intervals will, in fact, detect a seizure. This made it impossible to reproduce the claims of seizure prediction that were announced in early studies.
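This evaluation pitfall can be illustrated with a toy simulation (all of the numbers below are hypothetical): a "predictor" that flags hours entirely at random looks far more accurate on seizure-enriched inpatient data than at a realistic outpatient seizure rate.

```python
import random

random.seed(0)

HOURS = 240  # a hypothetical 10-day recording
# A skill-free "predictor": 24 alarm hours chosen at random.
FLAGGED = set(random.sample(range(HOURS), HOURS // 10))

def apparent_precision(n_seizures):
    """Fraction of alarm hours that happen to contain a seizure."""
    seizure_hours = set(random.sample(range(HOURS), n_seizures))
    return len(FLAGGED & seizure_hours) / len(FLAGGED)

# Average over many draws: seizure-dense inpatient data vs. outpatient reality.
dense = sum(apparent_precision(40) for _ in range(1000)) / 1000
sparse = sum(apparent_precision(2) for _ in range(1000)) / 1000
print(f"apparent precision, 40 seizures/10 days: {dense:.3f}")
print(f"apparent precision,  2 seizures/10 days: {sparse:.3f}")
```

With no real predictive signal, the expected "precision" is simply the seizure density of the data set (roughly 0.17 vs. 0.008 here), which is one reason claims validated only on seizure-dense inpatient recordings failed to reproduce.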
“We also found that listening to patients was really important,” Dr. Litt said, because many patients tell their physicians that sometimes, hours or days before seizure onset, they have a feeling, or prodrome, that tells them they are likely to have a seizure. Yet these patients may or may not then go on to have one.
“The model [for predicting seizures] has to account for this,” he said.
These lessons taught Dr. Litt and his colleagues that they were very unlikely to predict an exact seizure, but that it was likely they could identify periods of time in which the probability of a seizure's occurring is greatly increased.
No efficacy data are yet available for the two devices that are being tested in phase III trials, but no adverse events have occurred.
In Medtronic Inc.'s Stimulation of the Anterior Nucleus of the Thalamus for Epilepsy (SANTE) trial, about 150 adult patients with medically refractory partial-onset epilepsy will receive the Intercept Epilepsy Control System.
The implanted device, which bilaterally stimulates the anterior nucleus of the thalamus but does not sense or respond to EEG activity, will be turned on in some patients but not in others during the trial's double-blind phase. Medtronic decided to continue the SANTE trial after it recently passed its midterm analysis, according to Dr. Litt.
The Responsive Neurostimulator system from NeuroPace Inc. will be tested in about 240 adult patients to determine if it can reduce the frequency of medically uncontrolled and disabling partial-onset seizures.
All of the patients will be implanted with the device, which scans EEG recordings for particular patterns associated with seizure onset or impending seizures, and then stimulates epileptogenic foci through intracranial electrodes. Only some patients will have the device turned on during the double-blind phase of the randomized trial.
In a safety study of about 50 patients with more than four seizures per month who were implanted with the Responsive Neurostimulator, 43% of those with complex partial seizures and 35% of those with disabling motor seizures had a 50% or greater reduction in seizures, Dr. Litt said.
“Is this a home run? No. Does it mean that it's effective? No. Does it mean that there's proof of principle enough to perhaps go forward? I think it does,” Dr. Litt said.
“Remember, this is a first-generation device. Judge this as a work in progress, like the first pacemaker,” he added.
Dr. Litt has contributed patents through the University of Pennsylvania for NeuroPace's Responsive Neurostimulator device. He is a consultant to BioNeuronics Corp., and he helped to found BioQuantix Corp. through the University of Pennsylvania.
Major questions still remain in understanding and mapping epileptic networks in the brain, such as where to place electrodes, where to sense seizure onset, and where to stimulate the brain. Researchers also want to know how seizures are generated over time.
To answer these questions, Dr. Litt and his associates have examined seizures in patients with Responsive Neurostimulator devices, which save about a minute of data prior to stimulation and also for a short period afterward.
Analyses of the 2-second period before a seizure began in thousands of events distinguished between effective and ineffective types of stimulation. For particular stereotyped seizure onsets, the researchers used specific characteristics of synchrony, frequency of activity, and the relationship between the stimulus and the seizure waveform to determine if stimulation would be effective or not.
“The bottom line is that seizures in which stimulation is not effective are ones that are likely more evolved or perhaps began in a different place in the network and spread to these regions before the stimulation occurred,” he said.
Although Dr. Litt's model for seizure generation has not been statistically proven, his group's research suggests that seizures “may occur in a reproducible cascade of events” in which there are periods of increased complex epileptiform activity in the hours or days before a seizure, followed in the 2 hours before the seizure by short seizurelike bursts of activity, or “seizlets,” that last 1–5 seconds. These seizlets appear to build exponentially as the seizure approaches and activity ramps up.
To prove that this cascade of events exists, the investigators have built detectors that can quantitatively detect seizures in large chunks of data. When seizure and nonseizure events are mixed up and randomized, the two events can be distinguished with a certain latency, which increases as the likelihood of correctly predicting a seizure event increases, he said.
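A minimal sketch of that kind of evaluation on synthetic data (the detector, thresholds, and signals here are invented for illustration and are not the group's actual algorithm): seizure and nonseizure clips are shuffled together, and the detector reports how long after onset each true event is caught.

```python
import random
import numpy as np

rng = np.random.default_rng(2)
FS = 200  # sampling rate in Hz (illustrative)

def make_clip(is_seizure, seconds=10, onset=4):
    """Synthetic one-channel clip: noise, plus a rhythmic burst if a 'seizure'."""
    x = rng.standard_normal(seconds * FS)
    if is_seizure:
        t = np.arange((seconds - onset) * FS) / FS
        x[onset * FS:] += 4 * np.sin(2 * np.pi * 12 * t)  # 12-Hz burst at onset
    return x

def detect(clip, threshold=2.0, window=FS // 2):
    """Seconds from clip start to the end of the first high-RMS window, or None."""
    for start in range(0, len(clip) - window + 1, window):
        if np.sqrt(np.mean(clip[start:start + window] ** 2)) > threshold:
            return (start + window) / FS
    return None

# Mix seizure and nonseizure events, then score the detector on the shuffle.
clips = [(make_clip(True), True) for _ in range(10)] + \
        [(make_clip(False), False) for _ in range(10)]
random.Random(0).shuffle(clips)

latencies = [detect(clip) - 4 for clip, label in clips if label]  # vs. onset at 4 s
false_alarms = sum(detect(clip) is not None for clip, label in clips if not label)
print("mean detection latency (s):", sum(latencies) / len(latencies))
print("false alarms:", false_alarms)
```

The trade-off the text describes falls out naturally: a lower threshold or shorter window shortens the latency but raises the false-alarm count on the nonseizure clips.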
Other investigators who have collaborated with Dr. Litt may have come across a good method for validating the performance of algorithms that are designed to predict seizures.
This method also may have yielded the first evidence of EEG patterns marking a definitive preictal period (J. Neurophysiol. 2006 Oct. 4 [Epub DOI:10.1152/jn.00190.2006]).
Pinpointing the location of seizures has benefited from research using high-frequency EEG.
High-frequency EEG readings were not recognized as clinically significant until recent studies showed that the characteristic waveform flattening, or “electro-decrement,” seen on intracranial EEG before a seizure is actually high-frequency activity. That activity had been filtered out because intracranial EEG systems were calibrated to the filter settings of the pen-and-paper EEG machines of the 1950s, Dr. Litt said.
For many seizures, a rise in high-frequency epileptiform oscillations can indicate an impending seizure 40 minutes in advance (Brain 2004;127:1496–506).
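One way to operationalize such a precursor, sketched on synthetic data (the filter, rates, and thresholds below are invented for illustration and are not the cited study's method): track power in a high-frequency band and raise an alarm when it climbs well above its own baseline.

```python
import numpy as np

rng = np.random.default_rng(1)
FS = 500      # sampling rate in Hz (illustrative)
WINDOW = FS   # 1-second analysis windows

def highpass(signal, alpha=0.73):
    """Crude first-order high-pass filter to isolate fast activity."""
    out = np.empty_like(signal)
    prev_x = prev_y = 0.0
    for i, x in enumerate(signal):
        prev_y = alpha * (prev_y + x - prev_x)
        prev_x = x
        out[i] = prev_y
    return out

def hf_power(signal):
    """Mean power of the high-passed signal in consecutive 1-s windows."""
    hp = highpass(signal)
    n = len(hp) // WINDOW
    return np.array([np.mean(hp[i * WINDOW:(i + 1) * WINDOW] ** 2) for i in range(n)])

# Synthetic 60-s trace: slow background, then added fast oscillations at 30 s.
t = np.arange(60 * FS) / FS
trace = np.sin(2 * np.pi * 8 * t) + 0.1 * rng.standard_normal(t.size)
trace[30 * FS:] += 0.8 * np.sin(2 * np.pi * 90 * t[30 * FS:])  # "HFO" onset

power = hf_power(trace)
baseline = power[:30].mean() + 5 * power[:30].std()  # threshold from quiet period
alarm_windows = np.nonzero(power > baseline)[0]
print("first alarm at second:", alarm_windows[0] if alarm_windows.size else None)
```

In this sketch the high-frequency band power jumps well above the baseline threshold as soon as the fast oscillations begin, which is the qualitative behavior the 40-minute warning finding relies on.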
Investigations of the density of these high-frequency epileptiform oscillations during a period of time around specific electrodes in the brain have helped to map the distribution of nodes that are “heating up” before seizure onset, he explained.
These maps have suggested that the focal point of a seizure is not really like a single point, as was previously thought, but is “more like a cloud. It's areas that are buzzing and trying to initiate synchrony that seem to be going from one place to the other to generate the seizure, and which ones actually start the seizure may vary,” he said.
To prove that this cascade of events exists, the investigators have built detectors that can quantitatively detect seizures in large chunks of data. When seizure and nonseizure events are mixed up and randomized, the two events can be distinguished with a certain latency, which increases as the likelihood of correctly predicting a seizure event increases, he said.
Other investigators who have collaborated with Dr. Litt may have come across a good method for validating the performance of algorithms that are designed to predict seizures.
This method also may have discovered the first evidence for the EEG patterns of a definitive preictal period (J. Neurophysiol. 2006 Oct. 4 [Epub DOI:10.1152/jn.00190.2006]).
Pinpointing the location of seizures has benefited from research using high-frequency EEG.
High-frequency EEG readings were not recognized as clinically significant until recent studies showed that the characteristic waveform flattening, or “electro-decrement,” of intracranial EEG before a seizure is actually high-frequency activity that was filtered out by intracranial EEGs that were calibrated to filter settings of pen and paper EEG machines from the 1950s, Dr. Litt said.
For many seizures, a rise in high-frequency epileptiform oscillations can indicate an impending seizure 40 minutes in advance (Brain 2004;127:1496–506).
Investigations of the density of these high-frequency epileptiform oscillations during a period of time around specific electrodes in the brain have helped to map the distribution of nodes that are “heating up” before seizure onset, he explained.
These maps have suggested that the focal point of a seizure is not really like a single point, as was previously thought, but is “more like a cloud. It's areas that are buzzing and trying to initiate synchrony that seem to be going from one place to the other to generate the seizure, and which ones actually start the seizure may vary,” he said.
CHICAGO – Ongoing clinical trials for two implanted devices designed to interrupt or predict seizures herald an area of clinical research that has quickly gained ground during the last 5 years, Dr. Brian Litt said at the annual meeting of the American Neurological Association.
Research into seizure prediction, most of which has occurred in the past 15 years, has been “very controversial,” mostly because of people getting too excited about findings very early on, said Dr. Litt of the departments of neurology and bioengineering at the University of Pennsylvania, Philadelphia.
Early studies were plagued by overreliance on abstract functions rather than on clinical physiological parameters, and they lacked statistical rigor. In addition, the databases were biased toward seizures because much of the data came from inpatients who had many seizures during hospital stays.
“Those data are not what it's like to live with epilepsy; you might have one seizure a month, you might have four a month. But clearly the preponderance of the data is interictal,” he said.
A data set heavily enriched with seizures makes it much more likely that attempts to predict seizures at broad intervals will, in fact, detect a seizure. This made it impossible to reproduce the claims of seizure prediction that were announced in early studies.
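This base-rate effect can be illustrated with a toy simulation (all numbers below are hypothetical, not from the studies Dr. Litt described): a detector that fires entirely at random appears far more "predictive" when tested on seizure-enriched inpatient recordings than on realistic data in which seizures are rare.

```python
import random

random.seed(0)

def alarm_hit_rate(seizure_fraction, n_hours=10_000, alarm_prob=0.1):
    """Fraction of random alarms that happen to land on a seizure hour.

    Each hour independently contains a seizure with probability
    `seizure_fraction`; an alarm fires at random with probability
    `alarm_prob`, carrying no real predictive information.
    """
    hits = alarms = 0
    for _ in range(n_hours):
        is_seizure_hour = random.random() < seizure_fraction
        if random.random() < alarm_prob:
            alarms += 1
            if is_seizure_hour:
                hits += 1
    return hits / alarms

# Seizure-enriched inpatient-style data vs. realistic interictal data
enriched = alarm_hit_rate(seizure_fraction=0.30)    # ~30% of hours contain a seizure
realistic = alarm_hit_rate(seizure_fraction=0.005)  # ~1 seizure hour per 200 hours
```

Because the random alarm carries no information, its apparent hit rate simply tracks the seizure base rate of the data set, so the same algorithm looks dramatically better on enriched data, which is the reproducibility trap the early studies fell into.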
“We also found that listening to patients was really important,” Dr. Litt said, because many patients tell their physicians that hours or days before a seizure onset, they sometimes have a feeling, or prodrome, that tells them they are likely to have a seizure. These prodromes may or may not be followed by a seizure.
“The model [for predicting seizures] has to account for this,” he said.
These lessons taught Dr. Litt and his colleagues that predicting an exact seizure was very unlikely, but that they could probably identify periods in which the probability of a seizure is greatly increased.
No efficacy data are yet available for the two devices that are being tested in phase III trials, but no adverse events have occurred.
In Medtronic Inc.'s Stimulation of the Anterior Nucleus of the Thalamus for Epilepsy (SANTE) trial, about 150 adult patients with medically refractory partial-onset epilepsy will receive the Intercept Epilepsy Control System.
The implanted device, which bilaterally stimulates the anterior nucleus of the thalamus but does not sense or respond to EEG activity, will be turned on in some patients but not in others during the trial's double-blind phase. Medtronic decided to continue the SANTE trial after it recently passed its midterm analysis, according to Dr. Litt.
The Responsive Neurostimulator system from NeuroPace Inc. will be tested in about 240 adult patients to determine if it can reduce the frequency of medically uncontrolled and disabling partial-onset seizures.
All of the patients will be implanted with the device, which scans EEG recordings for particular patterns associated with seizure onset or impending seizures, and then stimulates epileptogenic foci through intracranial electrodes. Only some patients will have the device turned on during the double-blind phase of the randomized trial.
In a safety study of about 50 patients with more than four seizures per month who were implanted with the Responsive Neurostimulator, 43% of those with complex partial seizures and 35% of those with disabling motor seizures had a 50% or greater reduction in seizures, Dr. Litt said.
“Is this a home run? No. Does it mean that it's effective? No. Does it mean that there's proof of principle enough to perhaps go forward? I think it does,” Dr. Litt said.
“Remember, this is a first-generation device. Judge this as a work in progress, like the first pacemaker,” he added.
Dr. Litt has contributed patents through the University of Pennsylvania for NeuroPace's Responsive Neurostimulator device. He is a consultant to BioNeuronics Corp., and he helped to found BioQuantix Corp. through the University of Pennsylvania.
Major questions still remain in understanding and mapping epileptic networks in the brain, such as where to place electrodes, where to sense seizure onset, and where to stimulate the brain. Researchers also want to know how seizures are generated over time.
To answer these questions, Dr. Litt and his associates have examined seizures in patients with Responsive Neurostimulator devices, which save about a minute of data prior to stimulation and also for a short period afterward.
Analyses of the 2-second period before seizure onset in thousands of events distinguished effective from ineffective types of stimulation. For particular stereotyped seizure onsets, the researchers used specific characteristics of synchrony, frequency of activity, and the relationship between the stimulus and the seizure waveform to determine whether stimulation would be effective.
“The bottom line is that seizures in which stimulation is not effective are ones that are likely more evolved or perhaps began in a different place in the network and spread to these regions before the stimulation occurred,” he said.
Although Dr. Litt's model for seizure generation has not been statistically proven, his group's research suggests that seizures “may occur in a reproducible cascade of events” in which there are periods of increased complex epileptiform activity in the hours or days before a seizure, followed in the 2 hours before the seizure by short seizurelike bursts of activity, or “seizlets,” that last 1–5 seconds. These seizlets appear to build exponentially as the seizure approaches and activity ramps up.
To prove that this cascade of events exists, the investigators have built detectors that can quantitatively identify seizures in large chunks of data. When seizure and nonseizure events are shuffled together and randomized, the two can be distinguished with a certain latency, which increases as the likelihood of correctly predicting a seizure event increases, he said.
Other investigators who have collaborated with Dr. Litt may have developed a sound method for validating the performance of algorithms designed to predict seizures. The method also may have yielded the first evidence of EEG patterns marking a definitive preictal period (J. Neurophysiol. 2006 Oct. 4 [Epub DOI:10.1152/jn.00190.2006]).
Pinpointing the location of seizures has benefited from research using high-frequency EEG.
High-frequency EEG readings were not recognized as clinically significant until recent studies showed that the characteristic waveform flattening, or “electrodecrement,” seen on intracranial EEG before a seizure is actually high-frequency activity that had been filtered out, because intracranial EEG systems were calibrated to the filter settings of pen-and-paper EEG machines from the 1950s, Dr. Litt said.
For many seizures, a rise in high-frequency epileptiform oscillations can indicate an impending seizure 40 minutes in advance (Brain 2004;127:1496–506).
Investigations of the density of these high-frequency epileptiform oscillations during a period of time around specific electrodes in the brain have helped to map the distribution of nodes that are “heating up” before seizure onset, he explained.
These maps have suggested that the focal point of a seizure is not really like a single point, as was previously thought, but is “more like a cloud. It's areas that are buzzing and trying to initiate synchrony that seem to be going from one place to the other to generate the seizure, and which ones actually start the seizure may vary,” he said.