Multidetector CT Accurate in 'Real-World' Patients

WASHINGTON — Multidetector CT angiography appears to be very accurate in diagnosing coronary artery disease even in less-than-ideal patients, according to data presented at the annual meeting of the Society of Cardiovascular Computed Tomography.

While published studies have shown impressive diagnostic sensitivity and specificity for 64-slice CT in the assessment of coronary artery disease (CAD), patients with irregular heartbeats or allergies to β-blockers have tended to be excluded. In addition, patients with histories of coronary disease or those with high calcium scores sometimes were excluded.

“MDCT studies that have been published … have been highly selective of all the patients they have picked to determine diagnostic accuracy of CT,” said Dr. Amgad N. Makaryus, a cardiologist at North Shore University Hospital in Manhasset, New York. He and his colleagues evaluated the accuracy of 64-detector scanning against invasive coronary angiography in a real-world population at the hospital, a large tertiary care center that serves as a referral center for hospitals on Long Island. About 10,000 cardiac catheterizations and 4,000–5,000 single-photon emission computed tomography myocardial perfusion studies are performed there each year.

The study involved 1,818 consecutive patients who underwent 64-detector coronary CT. β-Blockers were used whenever possible; calcium channel blockers were substituted in patients with contraindications to β-blockers. The imaging protocol involved an 8- to 10-second breath hold with a 5- to 7-second image-acquisition time.

Overall, 17% of patients had a history of coronary disease, and 10% had a history of atrial fibrillation or flutter. The mean heart rate during CT studies was about 58 beats per minute. Chest pain and abnormal stress test results were the most common indications.

Specifically, the researchers assessed those patients who underwent invasive angiography based on their MDCT results. A total of 41 patients were referred for coronary angiography, yielding 164 coronary arteries (410 coronary segments) for comparison. The mean age was 62 years (range, 39–85 years), and the population was almost three-quarters male (73%). Stenosis of greater than 50% was considered significant.

On a per-vessel basis, the sensitivity of MDCT was 86% and specificity was 84%. The positive predictive value was 65%, and the negative predictive value was 85%.

“Still we have this very high negative predictive value as has been seen in many of the prior studies,” said Dr. Makaryus. On a per-segment basis, the sensitivity of MDCT was 77% and specificity was 93%. The positive predictive value was 61%, and the negative predictive value was 97%.
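For reference, the four figures above all derive from a standard 2×2 confusion matrix. The sketch below uses hypothetical vessel counts, not the study's raw data, to show how each is computed; note that the predictive values, unlike sensitivity and specificity, shift with how common disease is in the sample.

```python
# Sketch: diagnostic accuracy metrics from a 2x2 confusion matrix.
# The counts passed in below are hypothetical round numbers, not the
# study's per-vessel data.
def diagnostic_metrics(tp, fp, fn, tn):
    """Return (sensitivity, specificity, PPV, NPV) as fractions."""
    sensitivity = tp / (tp + fn)  # diseased vessels correctly flagged
    specificity = tn / (tn + fp)  # disease-free vessels correctly cleared
    ppv = tp / (tp + fp)          # probability a positive CT finding is real
    npv = tn / (tn + fn)          # probability a negative CT finding is real
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = diagnostic_metrics(tp=86, fp=16, fn=14, tn=84)
# sens = 0.86 and spec = 0.84 here; the PPV and NPV differ from the
# study's 65% and 85% because predictive values depend on prevalence.
```

This prevalence dependence is why the study's modest 65% PPV can coexist with the high sensitivity and specificity quoted above.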

Calcium is a particular problem in CT angiography. Calcified plaques appear enlarged (or bloomed) because of partial-volume averaging effects and obscure the adjacent coronary lumen. This effect can lead to false-positive results because the degree of stenosis is overestimated.

The mean calcium score in this group was 789. “More of the patients that had higher calcium scores actually had a disagreement between their CTA result and the invasive coronary angiogram,” he said. “Our false positives tended to be those patients who had higher calcium scores and this neared statistical significance.”

Dr. Makaryus, who disclosed that he had no significant conflicts of interest, was a postdoctoral clinical cardiovascular imaging fellow at New York-Presbyterian Hospital, New York, during 2006–2007.

ELSEVIER GLOBAL MEDICAL NEWS

Lab Tests, History Catch Secondary Osteoporosis

WASHINGTON — A careful evaluation and thorough history can identify a large portion of patients with secondary osteoporosis, Dr. Meryl LeBoff, director of the skeletal health and osteoporosis center and bone density unit at Brigham and Women's Hospital in Boston, said at an international symposium sponsored by the National Osteoporosis Foundation.

The true prevalence of secondary osteoporosis is not known. However, about 50% of cases can be identified with a good medical history, Dr. LeBoff said. Although laboratory evaluations vary, such tests can identify 25%–65% of patients with secondary osteoporosis.

Identifying secondary osteoporosis is important, because skeletal changes may be reversible and decreased acquisition of peak bone mass is a determinant of osteoporosis later in life.

In a 2004 report on bone health and osteoporosis, the surgeon general recommended that all patients who are diagnosed with osteoporosis get at least a limited evaluation for secondary causes of bone loss. However, patients with a low z score are most in need of in-depth evaluation for secondary osteoporosis.

In particular, premenopausal women or men with unexplained fractures and those who are adherent but have a poor response to therapy should be evaluated for secondary osteoporosis.

A low z score—which compares a patient's bone mineral density (BMD) to the mean for a healthy age- and gender-matched population—may suggest an increased likelihood of secondary osteoporosis. A z score of −1.0 is associated with a twofold greater lifetime risk of fracture and a z score of −2.0 is associated with a fourfold greater lifetime risk of fracture.
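In code, the z score and the quoted risk relationship look like this. The BMD figures are hypothetical, and the risk function simply encodes the doubling-per-standard-deviation rule cited above; it is a sketch, not a validated risk model.

```python
def z_score(bmd, ref_mean, ref_sd):
    """Patient BMD relative to an age- and gender-matched reference mean."""
    return (bmd - ref_mean) / ref_sd

def relative_lifetime_fracture_risk(z):
    """Encodes the quoted rule: each SD below the mean doubles lifetime risk."""
    return 2.0 ** max(0.0, -z)

# Hypothetical values: a BMD of 0.85 against a reference mean of 1.00
# with SD 0.10 gives z = -1.5.
z = z_score(bmd=0.85, ref_mean=1.00, ref_sd=0.10)
# relative_lifetime_fracture_risk(-1.0) -> 2.0 and (-2.0) -> 4.0,
# matching the twofold and fourfold figures above.
```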

“However, z scores do not consistently predict which patient has an underlying disorder, so it's important to use clinical judgment in the evaluation of a particular patient,” Dr. LeBoff said.

There are no evidence-based guidelines for evaluating a patient for secondary osteoporosis. Dr. LeBoff recommends taking a detailed personal and family history, being sure to ask about calcium intake. In addition to a thorough physical exam, do bone density testing and laboratory tests.

Laboratory tests for serum calcium, 25-hydroxy vitamin D, 24-hour urinary calcium, and parathyroid hormone—plus serum thyroid-stimulating hormone among women on thyroid replacement—can identify an estimated 98% of patients with secondary osteoporosis (J. Clin. Endocrinol. Metab. 2002;87:4431-7).

At the Brigham and Women's osteoporosis center, guidelines for evaluation of secondary osteoporosis include a z score less than −1.5. Laboratory tests include serum calcium and phosphorus, renal function, 25-hydroxy vitamin D levels, thyroid-stimulating hormone, parathyroid hormone, and urinary calcium. In select patients, markers of bone turnover may be tested.

Dr. LeBoff also discussed some of the common causes of secondary osteoporosis:

Glucocorticoids. “Use of glucocorticoids is the most common cause of secondary osteoporosis,” said Dr. LeBoff. A number of other endocrine abnormalities—thyroid hormone excess, hypogonadism, anorexia, hyperparathyroidism, hypercalciuria, vitamin D deficiency, and androgen insensitivity—can also cause secondary osteoporosis.

Glucocorticoids increase fracture risk progressively. “Even extremely low doses of inhaled glucocorticoids can lead to bone loss,” Dr. LeBoff said.

The pathophysiology of glucocorticoid-induced osteoporosis is multifactorial, involving decreased osteoblast function, increased osteoblast apoptosis, decreased gastrointestinal absorption of calcium, increased urinary calcium excretion, and increased osteoclastic bone resorption.

Anorexia. This disorder affects an estimated 4% of U.S. college students. Anorexia leads to a 25% lower spine BMD and a sevenfold increased fracture risk. Peak bone mass is decreased, and there may be a permanent deficit of bone mass in these young women. Anorexic women have subnormal levels of dehydroepiandrosterone, testosterone, estrogen, and cortisol. “Estrogen does not correct the low bone mass [in these women],” Dr. LeBoff said. A number of trials are underway looking at ways to reverse decreased bone mass in anorexic women.

Vitamin D deficiency. Vitamin D deficiency is common and has been implicated in impaired muscle function, increased falls, increased muscle pain, multiple sclerosis, and some malignancies. There is seasonal variation in vitamin D levels and notably, vitamin D activation decreases with age, darker skin pigment, and increased sunblock use. Gastrointestinal disorders also can lead to vitamin D deficiency, as it is absorbed in the small intestine.

Levels of vitamin D sufficiency and deficiency have been a matter of some debate. “Vitamin D deficiency is currently defined as 25-hydroxy vitamin D level of less than 20 ng/mL … sufficiency for bone is [25-hydroxy vitamin D level] greater than 30–32 ng/mL,” Dr. LeBoff said.
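Those thresholds translate directly into a simple classification. In the sketch below, the "insufficient" label for the 20–30 ng/mL band between deficiency and bone sufficiency is our assumption, not Dr. LeBoff's wording.

```python
def classify_25ohd(level_ng_ml):
    """Categorize a serum 25-hydroxy vitamin D level (ng/mL) using the
    thresholds quoted above. The 'insufficient' label for the middle
    band is an assumption for illustration."""
    if level_ng_ml < 20:
        return "deficient"
    elif level_ng_ml < 30:
        return "insufficient"
    return "sufficient"

# classify_25ohd(15) -> "deficient"; classify_25ohd(35) -> "sufficient"
```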

Inadequate levels of vitamin D have been documented in 52% of women who participated in osteoporosis trials. Women in these studies had an average T score of −1.8.

In a study based at Brigham and Women's, 90% of women admitted with hip fractures had vitamin D insufficiency and 57% had outright deficiency. As a result, women admitted with a hip fragility fracture are now given 50,000 units of vitamin D and are also evaluated for secondary osteoporosis.

Aromatase inhibitors. “Bone loss is clearly associated with breast cancer therapies,” Dr. LeBoff said. Aromatase inhibitors can lead to bone loss of about 2.6% per year, though long-term data are not yet available. Gonadotropin-releasing hormones can lead to bone loss of 4%–6% per year. Ovarian failure can lead to bone loss of about 8% per year. Oophorectomy is associated with bone loss of 11% per year.
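To put these annual rates in perspective, compounding them gives the multi-year picture. The sketch below assumes each year's percentage applies to the BMD remaining at the start of that year, which is a modeling simplification.

```python
def remaining_bmd_fraction(annual_loss_pct, years):
    """Fraction of baseline BMD left after compounding the annual loss."""
    return (1 - annual_loss_pct / 100) ** years

# Under this assumption, ~2.6%/year on an aromatase inhibitor compounds
# to roughly a 12% total loss over 5 years, while ~8%/year with ovarian
# failure compounds to about a third of baseline BMD lost in the same span.
five_yr_ai = remaining_bmd_fraction(2.6, 5)
five_yr_ovarian = remaining_bmd_fraction(8.0, 5)
```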

Image of the Month

To evaluate the level of atrophy in the brain—which presumably is a surrogate marker for the underlying pathology—researchers have focused on a few regions, such as the hippocampus and entorhinal cortex. These regions are known from histopathologic studies to be affected by Alzheimer disease (AD).

In part, studies have been limited to a few areas because it can be cumbersome and time consuming to outline these regions manually on images. However, “automated computer analysis methods have the ability to go beyond that and look at many different regions together,” said Christos Davatzikos, Ph.D., director of the section of biomedical image analysis in the department of radiology at the University of Pennsylvania, Philadelphia.

To evaluate the level of atrophy in the brain—which presumably is a surrogate marker for the underlying pathology—researchers have focused on a few regions, such as the hippocampus and entorhinal cortex. These regions are known from histopathologic studies to be affected by Alzheimer disease (AD).

In part, studies have been limited to a few areas because it can be cumbersome and time consuming to outline these regions manually on images. However, “automated computer analysis methods have the ability to go beyond that and look at many different regions together,” said Christos Davatzikos, Ph.D., director of the section of biomedical image analysis in the department of radiology at the University of Pennsylvania, Philadelphia.

Dr. Davatzikos and Susan M. Resnick, Ph.D., of the National Institute on Aging's laboratory of personality and cognition, along with their colleagues, recently studied 15 elderly individuals with mild cognitive impairment (MCI) and 15 healthy individuals from the Baltimore Longitudinal Study of Aging's neuroimaging substudy. Although the 15 case patients were free of dementia at initial enrollment, they developed MCI over the course of up to 9 years (Neurobiol. Aging 2006 doi:10.1016/j.neurobiolaging.2006.11.010).

Participants in the Baltimore Longitudinal Study of Aging are screened yearly, using a number of tests of mental status and cognitive function.

Diagnosis of MCI was made by consensus, based on the results of assessments including the Clinical Dementia Rating scale.

The progression from MCI to AD appears to be a complex process that involves many brain regions. The idea behind this approach was to avoid a priori assumptions about which regions are affected by AD and instead look at the entire brain.

“The computer essentially evaluates every region in the brain and gets a number of how much gray matter there is locally in that region,” said Dr. Davatzikos. In the next level of analysis, the computer evaluates whether a given combination of these numbers indicates a spatial pattern that is suggestive of MCI—strictly from the perspective of brain structure.

In patients with MCI, the researchers used the MR brain images immediately prior to the diagnosis of dementia, or else the most recent scans for those who had not progressed to dementia; MR brain images used for healthy patients were selected to match the two groups on age and sex. Then the researchers used the MR brain images of patients with MCI to teach the computer what the spatial distribution of gray and white matter looks like in individuals with MCI, said Dr. Davatzikos.

When evaluating a new individual, the computer compares the spatial distribution of gray and white matter of that individual with the patterns of MCI and healthy controls. The computer then determines whether the brain pattern of the new individual more closely resembles that of the MCI patients or of normal individuals.
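As a rough picture of this comparison, a nearest-centroid check on regional gray-matter volumes might look like the sketch below. This is purely illustrative: the study used a sophisticated high-dimensional pattern classifier, not this toy rule, and every number here is invented.

```python
# Illustrative sketch only: the published method used a high-dimensional
# pattern classifier; this toy nearest-centroid comparison just shows the
# idea of matching a new scan's regional gray-matter profile against the
# MCI and control patterns. All volumes below are made up.

def centroid(profiles):
    """Average regional gray-matter volumes across a group of scans."""
    n = len(profiles)
    return [sum(p[i] for p in profiles) / n for i in range(len(profiles[0]))]

def classify(new_profile, mci_profiles, control_profiles):
    """Return 'MCI' if the new profile lies closer to the MCI centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    d_mci = dist(new_profile, centroid(mci_profiles))
    d_ctl = dist(new_profile, centroid(control_profiles))
    return "MCI" if d_mci < d_ctl else "normal"

# Hypothetical regional volumes (e.g., hippocampus, entorhinal cortex, ...)
mci_group = [[2.1, 1.0, 3.0], [2.0, 0.9, 2.9]]
control_group = [[2.8, 1.5, 3.4], [2.9, 1.4, 3.5]]
print(classify([2.2, 1.0, 3.1], mci_group, control_group))  # prints MCI
```

The real classifier weighs many regions jointly rather than measuring raw distance, but the decision it makes is of this form: which group's spatial pattern does the new scan resemble more?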

“So basically we took the most recent scans and we said, 'Can you train the computer to recognize the spatial patterns of atrophy that are highly characteristic of MCI?' and we found—with approximately 90% accuracy—that we could do that,” said Dr. Davatzikos.

The researchers used the most recent scans to develop the model but then were able to apply it to previous scans and follow brain pattern changes in these individuals longitudinally.

An abnormality score was developed for each individual based on regional tissue distribution and volumetric measurements of specific brain regions.

A positive value (up to 1) indicates a structural pattern resembling MCI, while a negative value (as low as −1) indicates brain structure in unimpaired individuals.

In the most recent scans, those with MCI had an average abnormality score of 0.26, while those without MCI had an average score of −0.30. In the scans closest to the time of conversion to MCI status (in the eight patients who converted during the study), the average abnormality score was 0.15.

“On the average, they seem to be halfway between zero—the dividing line—and MCI, and were certainly much closer to MCI than normal individuals,” said Dr. Davatzikos.

This indicates that in the year of conversion, the patients who progressed to MCI were already well into the range of abnormal brain structure.
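One simple way to realize a score with exactly this behavior (positive up to 1 for MCI-like structure, negative down to −1 for normal structure) is a normalized contrast of distances to the two group patterns. The formula below is an assumed stand-in for illustration, chosen only because it naturally falls in the [−1, 1] range; the study derived its score from the trained classifier and volumetric measurements.

```python
# Hedged sketch of an abnormality score in [-1, 1], matching the sign
# convention described in the article: positive = MCI-like structure,
# negative = normal structure. The distance-contrast formula and the
# regional volumes are illustrative assumptions, not the study's method.

def abnormality_score(profile, mci_pattern, normal_pattern):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    d_mci = dist(profile, mci_pattern)
    d_ctl = dist(profile, normal_pattern)
    # Closer to the MCI pattern -> positive; closer to normal -> negative.
    # The ratio is bounded by [-1, 1] because both distances are nonnegative.
    return (d_ctl - d_mci) / (d_ctl + d_mci)

# Hypothetical regional gray-matter volumes for the two group patterns.
mci_pattern = [2.0, 1.0, 3.0]
normal_pattern = [3.0, 1.5, 3.5]

print(abnormality_score([2.1, 1.0, 3.0], mci_pattern, normal_pattern) > 0)  # True
print(abnormality_score([2.9, 1.5, 3.4], mci_pattern, normal_pattern) < 0)  # True
```

Averaging such per-individual scores within each group is what produces group means like the 0.26 and −0.30 reported above.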

Most of the usual suspects—regions such as the hippocampus, entorhinal cortex, lateral and inferior temporal structures, and anterior and posterior cingulate that have already been identified as playing a role in AD—proved to be important regions in the MCI brain pattern that the researchers developed.

However, some regions known not to be involved in AD, such as occipital cortex, were used by the computer, presumably as normalization factors.

“I think that most of the regions that we found were not actually that surprising,” commented Dr. Davatzikos. “The combination of all of these [brain regions] was really what gave the diagnostic accuracy for individuals.”

It's also important that “the regions that we found that were involved had been identified in group analyses,” said Dr. Resnick.

Dr. Resnick noted that one of the study's strengths is how early patients' mild cognitive impairment was detected, showing the tool's potential for very early diagnosis.

“The people [whose conditions] we're calling 'mild cognitive impairment' in this sample would not really have come to clinical attention,” she said.

Such a tool could be especially useful because clinicians typically don't have serial data on a patient's cognition. Rather, a patient usually comes into the office with a complaint about memory, and the clinician has to determine whether this is a result of normal aging or a more pathologic process.

The ability to use an assessment of brain structure to help determine MCI could be particularly important for high-functioning individuals, who may have suffered significant cognitive declines by the time they meet clinical criteria for impairment, said Dr. Resnick.

In fact, one participant in this study was considered cognitively normal by clinical measure at the time of the most recent scan. However, when previous scans were evaluated using this method, the patient's abnormality scores rose over time. This patient subsequently died and autopsy revealed moderate AD pathology.

“Although we only evaluated the autopsy results of this one patient who seemed to be an outlier, it shows that the patterns on MRI were more in agreement with the underlying pathology than with the clinical status of the patient,” said Dr. Davatzikos.

The study is limited by the small sample size.

So is this new technique going to solve the problem of predicting which individuals will eventually develop MCI and progress to Alzheimer disease? Probably not. It's unlikely that any one tool or test is going to be able to definitively predict which individuals will develop Alzheimer disease.

“We're talking about risk factors here,” said Dr. Davatzikos. “It's like cardiac disease. … It's a collection of information that the clinician then has to evaluate [to] make a decision.”

Still, each additional piece of information that can help a clinician identify individuals potentially on the road to Alzheimer disease as early as possible will be essential for making treatment decisions, especially should drugs that halt disease progression become available.

The above image shows the regions in which brain atrophy was evaluated by the pattern classifier to get an abnormality score. The color-coding reflects how much each region contributed to the discrimination between mild cognitive impairment and cognitively normal individuals. Larger numbers indicate a larger contribution. Courtesy Dr. Christos Davatzikos and Dr. Susan M. Resnick

Image of the Month

Article Type
Changed
Display Headline
Image of the Month

An urgent brain CT was ordered but appeared normal.

A cervical spine radiograph showed an increased atlantoaxial (C1-C2) distance of 5 mm.

However, MRI showed a septic arthritis from C1-C2 with enhancement of the dura.

There was no evidence of bony destruction or spinal cord compression.

Although any infectious agent may cause arthritis, bacterial pathogens are the most significant because of their rapidly destructive nature.

For this reason, the current discussion concentrates on bacterial septic arthritides.

Failure to recognize and to appropriately treat septic arthritis significantly increases morbidity and may even lead to death.

According to Dr. Sarah Westlake, a rheumatology specialist registrar at Queen Alexandra Hospital in Portsmouth, England, only two previous cases of C1-C2 septic arthritis have been described.

“As in our patient, early radiograph features of prevertebral soft tissue swelling can be very subtle and bony destruction of septic arthritis or endplate destruction of diskitis can take 2-8 weeks to evolve,” she said.

Cervical septic arthritis or diskitis can be life threatening because of the heightened risk of cervical spine subluxation and medullary compression.

“It should therefore be considered in any patient with sepsis and severe neck pain, even with normal cervical spine radiographs,” she explained.

MRI and blood cultures are the diagnostic tests of choice.

However, if the blood cultures are negative, diskovertebral biopsy for diskitis or joint aspiration for septic arthritis could be considered by a suitably trained radiologist, said Dr. Westlake.

Blood cultures were performed on this patient on three separate occasions; the cultures grew methicillin-resistant Staphylococcus aureus.

“S. aureus is the most common organism causing nongonococcal arthritis. The virulence of S. aureus is associated with its ability to attach to host tissue within the joint, evade host defenses, and cause damage to the joint,” according to Kelley's Textbook of Rheumatology, 7th edition.

The patient was treated with a 6-week course of vancomycin and an additional 6 weeks of rifampicin and doxycycline.

No neurologic complications occurred at any time.

Septic arthritis from C1-C2 is seen (arrow), with enhancement of the dura. Courtesy Dr. R. Hull/Dr. S. Westlake/Dr. F. Witham

MDCT Has High Accuracy in Real-World Setting

Article Type
Changed
Display Headline
MDCT Has High Accuracy in Real-World Setting

WASHINGTON — Multidetector CT angiography appears to be very accurate in diagnosing coronary artery disease even in less-than-ideal patients, according to data presented at the annual meeting of the Society of Cardiovascular Computed Tomography.

While published studies have shown impressive diagnostic sensitivity and specificity for 64-slice CT in the assessment of coronary artery disease (CAD), patients with irregular heartbeats or allergies to β-blockers have tended to be excluded. In addition, patients with histories of coronary disease or those with high calcium scores sometimes were excluded.

“MDCT studies that have been published … have been highly selective of all the patients they have picked in order to determine the diagnostic accuracy of CT,” said Dr. Amgad N. Makaryus, a cardiologist at North Shore University Hospital in Manhasset, New York.

Dr. Makaryus and his colleagues evaluated the accuracy of 64-detector scanning compared with coronary angiography in a real-world population at North Shore University Hospital, a large tertiary care center.

The facility is a referral center for hospitals on Long Island. Roughly 10,000 cardiac catheterizations are performed there yearly. In addition, 4,000–5,000 single-photon emission computed tomography myocardial perfusion studies are performed annually.

The study involved 1,818 consecutive patients who underwent coronary CT (64-detector). β-Blockers were used as much as possible.

Calcium channel blockers were used in patients who had contraindications to β-blockers.

The imaging protocol involved an 8- to 10-second breath hold with a 5- to 7-second image-acquisition time.

Overall, 17% of patients had a history of coronary disease; 10% had a history of atrial fibrillation or flutter. The mean heart rate during CT studies was roughly 58 beats per minute. The two most common indications were chest pain and abnormal stress test.

Specifically, the researchers assessed the patients who underwent invasive angiography on the basis of their MDCT results. A total of 41 patients were referred for coronary angiography, yielding 164 coronary arteries (410 coronary segments) for comparison.

The mean patient age was 62 years (range 39–85 years) and the population was almost three-quarters male (73%). Stenosis of greater than 50% was considered significant.

On a per-vessel basis, the sensitivity of MDCT was 86% and specificity was 84%. The positive predictive value was 65% and the negative predictive value was 85%.

“Still we have this very high negative predictive value as has been seen in many of the prior studies,” commented Dr. Makaryus. On a per-segment basis, the sensitivity of MDCT was 77% and specificity was 93%.

The positive predictive value was 61% and the negative predictive value was 97%.
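All four measures derive mechanically from a 2×2 table of CT calls against invasive angiography, the reference standard. The sketch below uses invented counts; they happen to echo the per-vessel sensitivity and specificity above, but they are not the study's data and the other values will differ.

```python
# How sensitivity, specificity, PPV, and NPV fall out of a 2x2 table of
# CT results vs. invasive angiography (the reference standard). The
# counts are hypothetical; the study reported only derived percentages.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # diseased vessels correctly flagged
        "specificity": tn / (tn + fp),  # disease-free vessels correctly cleared
        "ppv": tp / (tp + fp),          # positive CT calls that are truly diseased
        "npv": tn / (tn + fn),          # negative CT calls that are truly disease-free
    }

m = diagnostic_metrics(tp=43, fp=23, fn=7, tn=120)
for name, value in m.items():
    print(f"{name}: {value:.0%}")
```

Note the asymmetry the article highlights: with disease relatively uncommon in the referred vessels, even a modest specificity still yields a high negative predictive value, which is why a negative CT angiogram is so reassuring.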

Calcium is a particular problem in CT angiography. Calcified plaques appear enlarged (“bloomed”) because of partial-volume averaging effects and can obscure the adjacent coronary lumen. This can lead to false-positive results because the degree of stenosis is overestimated.

The mean calcium score in this group was 789. “More of the patients that had higher calcium scores actually had a disagreement between their CTA result and the invasive coronary angiogram,” said Dr. Makaryus. In other words, “our false positives tended to be those patients who had higher calcium scores, and this neared statistical significance [P = .059].”

Irregular heart rate and motion artifacts also can be problematic in CT angiography, he said.

Dr. Makaryus, who disclosed that he had no significant conflicts of interest, was a postdoctoral clinical cardiovascular imaging fellow at New York-Presbyterian Hospital, New York, during 2006–2007.


CT Angiography's Clinical Utility Faces Hurdles

WASHINGTON — Although the evaluation of noncalcified plaques with CT angiography currently is possible, there are still several obstacles to overcome before the technique is clinically useful, said Dr. Stephan Achenbach, a professor of medicine at the University of Erlangen in Germany, at the annual meeting of the Society of Cardiovascular Computed Tomography.

“As technology progresses, image quality gets better and better and our ability to visualize plaques gets better and better,” said Dr. Achenbach, who also is the past president of the Society of Cardiovascular Computed Tomography.

One of the criticisms of invasive angiography is that only the lumen can be seen, not the plaque itself. Contrast CT does allow for the visualization of noncalcified plaque in the coronary arteries.

“CT is able to show it—the slight lumen reduction and also noncalcified and partly calcified plaque,” said Dr. Achenbach. In fact, with high resolution, CT cross sections are similar to intravascular ultrasound (IVUS) for evaluating plaque composition. However, “this is indeed a tremendously difficult task to visualize these plaques by CT.”

One reason for this is that the plaques in the coronary arteries are extremely small. The spatial resolution of CT under optimal conditions is approximately 0.4 mm. “So we're trying to visualize something that is half a millimeter thick with a spatial resolution of 0.4 mm,” said Dr. Achenbach.

Another problem is contrast. Calcium is easy to see on CT because it has a very high contrast with the surrounding tissue. However, the contrast between noncalcified plaque and the surrounding tissue is much less. “So we have to deal with structures that give us very little contrast on CT,” said Dr. Achenbach.

Yet another challenge in using CT to visualize noncalcified plaque is the high level of image noise. Simply put, noise is the difference between the real-world signal and an ideal signal; it can arise from many sources, such as variations in detector sensitivity and in environmental conditions.

The combination of low contrast between noncalcified plaque and surrounding structures and high image noise makes it very difficult to tell whether noncalcified plaque is present.

“Motion is another problem,” said Dr. Achenbach. Patient motion can produce blurring and dark areas that can be mistaken for noncalcified plaques.

The presence of calcium also can cause problems on occasion. Even under ideal conditions, calcium can appear to be surrounded by a dark rim, and on inspection it can be unclear whether this really is noncalcified plaque. “In the presence of calcium, we have tremendous difficulty ruling in or ruling out the presence of noncalcified plaque,” said Dr. Achenbach.

Despite these problems, “if image quality is really good, we continue to be amazed by how accurately and clearly CT angiography can visualize these noncalcified plaques,” he said.

Unfortunately, there are few data on the accuracy of CTA in identifying noncalcified plaque. In the studies that have been performed, researchers compared multidetector CT (usually 16-slice) with IVUS in patients without coronary stenoses. The accuracy of multidetector CT (MDCT) in the detection of nonstenotic plaque ranged from 80% to 90%. However, many of the plaques identified were at least partly calcified. The accuracy of MDCT detection of purely noncalcified plaque was closer to 50%.

Beyond plaque characterization, can CT quantify plaque? “Theoretically, you can measure the size of the plaque and you can measure the size of the lumen,” said Dr. Achenbach.

In general, the correlation between MDCT and IVUS with regard to measuring plaque area and volume is good. “But you're not really able to very accurately measure a single coronary plaque,” said Dr. Achenbach.

In addition, interobserver variability is a problem when it comes to quantifying plaque using MDCT. In a study performed at his own institution, interobserver variability ranged from 19% to 32%, depending on the vessel.

The ultimate goal, though, is to be able to identify vulnerable plaques—those at the greatest risk of rupture.

Histologically, the markers of plaque vulnerability include a thin fibrous cap, a necrotic core, and macrophage infiltration. These markers are very hard to see on CT.

“However, there are some other measures that are also tied to plaque vulnerability and the [risk of it causing] an event in the future that might be amenable to CT,” said Dr. Achenbach.

The remodeling index—defined as the lesion external elastic membrane (EEM) area divided by the EEM area for a reference vessel—is a potential measure. “It has been shown that the remodeling index in CT correlates quite well with the remodeling index in IVUS,” said Dr. Achenbach. Strong positive remodeling has been associated with greater risk of plaque rupture.
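Because the remodeling index is a simple ratio of two measured cross-sectional areas, the calculation can be sketched in a few lines; the area values and the positive-remodeling threshold below are illustrative conventions from the IVUS literature, not measurements from the talk:

```python
def remodeling_index(lesion_eem_area, reference_eem_area):
    """Remodeling index: lesion external elastic membrane (EEM) area
    divided by the EEM area at a nearby reference site."""
    return lesion_eem_area / reference_eem_area

# A ratio above 1.0 indicates outward (positive) remodeling; a cutoff
# around 1.05 is often used in the IVUS literature (illustrative here).
ri = remodeling_index(lesion_eem_area=18.4, reference_eem_area=15.2)  # mm^2
print(round(ri, 2))
```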

It also is possible to measure the density—or attenuation—of plaque on CT. In a number of studies, it has been suggested that the lipid-rich plaques (considered more dangerous) have lower CT attenuation than do the fibrous plaques (which are more stable).

However, “measuring Hounsfield values is a little problematic if you want to differentiate a single plaque,” said Dr. Achenbach. In addition, CT density is heavily influenced by the concentration of contrast in the lumen. As the contrast concentration increases, so does density.

Dr. Achenbach disclosed that he has received grant/research support from Siemens. He also is a consultant to Bristol-Myers Squibb Co. and a member of the speakers' bureau for Siemens and Bracco Diagnostics Inc.


Pros and Cons of Continued Bisphosphonate Use

WASHINGTON — Physicians and patients need to work together to decide for or against long-term bisphosphonate treatment for osteoporosis. The body of evidence is still evolving, and there's no one-size-fits-all answer, said Dr. Sundeep Khosla, research chair of the division of endocrinology at the Mayo Clinic in Rochester, Minn.

“I think ultimately the patient has to decide with her physician. … Patient values factor into this,” said Dr. Khosla at an international symposium sponsored by the National Osteoporosis Foundation. A physician can inform a patient about the best information that is currently available in terms of fracture risk and the risk of complications. However, the patient has to decide what risk she is willing to take with regard to fracture.

Dr. Khosla discussed the pros and cons of long-term bisphosphonate use in the context of a hypothetical patient familiar to many physicians. A 60-year-old woman started on vitamin D/calcium supplements and 70 mg/week alendronate 5 years ago when her dual-energy x-ray absorptiometry (DXA) scan revealed a spine T score of −2.6 and a total hip T score of −2.0. She also has a family history of hip fracture. Her bone mineral density (BMD) has increased about 5% at the spine and 3% at the hip. She has not had any clinical fractures. She asks if she should continue with alendronate and if so, for how long.
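For reference, a DXA T score expresses a patient's measured BMD in standard deviations relative to a young-adult reference population. The sketch below uses illustrative reference values (not from the article) chosen to reproduce a spine T score of −2.6 like that of the hypothetical patient:

```python
def t_score(bmd, young_adult_mean, young_adult_sd):
    """DXA T score: patient BMD expressed in standard deviations
    relative to a young-adult reference population."""
    return (bmd - young_adult_mean) / young_adult_sd

# Illustrative lumbar-spine values in g/cm^2; the reference mean and SD
# here are hypothetical, not taken from the article.
print(round(t_score(bmd=0.761, young_adult_mean=1.047, young_adult_sd=0.110), 1))
```

By the WHO convention, a T score at or below −2.5 defines osteoporosis, which is why a patient like this one would have started therapy.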

So, should a patient who has been on alendronate for 5 years continue with therapy? In favor of continuing, it does appear that continuation will reduce the risk of clinical vertebral fractures.

Alendronate is the longest-available bisphosphonate, with 10 years of follow-up data. In one analysis of 10 years of data for postmenopausal women on varying regimens of alendronate, those on 10 mg daily had increased BMD at the spine and hip (N. Engl. J. Med. 2004;350:1189-99). Spine BMD increased by 13.7% from baseline over that period, and total hip BMD increased by 6.7%. Smaller gains were noted for women on 5 mg daily: 9.3% and 2.9% for the spine and total hip, respectively.

For women in the discontinuation group, spinal BMD leveled off (an increase of 0.3% from years 6-10), and total hip BMD declined slightly (a decrease of 1% from years 6-10). There was an initial reduction in vertebral fractures for women on alendronate, but no difference in vertebral fractures during years 6-10. However, the study was not adequately powered to assess fractures.

This study “told us that alendronate did in fact have sustained effects over 10 years on bone density and bone turnover markers,” said Dr. Khosla. However, the fracture data were inconclusive: “At best, there was no clear evidence for an increase in vertebral or nonvertebral fractures following long-term alendronate therapy.”

Other data suggest that stopping treatment for 5 years will increase the risk of nonvertebral fractures and minor vertebral deformities.

In the FLEX (Fracture Intervention Trial [FIT] Long-Term Extension) study, published late last year, researchers assessed the effects of continuing or stopping alendronate after 5 years of treatment (JAMA 2006;296:2927-38). In this study, women who had received 5 years of alendronate therapy were randomized to continue on 5 mg/day or 10 mg/day alendronate, or to stop therapy.

For women on placebo for years 5-10, total hip BMD returned to baseline levels. Women on both doses of alendronate gained and maintained a 4% increase in hip BMD over baseline during the same period. In terms of spine BMD, women on placebo during years 5-10 had a slight increase, and women on alendronate had a steeper increase.

Women who continued on alendronate for 10 years had an almost 50% reduction in clinical vertebral fractures, compared with those who stopped treatment after 5 years. There was no difference between the groups in terms of nonvertebral or morphometric vertebral fractures.

“So if you look at clinical vertebral fractures, what you see is that if the BMD was greater than −2.0, there doesn't appear to be any real benefit [to continued alendronate]. But if you have a BMD less than −2.0 or less than −2.5 … it appears that both of these subgroups benefitted from continuing alendronate for 10 years as opposed to stopping it after 5 years.”

The study provides some useful clinical answers. “It says that continuation of alendronate for 10 years does maintain bone mass and reduces bone remodeling, compared with discontinuation after 5 years,” said Dr. Khosla. Discontinuation did not increase the risk of nonvertebral fractures or x-ray-detected vertebral fractures, but the risk of clinically detected vertebral fractures was significantly increased in those who discontinued therapy after 5 years.

“For many women, stopping alendronate after 5 years for up to 5 more years does not significantly increase fracture risk, but women at high risk of vertebral fractures—such as those who already have a vertebral fracture or those [who might have] very low bone density—may benefit by continuing beyond 5 years.”

Fewer data are available for risedronate. In the Vertebral Efficacy With Risedronate Therapy-Multinational (VERT-MN) trial (Bone 2003;32:120-6), women on risedronate had continued modest increases in spine bone density and relative stabilization of femoral-neck bone density over 5 years. Women on placebo had a reduction in femoral-neck bone density and relative stabilization of spine bone density during the 2-year extension of the trial, which originally was designed to run 3 years. During the 2 years of the extension, women on risedronate had more than a 50% reduction in vertebral fractures, compared with women who stopped therapy.

Even fewer data are available for ibandronate. In a 3-year study of almost 3,000 women, the incidence of new vertebral fractures was 6% for women on oral daily ibandronate (2.5 mg), compared with 11% for women in the placebo group (Bone 2005;37:651-4).

“There are potential concerns with long-term bisphosphonate therapy,” said Dr. Khosla. One important question is whether the continued and potent inhibition of bone turnover could be harmful because of the increased mineralization of bone that has been observed in animal models.

There is also concern about the accumulation of microdamage. “Here, the thought is that because bone constantly needs to repair microcracks and microfractures, if you [inhibit] resorption for long periods of time, these microcracks will accumulate, and you can start to see a paradoxical increase in fractures in various sites because you haven't repaired the skeleton normally,” said Dr. Khosla.

Animal and human studies do show that bisphosphonate-induced inhibition of bone resorption is associated with increased bone mineralization. Increased mineralization does increase bone strength, but only up to a point, beyond which bone becomes too stiff.

However, despite the results of animal studies with high doses of bisphosphonates, there is no evidence in humans for increased accumulation of microdamage. “This is a theoretical concern,” said Dr. Khosla.

Another major concern has been the association between bisphosphonate use and jaw osteonecrosis.

“This is a very feared complication of long-term bisphosphonate therapy,” said Dr. Khosla. “This is something that is just coming to [our] attention, and we haven't quite figured out how to deal with it.”

The exposed bone that is the hallmark of jaw osteonecrosis occurs in other conditions, sometimes confounding diagnosis. The American Society for Bone and Mineral Research created a task force to examine the relationship between bisphosphonates and jaw osteonecrosis. One goal is to develop a case definition for bisphosphonate-associated jaw osteonecrosis.

Although data on jaw osteonecrosis associated with oral bisphosphonate use are limited, it's estimated that the risk is somewhere between 1 in 10,000 and less than 1 in 100,000 patient-treatment years. “This may be an underestimate because of underreporting,” said Dr. Khosla. The estimate may also be low because the risk is associated with cumulative exposure, and perhaps this complication will become more common with more patients on oral bisphosphonates for longer periods.

“It's clear that the risk of jaw osteonecrosis in patients with cancer, treated with high doses of intravenous bisphosphonates, is higher,” said Dr. Khosla. In these patients, the risk is estimated to be 1-10 per 100 patients.

“I think that all we can do as physicians is provide information and factor in the patient's values. I don't think as a physician you can completely leave the decision to the patient. They get bewildered.”




Display Headline
Pros and Cons of Continued Bisphosphonate Use

Less Aggressive IVF Protocol Cuts Multiples, Retains Live Birth Rate

Article Type
Changed
Display Headline
Less Aggressive IVF Protocol Cuts Multiples, Retains Live Birth Rate

A less aggressive in vitro fertilization protocol results in roughly the same rate of pregnancies leading to term live births as do standard methods, but with decreased multiple pregnancy rates and overall cost, according to Dutch researchers.

The cumulative 1-year proportion of pregnancies producing term live births was 43% in 205 women who underwent the less aggressive IVF protocol, compared with 45% in 199 women who underwent standard treatment, the investigators reported.

The women were randomized to one of two IVF strategies: standard ovarian stimulation with a GnRH agonist long protocol and the transfer of two embryos (standard treatment) or mild ovarian stimulation with GnRH antagonist cotreatment and single embryo transfer (what the researchers termed “mild” treatment).

The women were younger than 38 years and either had no previous IVF treatment or had borne a healthy child after previous IVF treatment, wrote Dr. Esther Heijnen and her colleagues at the University Medical Center in Utrecht, the Netherlands (Lancet 2007;369:743-9).

High-quality embryos that were not transferred were cryopreserved and thawed for transfer in a subsequent unstimulated cycle before the start of a new IVF treatment cycle. The average number of treatment cycles was 2.3 in the mild treatment group and 1.7 in the standard treatment group. Overall, there was no significant difference in discomfort between the groups, despite an increase in the average number of IVF cycles for the mild treatment group.

The proportion of multiple births per couple during 1 year of treatment was significantly lower with mild treatment (0.5% vs. 13%).

Despite a greater average number of cycles in the mild treatment group, total costs of IVF treatment per couple (regardless of whether pregnancy resulted) were significantly lower with less aggressive treatment.




CT Trumps SPECT for Cost-Effective Screening : Average 12-month downstream CAD-related costs were $1,716 higher in patients who underwent SPECT.

Article Type
Changed
Display Headline
CT Trumps SPECT for Cost-Effective Screening : Average 12-month downstream CAD-related costs were $1,716 higher in patients who underwent SPECT.

WASHINGTON — Coronary CT angiography appears to be a less expensive alternative to myocardial perfusion (MP) SPECT imaging as an initial diagnostic screen for coronary artery disease, according to an analysis of data from two large regional health plans presented at the annual meeting of the Society of Cardiovascular Computed Tomography.

The average 12-month downstream coronary artery disease-related cost for patients who underwent coronary CT angiography (CTA) as an initial screen for coronary artery disease (CAD) was $1,716 lower per patient than for those who underwent SPECT, said Dr. James K. Min of Cornell University, New York. By comparison, the average cost of a nuclear study ranged from $3,000 to $4,000, he said.

“CT may be a potential, cost-efficient alternative to SPECT for the initial evaluation of patients with suspected coronary artery disease,” said Dr. Min.

The researchers analyzed private payer data from two large regional health plans with more than 6.5 million members from 2002 to 2005. The database included membership information, pharmacy claims, and inpatient and outpatient service claims.

The researchers identified patients who underwent CTA or MP SPECT imaging as an initial diagnostic screen for coronary artery disease. Information was collected for 1 year prior to and 1 year after the test.

Only patients without known coronary artery disease were included. These were patients who did not have any coronary artery disease-related procedure codes for the previous 12 months. CT and MP SPECT claims included only those with coronary heart disease codes.

For each patient, the researchers calculated a cardiac risk score. The score was a weighted average of several risk factors, including use of digitalis, anticoagulant agents, antiplatelet agents, ACE inhibitors, β-blockers, antihypertensive medications, and antidiabetic medications, as well as the presence of other clinical cardiac conditions. The researchers also assessed each patient's overall health status using the Charlson Comorbidity Index.
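The study's actual weighting scheme is not reported here, but a weighted-average risk score of this general kind can be sketched as follows. All factor names and weight values below are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical sketch of a weighted-average cardiac risk score.
# The factor names and weights are illustrative assumptions only;
# the study's actual weighting scheme is not described in the article.
RISK_WEIGHTS = {
    "digitalis": 0.4,
    "anticoagulant": 0.3,
    "antiplatelet": 0.2,
    "ace_inhibitor": 0.15,
    "beta_blocker": 0.15,
    "antihypertensive": 0.1,
    "antidiabetic": 0.2,
    "other_cardiac_condition": 0.5,
}

def cardiac_risk_score(factors: set[str]) -> float:
    """Sum the weights of the factors a patient has, divided by the
    total weight of all factors considered, giving a 0-1 score."""
    total = sum(RISK_WEIGHTS.values())
    present = sum(w for f, w in RISK_WEIGHTS.items() if f in factors)
    return round(present / total, 2)

score = cardiac_risk_score({"beta_blocker", "antihypertensive"})
```

Under a scheme like this, a patient with no risk factors scores 0 and a patient with every factor scores 1, so scores such as the study's group averages of about 0.2 fall on a common scale.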

Each patient in the CTA group was matched with four patients in the SPECT group based on age, gender, and cardiac risk score. Both groups had an average age of 51 years. About two-thirds of the patients in each group (68%) were women. The average cardiac risk score was 0.20 in the CTA group and 0.19 in the SPECT group.
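The 1:4 matching on age, gender, and cardiac risk score can be illustrated with a simple greedy nearest-neighbor sketch. The study's actual matching procedure is not described, so the data values and tie-breaking rule below are assumptions:

```python
# Illustrative greedy 1:4 match on age, gender, and risk score.
# The study does not describe its exact matching algorithm; this sketch
# assumes exact gender match and nearest age, then nearest risk score.
def match_controls(case, pool, k=4):
    """Pick the k same-gender controls closest in age, then risk score."""
    candidates = [c for c in pool if c["gender"] == case["gender"]]
    candidates.sort(key=lambda c: (abs(c["age"] - case["age"]),
                                   abs(c["risk"] - case["risk"])))
    return candidates[:k]

cta_patient = {"age": 51, "gender": "F", "risk": 0.20}   # hypothetical case
spect_pool = [                                           # hypothetical controls
    {"age": 51, "gender": "F", "risk": 0.19},
    {"age": 52, "gender": "F", "risk": 0.21},
    {"age": 51, "gender": "M", "risk": 0.20},
    {"age": 60, "gender": "F", "risk": 0.40},
    {"age": 50, "gender": "F", "risk": 0.18},
]
matches = match_controls(cta_patient, spect_pool)  # 4 same-gender controls
```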

A total of 1,833 patients were identified who had an initial diagnostic screen with CTA; they were matched with 7,332 patients who had SPECT imaging.

In addition to a cost difference for the two modalities, the researchers noted that the use of antiplatelet therapy was greater among SPECT patients after the initial diagnostic test.

There was also a trend toward greater use of ACE inhibitors and statins in the SPECT group, though this did not achieve significance.

“In terms of follow-up diagnostic tests, patients who initially underwent CT angiography were more likely to undergo nuclear stress testing in the follow-up period, while patients who underwent nuclear stress testing were more likely to undergo invasive coronary angiography,” said Dr. Min. Looking at use of any follow-up diagnostic test, there was an 18% relative risk reduction among patients whose initial coronary evaluation was CT angiography.
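A relative risk reduction of this kind is conventionally computed as 1 minus the ratio of the two groups' event rates. The event rates in the sketch below are hypothetical, chosen only to show how an 18% figure arises:

```python
# Relative risk reduction: RRR = 1 - (rate in one group / rate in comparison group).
# The event rates below are hypothetical and chosen only to illustrate
# how an 18% relative risk reduction is calculated.
def relative_risk_reduction(rate_treated: float, rate_control: float) -> float:
    return 1 - rate_treated / rate_control

# E.g., if 41% of CTA patients vs. 50% of SPECT patients had any follow-up test:
rrr = relative_risk_reduction(0.41, 0.50)
print(f"{rrr:.0%}")  # prints "18%"
```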

The researchers also looked at clinical outcomes. Patients who underwent initial SPECT imaging had a higher rate of surgical or percutaneous interventions in the follow-up period than those who had CTA (1.2% vs. 0.4%).

“CTA patients experienced lower rates of both hospitalization as well as angina or myocardial infarction,” Dr. Min said at the meeting.

“From this we tentatively conclude that compared to MP SPECT patients, patients who underwent CT as an initial diagnostic test incurred lower 12-month total coronary disease-related costs,” said Dr. Min.

Dr. Min disclosed that he receives research support from GE Healthcare, which manufactures both technologies.

A CTA reveals diffuse, mixed plaque in the left anterior descending artery.

Here, a multidetector CT volume rendered image shows calcification in the left anterior descending and right coronary arteries. Photos courtesy Dr. James K. Min




CT Angiography Efficient for Initial Screening : Patients who underwent CT as an initial diagnostic test had lower coronary disease-related costs.

Article Type
Changed
Display Headline
CT Angiography Efficient for Initial Screening : Patients who underwent CT as an initial diagnostic test had lower coronary disease-related costs.

WASHINGTON — Coronary CT angiography appears to be a less expensive alternative to myocardial perfusion (MP) SPECT imaging as an initial diagnostic screen for coronary artery disease, according to an analysis of data from two large regional health plans presented at the annual meeting of the Society of Cardiovascular Computed Tomography.

The average 12-month downstream coronary artery disease-related cost for patients who underwent coronary CT angiography (CTA) as an initial screen for coronary artery disease (CAD) was $1,716 lower per patient than for those who underwent SPECT, said Dr. James K. Min of Cornell University, New York. The average cost of a nuclear study ranged from $3,000 to $4,000.

“CT may be a potential, cost-efficient alternative to SPECT for the initial evaluation of patients with suspected coronary artery disease,” Dr. Min said.

The researchers analyzed private payer data from two large regional health plans with more than 6.5 million members from 2002 to 2005. The database included membership information, pharmacy claims, and inpatient and outpatient service claims.

The researchers identified patients who underwent CTA or MP SPECT imaging as an initial diagnostic screen for CAD. Information was collected for 1 year prior to and 1 year after the test.

Only patients without known CAD were included. These were patients who did not have any CAD-related procedure codes for the previous 12 months. CT and MP SPECT claims included only those with coronary heart disease codes.

For each patient, the researchers calculated a cardiac risk score. The score was a weighted average of several risk factors, including use of digitalis, anticoagulant agents, antiplatelet agents, ACE inhibitors, β-blockers, antihypertensive medication, and antidiabetic medications, as well as the presence of other clinical cardiac conditions.

The researchers also assessed each patient's overall health status using the Charlson Comorbidity Index.

Each patient in the CTA group was matched with four patients in the SPECT group based on age, gender, and cardiac risk score. Both groups had an average age of 51 years. Roughly two-thirds of the patients in each group (68%) were women. The average cardiac risk score was 0.20 in the CTA group and 0.19 in the SPECT group.

A total of 1,833 patients who had an initial diagnostic screen with CTA were identified; they were matched with 7,332 patients who had SPECT imaging.

In addition to a cost difference for the two modalities, the researchers noted that the use of antiplatelet therapy was greater among SPECT patients after the initial diagnostic test. There was also a trend toward greater use of ACE inhibitors and statins in the SPECT group, although this did not achieve significance.

“In terms of follow-up diagnostic tests, patients who initially underwent CT angiography were more likely to undergo nuclear stress testing in the follow-up period, while patients who underwent nuclear stress testing were more likely to undergo invasive coronary angiography,” Dr. Min reported. Looking at any diagnostic test, there was an 18% relative risk reduction in the patients who underwent initial coronary evaluation with CT angiography.
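For readers interested in how a figure like that 18% is derived, the sketch below shows the standard relative-risk-reduction arithmetic for two matched cohorts. The cohort sizes match the study (1,833 CTA patients matched 1:4 to 7,332 SPECT patients), but the event counts are hypothetical, chosen purely for illustration; the study's actual counts were not reported here.

```python
def relative_risk_reduction(events_cta, n_cta, events_spect, n_spect):
    """RRR = 1 - (event rate in CTA group / event rate in SPECT group)."""
    rate_cta = events_cta / n_cta
    rate_spect = events_spect / n_spect
    return 1.0 - rate_cta / rate_spect

# Hypothetical follow-up-test counts in cohorts sized as in the study.
rrr = relative_risk_reduction(events_cta=164, n_cta=1833,
                              events_spect=800, n_spect=7332)
print(f"{rrr:.0%}")  # prints "18%"
```

Note that a relative risk reduction depends only on the two event rates, not on the cohort sizes themselves, which is why the 1:4 matching does not bias the comparison so long as the groups are balanced on risk.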

The researchers also looked at clinical outcomes. Patients who underwent initial SPECT imaging had a higher rate of surgical or percutaneous interventions in the follow-up period compared with those who had CTA—1.2% compared with 0.4%, respectively. “CTA patients experienced lower rates of both hospitalization as well as angina or myocardial infarction,” Dr. Min said.

“From this we tentatively conclude that compared to MP SPECT patients, patients who underwent CT as an initial diagnostic test incurred lower 12-month total coronary disease-related costs,” he said.

Dr. Min disclosed that he receives research support from GE Healthcare.

CT shows calcified plaque in left anterior descending and right coronary arteries. Courtesy Dr. James K. Min
