Dr. Susan M. Ott, in this issue of the Journal, argues that many patients on bisphosphonate therapy for more than 5 years should be offered a “drug holiday.” She proposes a simple algorithm that uses measurements of bone turnover and bone density to decide whether to continue therapy, the assumption being that, because the drug has accumulated in bone, its effect will persist after discontinuation.
This will please many patients, who prefer taking fewer drugs; cost and potential adverse effects are their concerns. Physicians worry about adynamic bone and that, as bisphosphonates accumulate in bone with prolonged therapy, they may ultimately increase the incidence of what are now rare adverse effects, ie, jaw necrosis and linear atypical fractures of the femur. To date, we have little evidence that continued drug exposure will cause more of these severe complications, but the lack of data is not entirely comforting.
The data that support taking a bisphosphonate holiday after 5 years (vs continuing therapy for 10 years) are scant compared with the data supporting the initial benefit of these drugs. The FLEX study (J Bone Miner Res 2010; 25:976–982), as Dr. Ott notes, provides only tenuous support for the longer therapy option (with alendronate). The benefit of 10 vs 5 years of therapy rests on a subset analysis of a relevant but small group of patients in this study (those with a femoral neck T score lower than −2.5 and no vertebral fracture at baseline). Patients in this subset suffered more nonvertebral fractures after stopping the drug at 5 years. Data with other bisphosphonates may well differ. For the other subsets, 10 years of therapy did not seem better than 5. But the numbers are small, certainly too small to offer insight into the incidence of rare side effects developing with the extra 5 years of therapy.
My personal take: on the basis of limited data, I am worried about halting these drugs in patients at highest risk for fracture—those with severe osteoporosis and many prior fractures or ongoing corticosteroid use. In patients with osteoporosis but a lower risk of fracture, I have increasingly offered drug holidays. Although this approach is clearly not based on large interventional outcome studies, I am more inclined to use markers of bone turnover than repeated bone density measurements in patients who have been taking bisphosphonates. Chronic bisphosphonate therapy may alter the relationship between density and fracture risk, akin (but opposite) to the way that corticosteroids increase fracture risk above what bone density measurements would suggest.
But don’t let this discussion about how long to treat stand in the way of initiating therapy in osteoporotic patients at significant risk of fracture.