Abstract

There is increasing evidence that the use of bisphosphonates to prevent osteoporotic fractures, particularly long-term use, is associated with an increased risk of unusual fractures of the proximal femur. Numerous case reports of these “atypical” fractures of the femur among bisphosphonate-treated women have appeared over the last 5 years or more.1,2 A case definition for atypical femur fractures has now been proposed3 that includes subtrochanteric (below the lesser trochanter) or diaphyseal (above the distal metaphysis) location, a transverse or nearly transverse “chalklike” fracture line (as opposed to the more typical spiral or comminuted fractures), and paucity of trauma. Additional features may include prodromal thigh pain, bilateral involvement, cortical thickening, and the presence of other selected diseases (such as rheumatoid arthritis or diabetes) or medication use (such as corticosteroids or proton pump inhibitors). Some reports, but certainly not all, suggest marked suppression of bone turnover as assessed by bone turnover markers and iliac crest histomorphometry.1

Even if causally related, these atypical fractures must be quite rare among osteoporotic women treated with bisphosphonates: a recent pooled analysis of 3 large clinical trials (FIT, FLEX, and HORIZON) with up to 10 years of follow-up4 found that all types of subtrochanteric and diaphyseal fractures were infrequent and occurred at similar rates among placebo- and bisphosphonate-treated women. Although these findings are reassuring, important limitations were that relatively few women received more than 5 years of bisphosphonate treatment, information on atypical features was not specifically collected, and only radiographic reports were reviewed. Clearly, because atypical subtrochanteric fractures occur infrequently, they are unlikely to be easily studied in randomized trials, and other study designs will be necessary.

Large observational studies have also examined the relationship between bisphosphonate use and subtrochanteric or diaphyseal fractures. For example, Abrahamsen et al5 used Danish administrative data to examine the relationship between bisphosphonate use and subtrochanteric or diaphyseal femur fractures among 39 567 alendronate users and 158 268 nonusers. As would be expected because of their higher pretreatment risk, alendronate users were more likely to suffer classic hip fractures than nonusers (hazard ratio, 1.5; 95% CI, 1.4-1.5), and a similar increase was observed for subtrochanteric and diaphyseal femur fractures (hazard ratio, 2.0; 95% CI, 1.8-2.3). The observation that subtrochanteric and diaphyseal fracture risks were similar among individuals receiving short-term (several months) and long-term (5-10 years) alendronate treatment was somewhat reassuring, but the study was unable to specifically identify which of the included fractures were atypical. In contrast to the results of Abrahamsen and colleagues, a large retrospective cohort study from a California Health Maintenance Organization linking pharmacy and radiographic data6 found that, among approximately 15 000 femur fractures occurring between 2007 and 2009, radiographic review identified 135 subtrochanteric and diaphyseal fractures with atypical features. Nearly all of the individuals with atypical femoral fractures had taken bisphosphonates (97%), and longer duration of use further increased the risk.
Although only presented in abstract form and not yet published, these preliminary data appear to support a causal relationship between bisphosphonate use and atypical femoral fractures. The case-control study by Meier et al7 in this issue of the Archives adds further data suggesting that the association between bisphosphonate use and atypical femur fractures is causal. These Swiss investigators reviewed radiographs from 477 individuals with subtrochanteric or proximal femoral shaft fractures collected between 1999 and 2010 at a single center and identified 39 with atypical features (0.7% of all femur fractures). For comparison, the investigators used 2 groups: individuals with typical femur fractures and a separate group of individuals without fractures. Of the individuals with atypical fractures, 82% reported bisphosphonate use compared with only 6% in the typical fracture group and 12% in the group without fractures. Furthermore, Meier and colleagues found that longer use of bisphosphonates (5-9 years) was associated with a greater risk of atypical fractures (odds ratio, 117; 95% CI, 34-402) than shorter use (<2 years) (odds ratio, 35; 95% CI, 10-124). Although Meier and coauthors did not address the issue, a Swedish observational study8 found that the risk of atypical fractures attenuated quickly after discontinuation of long-term bisphosphonate use.

Taken in aggregate, these and other high-quality studies lead to the following conclusions: bisphosphonate therapy can prevent spine and nonspine fractures among appropriately selected high-risk individuals (particularly those with previous hip or spine fractures and those with hip bone mineral density [BMD] T scores lower than −2.5); bisphosphonates are generally well tolerated; and atypical subtrochanteric and femoral shaft fractures may be more frequent after bisphosphonate therapy but are rare compared with typical osteoporotic fractures.

If discontinuation of bisphosphonate therapy does indeed reduce the risk of atypical fractures (as suggested by the Swedish study), then a critical consideration is the antifracture efficacy of bisphosphonates with use beyond 3 to 5 years. As recently summarized at a hearing of the Food and Drug Administration,9 the evidence supporting the efficacy of bisphosphonate use beyond 3 to 5 years is not robust. Only 2 trials, 1 of daily oral alendronate therapy and 1 of yearly intravenous zoledronic acid therapy, randomized postmenopausal women who had previously been treated with bisphosphonates for 3 years (zoledronic acid) or 5 years (alendronate) either to continue active treatment or to discontinue treatment.10,11 Both studies were designed to assess changes in bone mass over several years as the primary outcome, but both adjudicated spine and nonspine fracture outcomes, and the results were surprisingly consistent: compared with the patients who continued bisphosphonate therapy, those who discontinued treatment gradually lost bone mass, but nonspine fracture risk did not differ significantly. Interestingly, both studies found that the risk of spine fractures was higher among those who discontinued use than among those who continued use. Therefore, the best data to date suggest that after 3 years of zoledronic acid therapy or 5 years of alendronate therapy, many older women can consider stopping treatment for 3 to 5 years and perhaps longer. A potential but unproven benefit of this approach is fewer atypical fractures, but at the cost of additional vertebral fractures.
Operationally, individuals without previous hip or spine fractures and those with hip BMD T scores higher than −2.5 after 3 to 5 years of treatment might be the best candidates for discontinuation of bisphosphonate therapy. Because the skeletal retention and other properties of various bisphosphonates differ, it is risky to assume that all bisphosphonates provide similar persistent antifracture benefits when their use is discontinued after 3 to 5 years.

In summary, atypical femur fractures are uncommon but do appear to be more frequent among individuals who are being treated with oral or intravenous bisphosphonates, and longer duration of use further increases the risk. Additional studies of atypical fractures are needed to clarify the mechanism and other key risk factors as well as to confirm that discontinuation of treatment after long-term use substantially lowers the risk. In the meantime, clinicians should continue the current practice of using bisphosphonates as first-line therapy for individuals who are at high risk of fracture and should be sure to discuss with their patients the rare but apparently causal relationship between bisphosphonate use and atypical fractures. Finally, discontinuation of treatment with selected bisphosphonates after 3 to 5 years should be considered in lower-risk individuals, but the optimal duration without therapy and the utility of follow-up BMD assessment or other tests after discontinuation of treatment remain uncertain.
