Abstract

The purpose of this investigation was to measure the reduction in bone strength resulting from drill holes in diaphyseal bone and to compare this with finite element and theoretical predictions for stresses in a tubular structure. Fifty-two pairs of canine femora were tested to failure in four-point bending. One bone of each pair served as the control; the contralateral femur had a hole of variable diameter drilled in the lateral cortex. At a ratio of drill hole diameter to bone diameter of 0.2, the bone retained only 62% of its expected strength. A linear regression between the area fraction (the ratio of the cross-sectional area of the drilled specimen to that of the control specimen) and the percentage of expected strength yielded a strong positive correlation (R² = 0.79). The average cross-sectional properties were used as the basis for linear orthotropic and nonlinear elastic-plastic finite element models of idealized geometry. The linear models proved insufficient for prediction of failure loads. The nonlinear models, which accounted for both material plasticity and the stress concentration effects of the defect, yielded good correspondence with the experimental data. While the influence of irregular borders and adaptive remodeling of the bone adjacent to the defect requires further investigation, our results suggest the possibility of predicting fracture risk from the geometric properties of metastatic lesions. Prophylactic fixation remains a matter of clinical judgement based on the functional demands and expected strength of the affected bones.
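
To make the area-fraction measure concrete, the short Python sketch below estimates it for an idealized circular diaphysis with a single-cortex drill hole. The tube geometry, the rectangular approximation of the defect, and all numeric values are illustrative assumptions, not measurements or methods from the study.

    import math

    def area_fraction(outer_d, cortex_t, hole_d):
        # Intact cross-section: circular ring with outer diameter outer_d
        # and uniform cortical thickness cortex_t.
        inner_d = outer_d - 2.0 * cortex_t
        intact_area = math.pi / 4.0 * (outer_d ** 2 - inner_d ** 2)
        # Drilled cross-section: the defect is idealized as removing a
        # rectangular patch (hole diameter x cortical thickness) from
        # one cortex only.
        removed_area = hole_d * cortex_t
        return (intact_area - removed_area) / intact_area

    # Hypothetical geometry: 15 mm outer diameter, 3 mm cortex, and a hole
    # diameter equal to 0.2 of the outer diameter (the ratio at which the
    # study reports roughly 62% of expected strength).
    print(round(area_fraction(outer_d=15.0, cortex_t=3.0, hole_d=0.2 * 15.0), 3))

Under these assumed dimensions the drilled section retains about 92% of the intact cross-sectional area, which illustrates why the strength loss observed experimentally (and captured by the nonlinear models) exceeds what area reduction alone would suggest: the stress concentration at the defect, not just the removed material, drives the reduction.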
