Abstract

To study the deterioration of bond performance between concrete and corroded steel bars with designed corrosion levels of 0%, 0.5%, 1.0%, 2.0%, 5.0%, 8.0%, and 10.0%, pull-out tests were performed on cube specimens with dimensions of 10D × 10D × 10D, where D is the diameter of the longitudinal rebar (D = 14, 20, and 25 mm). The experimental results indicated that, as the specimen dimensions increased, the expansive cracks induced by corrosion products appeared earlier and the maximum expansive crack width was larger at the same corrosion level. The bond strength and the initial bond stiffness first increased and then dramatically decreased as the concrete deterioration and reinforcement corrosion level increased for each specimen dimension, and the specimens with the larger diameter (D = 25 mm) were more sensitive to corrosion than those with the smaller diameters (D = 14 and 20 mm). For each specimen dimension, the free-end slip and the energy dissipation decreased slowly with increasing corrosion level before corrosion-induced cracking, weakened rapidly once corrosion-induced cracks appeared, and were almost independent of the corrosion level thereafter. Based on the experimental results, a simplified expression for calculating the residual bond stress and an empirical bond–slip constitutive model that considers the influence of reinforcement corrosion were proposed, which can be used in finite element analysis of corroded reinforced concrete.
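
As context for how a corrosion-dependent bond–slip constitutive law of this kind might be supplied to a finite element model, the sketch below codes a generic fib Model Code 2010 style monotonic bond–slip curve and scales it by an exponential corrosion reduction factor. The parameter values and the reduction coefficient are illustrative assumptions only, not the calibrated expressions proposed in the paper.

```python
import numpy as np

def bond_slip(s, tau_max=10.0, s1=1.0, s2=2.0, s3=10.0, alpha=0.4, tau_f_ratio=0.4):
    """Monotonic bond-slip law with the fib Model Code 2010 shape:
    ascending power curve, plateau, linear softening, residual friction.
    s in mm, returned bond stress in MPa (parameter values are assumed)."""
    tau_f = tau_f_ratio * tau_max
    s = np.asarray(s, dtype=float)
    return np.where(
        s <= s1, tau_max * (s / s1) ** alpha,
        np.where(
            s <= s2, tau_max,
            np.where(
                s <= s3, tau_max - (tau_max - tau_f) * (s - s2) / (s3 - s2),
                tau_f,
            ),
        ),
    )

def corrosion_factor(eta_percent):
    """Hypothetical bond-strength reduction factor versus corrosion level
    eta (% mass loss); the coefficient is a placeholder, not the paper's fit."""
    return np.exp(-0.12 * eta_percent)

# Example: degraded bond-slip curve for a 5% corrosion level
slips = np.linspace(0.0, 12.0, 121)
tau_uncorroded = bond_slip(slips)
tau_corroded = corrosion_factor(5.0) * tau_uncorroded
```

In practice, the tabulated (slip, bond stress) pairs from such a degraded curve can be passed to an interface or spring element in the finite element model; the shape parameters and the corrosion reduction law would need to be replaced by the experimentally calibrated expressions reported in the paper.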
