Damping devices for railway tracks have been developed in recent years in which a tuned mass-spring absorber system is formed by an elastomeric material and embedded steel masses. The loss factor and stiffness of the elastomer are very important for the performance of the system but, unfortunately, both properties are sensitive to changes in temperature. Although a high loss factor gives good noise reduction, it also means greater variation of stiffness, and consequently of tuning frequency, with temperature. Conversely, with lower loss factors the tuning frequency can be kept close to the target, but a smaller noise reduction is achieved.

To investigate the effect of temperature on the performance of a generic rail absorber, a simple Timoshenko beam model of the track is used, to which a single-frequency continuous tuned absorber is added. The noise reduction at each frequency is estimated from the ratio of the track decay rates of the treated and untreated rails.

There is a physical link between the damping loss factor and the variation of stiffness with temperature, which must be taken into account. The rate of change of stiffness with the logarithm of frequency is established by assuming a constant loss factor. Using the time-temperature superposition principle, this frequency dependence is expressed as a temperature dependence, which is then used to predict decay rates, and thereby noise reduction, at different temperatures. This leads to an assessment of the relative importance of using a high damping loss factor or a temperature-independent stiffness.

Finally, a method of weighting the noise reduction at different temperatures is investigated. A distribution of rail temperatures at a site in the UK is used to develop a weighting procedure, which is extended to account for temperature variations at other locations where less data are available.
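As a sketch of the decay-rate step, assuming (as is common for a point-excited rail) that the radiated sound power is approximately inversely proportional to the track decay rate, the noise reduction in each frequency band can be estimated as
\[
\Delta L(f) \approx 10 \log_{10}\!\left( \frac{D_{\mathrm{treated}}(f)}{D_{\mathrm{untreated}}(f)} \right) \ \mathrm{dB},
\]
where \(D(f)\) is the track decay rate in dB/m. This form is an approximation; it neglects any change in the rail mobility caused by the treatment.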
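The physical link between loss factor and stiffness variation can be sketched, for a linear viscoelastic material with an approximately constant loss factor \(\eta\), from the Kramers-Kronig relations, combined with a time-temperature shift factor \(a_T\). The WLF form below is an assumption (the abstract only invokes time-temperature superposition), and \(C_1\), \(C_2\) are material-dependent constants:
\[
\frac{\mathrm{d}(\ln k)}{\mathrm{d}(\ln \omega)} \approx \frac{2\eta}{\pi},
\qquad
k(\omega, T) = k\!\left(a_T(T)\,\omega,\; T_{\mathrm{ref}}\right),
\qquad
\log_{10} a_T = -\frac{C_1 \,(T - T_{\mathrm{ref}})}{C_2 + T - T_{\mathrm{ref}}}.
\]
A higher \(\eta\) thus implies a steeper stiffness-frequency slope and, through \(a_T\), a stronger temperature dependence of the tuning frequency, which is the trade-off described above.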
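One plausible form for the temperature weighting (an assumption, since the abstract does not specify the procedure) is an energy-based average over the site's rail temperature distribution \(p(T_i)\):
\[
\Delta L_{\mathrm{weighted}} = -10 \log_{10}\!\left( \sum_i p(T_i)\, 10^{-\Delta L(T_i)/10} \right).
\]
Averaging in the energy domain weights most heavily the temperatures at which the absorber is detuned, since the residual noise energy at those temperatures dominates the sum.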