Abstract

This paper discusses the influence of decreasing the number density of reflectors on the false detection rate of ultrasound time-domain correlation (UTDC) measurements. False detections are expected to be suppressed as the number density of reflectors decreases, because the sidelobe peaks of the cross-correlation coefficient become smaller when each pulse can be recognized separately than when pulses overlap. However, a separated waveform pattern, in which only part of an individual pulse lies within the reference window, is expected to occur more frequently as the number density of reflectors decreases. In this study, the influence of the positional relationship between the pulse and the reference window in the presence of noise was investigated by simulating the UTDC signal processing using an assumed pulse waveform. The simulation results show that the false detection rate is likely to increase when the pulse length is less than approximately 1.7 times the ultrasound period, and that the damped region of the waveform has a high potential to cause false detections. The flow rate was measured for three different reflector number densities at a national standard calibration facility for water flow measurement in Japan, and the conditions under which false detection occurred and their influence on the flow rate error were evaluated. The measurement results showed good agreement with the simulation results. In addition, this paper presents a new signal processing method for UTDC measurements that reduces the rate of false detections caused by the separated pulse pattern. The experimental results show that this method effectively reduces the flow rate error.
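The core mechanism described above, a reference window cross-correlated against a received signal, where sidelobe peaks can be mistaken for the true lag, can be illustrated with a minimal sketch. All parameters below (centre frequency, sampling rate, pulse length, noise level) are hypothetical values chosen for illustration, not values from the paper:

```python
import numpy as np

# Hypothetical parameters: a damped-sinusoid ultrasound pulse and a
# normalized cross-correlation lag search, as in UTDC-style processing.
fs = 50e6          # sampling frequency [Hz] (assumed)
f0 = 4e6           # ultrasound centre frequency [Hz] (assumed)
n_cycles = 3       # pulse length in ultrasound periods (assumed)

t = np.arange(0, n_cycles / f0, 1 / fs)
pulse = np.sin(2 * np.pi * f0 * t) * np.exp(-3 * t * f0 / n_cycles)

# Received signal: the pulse delayed by a known lag, plus additive noise.
rng = np.random.default_rng(0)
true_lag = 40                      # samples (assumed)
signal = np.zeros(400)
signal[true_lag:true_lag + pulse.size] += pulse
signal += 0.05 * rng.standard_normal(signal.size)

def ncc(ref, sig):
    """Normalized cross-correlation of `ref` against every window of `sig`."""
    out = np.empty(sig.size - ref.size + 1)
    for k in range(out.size):
        win = sig[k:k + ref.size]
        denom = np.linalg.norm(ref) * np.linalg.norm(win)
        out[k] = ref @ win / denom if denom > 0 else 0.0
    return out

c = ncc(pulse, signal)
peak = int(np.argmax(c))
# Sidelobes appear roughly one ultrasound period (fs/f0 samples) on
# either side of the main peak; with enough noise, or with a shortened
# or truncated pulse, one of them can exceed the main lobe, producing
# a false detection.
print(peak, round(float(c[peak]), 3))
```

Shortening `n_cycles` or raising the noise amplitude in this sketch raises the sidelobe level relative to the main lobe, which is the qualitative effect the simulation in the paper quantifies.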
