Abstract

This work examines the effect of sampling time on noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) in magnetic resonance imaging. A simple imaging experiment demonstrates the effect of sampling time on noise, confirming the theoretical expectation that doubling the sampling time while halving the read gradient strength reduces statistical noise to 1/√2 of its original level. This result suggests that sampling time should be maximized within the constraints of the pulse sequence: specifically, sampling time should be increased and read gradient strength decreased as TE is increased. Revised expressions for SNR and CNR are presented, based on the assumption that sampling time increases linearly with echo delay time above a certain minimum TE. These expressions are then used to derive new predictions of the interpulse delay times that maximize SNR and CNR in spin-echo imaging. It is demonstrated that sampling times are critical in determining whether T1-weighted or T2-weighted sequences produce superior tissue contrast in spin-echo imaging.
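To make the noise scaling concrete, the following is a minimal numerical sketch, not the experiment reported in the paper. It assumes per-sample noise variance proportional to receiver bandwidth, i.e. to 1/T_s for a fixed number of readout points, so image-domain noise scales as 1/√T_s; all function names, units, and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def noise_std(sampling_time, n_points=256, n_trials=20000):
    """Estimate image-domain noise for a given readout sampling time.

    Illustrative model (assumption, not the paper's apparatus):
    receiver bandwidth ∝ 1/sampling_time, so per-sample noise
    variance ∝ 1/sampling_time; averaging a fixed number of readout
    samples then gives pixel noise std ∝ 1/sqrt(sampling_time).
    """
    var_per_sample = 1.0 / sampling_time  # noise variance ∝ bandwidth
    samples = rng.normal(0.0, np.sqrt(var_per_sample), (n_trials, n_points))
    # A pixel value behaves like the mean of the readout samples
    pixel = samples.mean(axis=1)
    return pixel.std()

s1 = noise_std(sampling_time=1.0)
s2 = noise_std(sampling_time=2.0)
print(f"noise ratio for 2x sampling time: {s2 / s1:.3f} "
      f"(expected 1/sqrt(2) ≈ 0.707)")
```

Running this prints a ratio close to 0.707, matching the abstract's claim that doubling the sampling time reduces statistical noise to 1/√2 of its original level.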
