The Hydra Probe is a relatively inexpensive and widely used soil water content (θ, m3 m−3) sensor. It measures both the real (εr) and imaginary (εi) components of the complex soil dielectric constant at 50 MHz. Our objectives were to: (i) determine the accuracy and precision of Hydra Probe dielectric measurements, (ii) establish an electrical conductivity limit for Hydra Probe measurements, (iii) document the effects of soil type and temperature, and (iv) relate these results to the more thoroughly studied relationships established for time domain reflectometry (TDR). We evaluated Hydra Probe εr measurement precision and accuracy in air, ethanol, butanol, and water. Electrical conductivity effects were established in a series of aqueous KCl solutions. Effects of soil type on calibration were evaluated with four soils. Temperature sensitivity was tested in air and in oven-dried and nearly saturated soil. Each test was performed with three sensors. We found that, in fluids, the sensors were accurate (εr within 0.5) and precise (coefficient of variation [CV] < 1%), and that inter-sensor variability was generally low except in KCl solutions with electrical conductivities >0.142 S m−1 (0.01 M). There was a strong correlation between θ and εr for all soils tested, but the θ–εr relationship varied with soil. Deviations of measured θ–εr data from the Topp equation increased in magnitude with εi, which may be the key to more general calibrations. Temperature effects on εr were negligible in oven-dried soils and differed among soils when nearly saturated; the largest temperature effect relative to 25°C was ±0.03 m3 m−3. In general, differences between Hydra Probe and TDR measurements appear to be related to differences in soil dielectric properties at the measurement frequencies of the two instruments.
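For readers unfamiliar with the reference calibration mentioned above, the following is a minimal sketch of the Topp equation, the widely used third-order polynomial relating εr to θ (coefficients from Topp et al., 1980; the function name is illustrative and not from this study):

```python
def topp_theta(eps_r: float) -> float:
    """Volumetric water content theta (m3 m-3) from the real dielectric
    constant eps_r, using the standard Topp et al. (1980) coefficients."""
    return (-5.3e-2
            + 2.92e-2 * eps_r
            - 5.5e-4 * eps_r**2
            + 4.3e-6 * eps_r**3)

# Example: eps_r = 20 yields theta of roughly 0.35 m3 m-3 for a
# "typical" mineral soil under this calibration.
theta = topp_theta(20.0)
```

Soil-specific θ–εr calibrations, as evaluated in this study, deviate from this single curve, and the magnitude of those deviations was found to grow with εi.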