Highlights
- Among six manufacturer calibrations, the default calibration resulted in the largest errors.
- Sensor performance was negatively affected by higher clay content and salinity.
- Sensor-based approaches to estimating field capacity were inconsistent and spatially variable.

Abstract. Maintaining the economic and environmental sustainability of crop production requires optimizing irrigation management using advanced technologies such as soil water sensors. In this study, the performance of a commercially available multi-sensor capacitance probe was evaluated under irrigated field conditions across western Oklahoma. The effects of clay content and salinity on sensor performance were also investigated. In addition, the field capacity (FC) of soil cores collected at the study sites was determined in the laboratory. These laboratory FC values were used to assess the performance of two sensor-based approaches for estimating FC: the number of days required to reach laboratory FC after major watering events and the percentile of collected sensor readings that represented laboratory FC. The results showed that among the six calibrations provided by the manufacturer, the default and silty clay loam calibrations produced the largest and smallest soil water content errors, respectively. Errors generally increased with clay content and salinity, except for the heavy clay calibration, which showed improved performance with increasing clay content. The default and sand calibrations were more sensitive to increases in clay content and salinity than the other calibrations. For the sensor-based FC approaches, one to three days were required on average to reach laboratory FC, with a wide range of one to nine days. The percentiles representing laboratory FC averaged 56%, with a range of 3% to 97%. Overall, the sensor-based approaches produced inconsistent and highly variable estimates of FC.

Keywords: Calibrations, Clay content, Irrigation scheduling, Salinity, Sensor accuracy, Soil water threshold.
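To make the two sensor-based FC approaches concrete, the sketch below shows one possible way to compute them from a daily soil water content record. This is not the authors' implementation; the function names (`fc_percentile`, `days_to_lab_fc`), the interpretation of "percentile" as the share of readings at or below laboratory FC, and the synthetic data are all assumptions made for illustration.

```python
"""Minimal sketch (not the study's code) of the two sensor-based FC approaches
described in the abstract. All names and data below are hypothetical."""
import numpy as np


def fc_percentile(readings, lab_fc):
    """Percentile of the sensor record corresponding to laboratory FC,
    taken here as the percentage of readings at or below lab FC."""
    readings = np.asarray(readings, dtype=float)
    return 100.0 * np.mean(readings <= lab_fc)


def days_to_lab_fc(daily_readings, lab_fc, event_index):
    """Days after a major watering event (starting at `event_index`) until
    the daily reading first drains down to laboratory FC.
    Returns None if FC is not reached within the record."""
    for day, value in enumerate(daily_readings[event_index:]):
        if value <= lab_fc:
            return day
    return None


if __name__ == "__main__":
    # Synthetic daily volumetric water content (cm3/cm3) after an irrigation event.
    vwc = [0.38, 0.34, 0.31, 0.29, 0.28, 0.27, 0.27, 0.26]
    lab_fc = 0.29  # hypothetical laboratory-determined FC

    print(fc_percentile(vwc, lab_fc))      # percentile representing lab FC
    print(days_to_lab_fc(vwc, lab_fc, 0))  # days to reach lab FC after the event
```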