The along‐track scanning radiometer (ATSR) was launched on the European Space Agency's first remote sensing satellite, ERS 1, on July 17, 1991. ATSR is designed to retrieve sea surface temperature (SST) to an accuracy of 0.25 K rms, which represents more than a factor‐of‐2 improvement over any previously flown satellite radiometer. Early validation studies from limited regions suggest that ATSR is capable of measuring SST to near this design accuracy. We report a global validation study against quality‐controlled drifting buoys, examining 280 matchups worldwide with ATSR measurements at their full (1 km) resolution. We investigate optimizing the precision of ATSR using four different SST algorithms derived from a theoretical atmospheric transmission model, combined with various techniques to reduce remnant noise and other errors. We find that a "low‐noise" retrieval algorithm incorporating only the 3.7 and 11 μm nadir view channels gives the optimum precision: a global pixel precision of 0.26 K (or 0.25 K if 1/2° spatial averages are used). A standard deviation of 0.25 K against global drifting buoy data approaches the geophysical limit set by the inherent variability of the skin effect and by the accuracy of the buoy bulk temperatures. Further progress will require comparison against high‐quality in situ radiometer‐derived skin temperatures, although the problem of obtaining sufficiently large and diverse data sets will need to be addressed.
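The retrieval and validation steps summarized above can be sketched as follows. This is a minimal illustration only: the linear two-channel form is the generic shape of such retrievals, but the coefficient values and function names below are hypothetical placeholders, not the operational ATSR coefficients or the paper's actual matchup procedure.

```python
import math


def retrieve_sst(t37, t11, a0=-0.5, a1=1.0, a2=0.04):
    """Generic linear dual-channel SST retrieval: SST = a0 + a1*T(3.7) + a2*T(11).

    Brightness temperatures are in kelvin. The coefficients here are
    illustrative assumptions, not the published ATSR retrieval coefficients.
    """
    return a0 + a1 * t37 + a2 * t11


def matchup_std(satellite_sst, buoy_sst):
    """Standard deviation of satellite-minus-buoy differences.

    This is the precision statistic quoted in the abstract (e.g., 0.26 K per
    pixel); each pair is one satellite/buoy matchup.
    """
    diffs = [s - b for s, b in zip(satellite_sst, buoy_sst)]
    mean = sum(diffs) / len(diffs)
    return math.sqrt(sum((d - mean) ** 2 for d in diffs) / len(diffs))


# Toy usage: four matchups whose differences scatter by exactly 0.2 K.
buoy = [290.0, 291.0, 292.0, 293.0]
sat = [b + d for b, d in zip(buoy, [0.2, -0.2, 0.2, -0.2])]
print(matchup_std(sat, buoy))  # → 0.2
```

In practice the coefficients would be fitted from an atmospheric transmission model, and noise-reduction steps would be applied before the matchup statistics are computed.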