The Soil Moisture and Ocean Salinity (SMOS) mission was selected in May 1999 by the European Space Agency to provide global and frequent soil moisture and sea surface salinity maps. SMOS's single payload is the Microwave Imaging Radiometer by Aperture Synthesis (MIRAS), an L-band two‐dimensional aperture synthesis interferometric radiometer with multiangular observation capabilities. Most studies of geophysical parameter retrieval errors have assumed that measurements are independent in both time and space, so that the standard deviation of the retrieval errors decreases with the inverse of the square root of the number of measurements being averaged. This assumption is especially critical in the case of sea surface salinity (SSS), where spatiotemporal averaging is required to achieve the ultimate goal of 0.1 psu error. This work presents a detailed study of the SSS error reduction by spatiotemporal averaging, using the SMOS end‐to‐end performance simulator (SEPS), including thermal noise, all instrumental error sources, current error correction and image reconstruction algorithms, and correction of atmospheric and sky noise. The most important error sources are the biases that appear in the brightness temperature images. Three different sources of bias have been identified: errors in the noise injection radiometers, Sun contributions to the antenna temperature, and imaging under aliasing conditions. A calibration technique has been devised to correct these biases prior to the SSS retrieval at each satellite overpass. Simulation results show a retrieved salinity error of 0.2 psu in warm open ocean, and up to 0.7 psu at high latitudes and near the coast, where the external calibration method presents more difficulties.
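The central point above, that the 1/√N error reduction assumed by most retrieval studies breaks down when measurements share a common bias, can be illustrated with a minimal Monte Carlo sketch. The numbers here (per-measurement noise, bias level, sample counts) are hypothetical and chosen only for illustration; they are not SMOS/SEPS values.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100          # number of measurements averaged together
trials = 20000   # Monte Carlo repetitions
sigma = 1.0      # per-measurement random noise std (arbitrary units)

# Independent noise only: the std of the mean shrinks as sigma/sqrt(N).
indep = rng.normal(0.0, sigma, size=(trials, N)).mean(axis=1)

# Now add a shared bias per trial (analogous to an uncorrected
# brightness temperature bias common to all averaged measurements):
# averaging reduces the random part but leaves the bias untouched,
# so the residual error floors at the bias level.
bias = rng.normal(0.0, 0.3, size=(trials, 1))
biased = (rng.normal(0.0, sigma, size=(trials, N)) + bias).mean(axis=1)

print(indep.std())   # ~ sigma/sqrt(N) = 0.1
print(biased.std())  # ~ sqrt(sigma**2/N + 0.3**2), dominated by the bias
```

This is why the abstract emphasizes identifying and calibrating out the brightness temperature biases before retrieval: without that step, spatiotemporal averaging alone cannot reach the 0.1 psu goal.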