Abstract

Oceanic stratocumulus cloud decks are among the largest contributors to the Earth's radiation budget, covering around a fifth of the planet's surface and reflecting a large part of the incoming solar radiation. Unfortunately, these clouds are poorly represented in modern climate models, which makes them one of the leading sources of uncertainty in climate change projections. This contribution analyses the issue from a novel perspective and sheds light on the mechanisms behind the misrepresentation and resolution dependence of these extensive cloud decks. The analysis is based on realistic week-long simulations performed over a large oceanic domain. Four horizontal resolutions, between 4.4 and 0.55 km, are employed, making the investigation especially timely in light of the high resolutions targeted by present and near-future climate projections. Results show that the liquid cloud water, the main contributor to the simulated grid-scale clouds, decreases with a power-law decay as the resolution increases, whereas the water vapour, responsible for subgrid-scale clouds, is much less affected by the grid spacing. The leading cause is identified as an imbalance between the rates of change of the advection and turbulence-parametrisation terms. To verify this observation and provide a possible mitigation, a second set of simulations is performed in which the turbulence parametrisation is tuned. The strategy proves successful, confirming the hypothesis and yielding not only a resolution-independent radiation budget but also resolution-independent cloud cover.
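The power-law decay mentioned above can be illustrated with a short fit. The sketch below is not from the paper: the grid spacings follow the abstract (assumed to halve from 4.4 to 0.55 km), and the liquid-water values are hypothetical placeholders chosen only to show how the exponent of such a decay would be estimated from four resolutions.

```python
# Minimal sketch (assumptions, not paper results): fit a power law
# q_l(dx) = a * dx**b to domain-mean liquid water across four grid
# spacings. Grid spacings assume halving from 4.4 to 0.55 km; the
# q_l values are hypothetical placeholders for illustration only.
import numpy as np

dx = np.array([4.4, 2.2, 1.1, 0.55])      # grid spacing [km] (assumed halving)
q_l = np.array([42.0, 33.0, 26.0, 20.5])  # illustrative liquid water path [g m^-2]

# A power law is linear in log-log space: log q_l = log a + b * log dx,
# so an ordinary least-squares fit recovers the exponent b directly.
b, log_a = np.polyfit(np.log(dx), np.log(q_l), deg=1)
a = np.exp(log_a)

print(f"q_l ~ {a:.1f} * dx^{b:.2f}")  # b > 0 means less liquid water on finer grids
```

Fitting in log-log space turns the power law into a straight line, so a plain least-squares regression yields the decay exponent; a positive exponent corresponds to the reported loss of grid-scale liquid cloud water as the grid spacing shrinks.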