Reliability in general, and in water distribution systems in particular, is a measure of probabilistic performance. A system is said to be reliable if it functions properly over a given time interval and under given boundary conditions. Although water distribution system reliability has attracted considerable research attention over the last three decades, there is still no consensus on which reliability measures or evaluation methodologies should be used for the design and operation of water distribution systems. No system is perfectly reliable. In every system, undesirable events (failures) can cause a decline or interruption in system performance. Failures are stochastic in nature and result from unpredictable events that occur in the system itself and/or in its environs. A least-cost design problem with normal design loadings will result in the cheapest system, but such a system has minimal residual capacity. However, if the system is designed for an increased loading (i.e., higher than the normal design loading), its capacity, and thus its residual capacity, is increased. Finding this "virtual increased loading," which yields a minimum-cost system whose residual capacity sustains a required reliability level, is the essence of the proposed methodology, which follows a decomposition approach. The methodology is demonstrated on two example applications of increasing complexity. The main limitation of the suggested method with respect to extensions to real-sized water distribution systems is the computational effort associated with solving the "inner" problem. Exploring how the computational burden is divided between the "outer" and "inner" problems is a major challenge for future elaborations of this approach.
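To make the outer/inner decomposition concrete, the following is a minimal Python sketch of the idea described above: an outer search over candidate "virtual increased loading" factors, with an inner least-cost design solved for each candidate and a reliability check on the resulting system. The functions inner_least_cost_design and estimate_reliability are hypothetical placeholders introduced here for illustration only (they are not the authors' formulation); a real implementation would couple a hydraulic solver with a pipe-sizing optimizer and a stochastic component-failure model.

```python
from dataclasses import dataclass


@dataclass
class Design:
    loading_factor: float  # virtual loading multiplier applied to normal demands
    cost: float            # cost of the least-cost design for that amplified loading
    reliability: float     # estimated reliability under normal loading with failures


def inner_least_cost_design(loading_factor: float) -> float:
    """Inner problem (placeholder): least-cost design for the amplified loading.

    Here cost simply grows with the loading factor; in practice this would be a
    full least-cost design optimization under the virtual increased loading.
    """
    return 100.0 * loading_factor ** 1.5


def estimate_reliability(loading_factor: float) -> float:
    """Placeholder reliability estimate: more residual capacity -> higher reliability.

    A real evaluation would simulate random component failures and check whether
    normal demands can still be supplied within pressure constraints.
    """
    return 1.0 - 0.2 / loading_factor


def outer_search(required_reliability: float,
                 factors=(1.0, 1.1, 1.2, 1.3, 1.5, 1.75, 2.0)) -> Design:
    """Outer problem: among the candidate virtual loadings, keep the cheapest
    least-cost design whose reliability meets the required level."""
    best = None
    for f in factors:
        cost = inner_least_cost_design(f)
        rel = estimate_reliability(f)
        if rel >= required_reliability and (best is None or cost < best.cost):
            best = Design(f, cost, rel)
    if best is None:
        raise ValueError("no candidate loading met the required reliability")
    return best


if __name__ == "__main__":
    print(outer_search(required_reliability=0.9))
```

The sketch also makes the stated limitation visible: each outer iteration requires a full solution of the inner least-cost design problem, so the cost of the inner solver dominates the overall computational burden.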