In humans and many other animals, evolution has struck a trade-off between a hematocrit (erythrocyte volume fraction) high enough to bind sufficient oxygen and a blood viscosity low enough to allow rapid blood flow. The optimal value lies between the extreme cases of pure blood plasma, which can transport practically no oxygen, and a hematocrit of 1.0, at which blood would flow very slowly, if at all. As oxygen delivery to the tissues is the main task of the cardiovascular system, it is reasonable to expect that evolution has maximized it. Optimal hematocrit theory, which is based on this optimality principle, has successfully predicted hematocrit values of about 0.3-0.5, as indeed observed in the systemic circulation of humans and many animal species. Similarly, the theory can explain why a hematocrit above the normal range, about 0.5 to 0.7, can improve exertional performance. Here, we review theoretical approaches to calculating the optimal hematocrit under different conditions and discuss them in a broad physiological context. Several physiological and medical implications are outlined, for example with respect to blood doping, temperature adaptation, dehydration, and life at high altitudes.
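To make the optimality principle concrete, consider a minimal sketch assuming pressure-driven Poiseuille flow and an exponential dependence of blood viscosity on hematocrit, $\eta(h) = \eta_p e^{kh}$; this viscosity law and the value of $k$ are illustrative assumptions, standing in for the various models compared in the review. Oxygen delivery $J$ scales with the product of hematocrit $h$ and volumetric flow $Q$, and at a fixed driving pressure $Q \propto 1/\eta(h)$:

$$
J(h) \;\propto\; h\,Q(h) \;\propto\; \frac{h}{\eta(h)} \;=\; \frac{h}{\eta_p}\,e^{-kh}.
$$

Setting $dJ/dh = 0$ gives $h_{\mathrm{opt}} = 1/k$, so values of $k$ in the empirically plausible range of roughly 2 to 3.3 yield $h_{\mathrm{opt}} \approx 0.3$-$0.5$, consistent with the range quoted above.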