Since Fisher (1930) first conjectured how linkage between two loci could have been reduced, many models have been constructed and analyzed to establish more rigorously how natural selection may have produced tightly linked clusters of loci and the evolution of supergenes. The usual approach is to set up a deterministic random-mating model with viabilities and recombination. The polymorphic equilibria are determined, and conditions on the selection parameters and recombination fraction are ascertained under which these equilibria are stable. A crossover-reducing mechanism is then supposed to arise on one of the chromosomes at equilibrium, and the progress of this chromosome is examined as a function of the altered recombination fraction and the equilibrium frequencies. Kimura (1956) used this approach for a special symmetric viability setup. Bodmer and Parsons (1962) treated the same problem with more general, but still symmetric, viabilities. Nei (1967) introduced a model in which a third locus, independent of the two under investigation, determines the extent of crossing over. Turner (1967) gave a somewhat heuristic treatment of the problem with general viabilities. These authors all found that a population in equilibrium under linkage and selection will be subject to selection for a decrease in the recombination fraction.

Most of the models analyzed so far can be realized as particular cases of what has become known as the symmetric viability model, involving two loci with two alleles at each locus. A detailed discussion of this model and some of its properties was given by Bodmer and Felsenstein (1967). The four chromosomes AB, Ab, aB, and ab occur with frequencies x1, x2, x3, and x4, respectively, and the fitnesses of the genotypes are as given in Table I.
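The dynamics underlying this class of models can be sketched with the standard two-locus, two-allele selection-recombination recursion. The sketch below is a generic illustration under assumed notation, not the specific equations of any one paper cited above: W[i][j] denotes the fitness of the genotype formed by chromosomes i and j (with the chromosome ordering AB, Ab, aB, ab), r is the recombination fraction, and the linkage disequilibrium is D = x1 x4 - x2 x3.

```python
def next_generation(x, W, r):
    """One generation of the deterministic two-locus recursion.

    x : list of 4 chromosome frequencies (AB, Ab, aB, ab), summing to 1
    W : 4x4 symmetric fitness matrix; W[i][j] is the fitness of the
        genotype carrying chromosomes i and j
    r : recombination fraction between the two loci
    """
    # Linkage disequilibrium D = x1*x4 - x2*x3 (in the text's numbering).
    D = x[0] * x[3] - x[1] * x[2]
    # Marginal fitness of each chromosome.
    w = [sum(W[i][j] * x[j] for j in range(4)) for i in range(4)]
    # Mean fitness of the population.
    wbar = sum(x[i] * w[i] for i in range(4))
    # Fitness of the double heterozygote AB/ab.
    w14 = W[0][3]
    # Recombination moves frequency between the coupling chromosomes
    # (AB, ab) and the repulsion chromosomes (Ab, aB) at rate r*w14*D.
    sign = [-1, +1, +1, -1]
    return [(x[i] * w[i] + sign[i] * r * w14 * D) / wbar for i in range(4)]
```

With all fitnesses equal, the recursion reduces to the classical result that disequilibrium decays geometrically, D' = (1 - r) D; with viability differences, iterating this map locates the polymorphic equilibria whose stability the cited analyses characterize.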