Abstract

We have previously argued that the antigen receptors of T and B lymphocytes evolved to be sufficiently specific to avoid massive deletion of clonotypes by negative selection. Their optimal 'specificity' level, i.e., the probability of binding any particular epitope, was shown to be inversely related to the number of self-antigens to which the cells have to be tolerant. Experiments have demonstrated that T lymphocytes also become more specific during negative selection in the thymus, because cells expressing the most crossreactive receptors have the highest likelihood of binding a self-antigen, and hence of being tolerized (i.e., deleted, anergized, or diverted into a regulatory T cell phenotype). Thus, there are two (not mutually exclusive) explanations for the exquisite specificity of T cells: one involving evolution and the other thymic selection. To better understand the impact of both, we extend a previously developed mathematical model by allowing for T cells with very different binding probabilities in the pre-selection repertoire. We confirm that negative selection tends to tolerize the most crossreactive clonotypes. As a result, the average level of specificity in the functional post-selection repertoire depends on the number of self-antigens, even if there is no evolutionary optimization of binding probabilities. However, the evolutionarily optimal range of binding probabilities in the pre-selection repertoire also depends on the number of self-antigens. Species with more self-antigens need more specific pre-selection repertoires to avoid excessive loss of T cells during thymic selection, and hence to remain able to mount protective immune responses. We conclude that both evolution and negative selection are responsible for the high level of specificity of lymphocytes.
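
The trade-off summarized above can be illustrated with a toy calculation (a sketch, not the authors' exact model): if a clonotype binds any given epitope with probability p and must avoid binding all N self-antigens, it survives negative selection with probability (1 - p)^N; the probability that it both survives and recognizes a given foreign epitope, p(1 - p)^N, is maximal at p* = 1/(N + 1), so the optimal binding probability is inversely related to the number of self-antigens. The short Python sketch below, which assumes an illustrative log-uniform spread of binding probabilities in the pre-selection repertoire, shows how negative selection alone lowers the mean crossreactivity of the surviving repertoire, and does so more strongly when there are more self-antigens.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_crossreactivity(n_self, n_clones=100_000):
    """Toy negative selection: each clonotype has a binding probability p
    (its crossreactivity) and is tolerized if it binds at least one of
    n_self self-antigens, so it survives with probability (1 - p)**n_self."""
    # Illustrative assumption: binding probabilities spread log-uniformly
    # over several orders of magnitude in the pre-selection repertoire.
    p = 10.0 ** rng.uniform(-6, -2, n_clones)
    survives = rng.random(n_clones) < (1.0 - p) ** n_self
    return p.mean(), p[survives].mean()

# Mean binding probability before and after selection, for several
# numbers of self-antigens: the post-selection repertoire is more
# specific, and increasingly so as n_self grows.
for n_self in (10**3, 10**4, 10**5):
    pre, post = mean_crossreactivity(n_self)
    print(f"self-antigens={n_self:>6}  mean p: pre={pre:.2e}  post={post:.2e}")
```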
