The accelerated pace of discovery of disease-causing genes has made efficient mutation detection a priority. Identification of DNA sequence variants allows human geneticists, among other things, to determine whether a candidate gene contributes to disease susceptibility, to identify new alleles at known loci, and to develop molecular diagnostic tests. Once a susceptibility gene has been identified, the investigator interested in molecular diagnostics or genotype–phenotype correlation must decide how to perform rapid, sensitive detection of mutations.

Grompe (1) divided these techniques into those most useful for identifying known mutations and those better suited to detecting novel mutations, while allowing that no single method would be appropriate for all situations. Clearly, the choices investigators make will depend, in part, on what they are trying to accomplish (e.g., diagnostics vs population studies of allele frequencies); the goal will also dictate the level of sensitivity required and, correspondingly, the proportion of false positives or false negatives that can be tolerated. Other factors, such as knowledge of genomic size, sequence, and structure; availability of RNA/cDNA; and access to equipment and technical expertise, will also influence the decision.

Since its identification in 1993 and 1994 as the (or, at least, a) major gene responsible for multiple endocrine neoplasia types 2A and 2B (MEN 2A, MEN 2B) and Hirschsprung disease (HSCR), respectively, the RET receptor tyrosine kinase has provided a real-time case study of some of the more vexing aspects of both the detection and interpretation of mutations (2)(3)(4)(5)(6)(7). The latter problem, namely, correlating genotype with phenotype, is a fascinating one and has been considered elsewhere (8)(9)(10). The former issue, determining the best approach to detecting RET mutations for a given set of patients, is easier …