Wild populations are increasingly threatened by human-mediated climate change and land-use change. As populations decline, the probability of inbreeding increases, along with the potential for negative effects on individual fitness. Detecting and characterizing runs of homozygosity (ROHs) is a popular strategy for assessing the extent of individual inbreeding present in a population and can also shed light on the genetic mechanisms contributing to inbreeding depression. Here, we analyze simulated and empirical datasets to demonstrate the downstream effects of program selection and long-term demographic history on ROH inference, leading to context-dependent biases in the results. Through a sensitivity analysis, we evaluate how different parameter values affect ROH-calling results, highlighting the utility of sensitivity analysis as a tool for parameter exploration. Our results indicate that ROH inferences are sensitive to factors such as sequencing depth and ROH length distribution, with the direction and magnitude of bias varying with demographic history and the programs used. Estimation biases are particularly pronounced at lower sequencing depths, potentially leading to either underestimation or overestimation of inbreeding. These results are especially important for the management of endangered species, as underestimating inbreeding signals in the genome can substantially undermine conservation initiatives. We also find that small true ROHs can be incorrectly lumped together and called as longer ROHs, leading to erroneous inference of recent inbreeding. To address these challenges, we suggest using a combination of ROH detection tools and ROH length-specific inferences, along with sensitivity analysis, to generate robust, context-appropriate population-level inferences about inbreeding history. We outline these recommendations for ROH estimation at multiple levels of sequencing effort typical of conservation genomics studies.
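The recommendation to combine ROH detection tools and to stratify inferences by ROH length class can be illustrated with a small post-processing sketch. The snippet below is not taken from the paper: the input file names, column names, autosome length, and length-class boundaries are illustrative assumptions. It simply computes per-individual F_ROH (summed ROH length divided by autosome length), both in total and per length class, from the output tables of two ROH callers, so that between-caller discrepancies can be inspected.

```python
"""Minimal sketch (assumptions, not the paper's pipeline): compare per-individual
F_ROH estimates from two ROH callers, partitioned by ROH length class."""
import pandas as pd

AUTOSOME_LENGTH_BP = 2.1e9  # hypothetical assembly size; replace with your genome

# Hypothetical length classes (bp): long ROHs tend to reflect recent inbreeding,
# short ROHs older background relatedness.
LENGTH_BINS = {
    "short (<0.5 Mb)": (0, 5e5),
    "medium (0.5-2 Mb)": (5e5, 2e6),
    "long (>2 Mb)": (2e6, float("inf")),
}

def froh_by_class(roh_table: pd.DataFrame) -> pd.Series:
    """F_ROH per length class = summed ROH length in class / autosome length."""
    lengths = roh_table["end_bp"] - roh_table["start_bp"]
    out = {}
    for label, (lo, hi) in LENGTH_BINS.items():
        out[label] = lengths[(lengths >= lo) & (lengths < hi)].sum() / AUTOSOME_LENGTH_BP
    out["total"] = lengths.sum() / AUTOSOME_LENGTH_BP
    return pd.Series(out)

# Hypothetical inputs: per-individual ROH calls exported from two different
# programs into tables with columns individual, chrom, start_bp, end_bp.
calls_a = pd.read_csv("caller_A_rohs.tsv", sep="\t")
calls_b = pd.read_csv("caller_B_rohs.tsv", sep="\t")

summary = pd.concat({
    "caller_A": calls_a.groupby("individual").apply(froh_by_class),
    "caller_B": calls_b.groupby("individual").apply(froh_by_class),
})
print(summary)  # large between-caller differences flag depth- or parameter-sensitive calls
```

In this spirit, repeating the comparison across depths or caller parameter settings gives a simple sensitivity check of the kind the abstract recommends.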