Tracking live cells across 2D, 3D, and multi-channel time-lapse recordings is crucial for understanding tissue-scale biological processes. Despite advances in imaging technology, accurate cell tracking remains challenging, particularly in complex and crowded tissues where cell segmentation is often ambiguous. We present Ultrack, a versatile and scalable cell-tracking method that tackles this challenge by considering candidate segmentations derived from multiple algorithms and parameter sets. Ultrack employs temporal consistency to select optimal segments, ensuring robust performance even under segmentation uncertainty. We validate our method on diverse datasets, including terabyte-scale developmental time-lapses of zebrafish, fruit fly, and nematode embryos, as well as multi-color and label-free cellular imaging. We show that Ultrack achieves state-of-the-art performance on the Cell Tracking Challenge and tracks densely packed embryonic cells over extended periods more accurately than existing methods. Moreover, we propose a tracking-validation approach based on dual-channel sparse labeling that enables high-fidelity ground-truth generation, pushing the boundaries of long-term cell-tracking assessment. Our method is freely available as a Python package with Fiji and napari plugins and can be deployed in high-performance computing environments, facilitating widespread adoption by the research community.
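To make the segment-selection idea concrete, below is a minimal, self-contained Python sketch of how temporal consistency can arbitrate between conflicting candidate segments: each candidate is scored by its best overlap (IoU) with candidates in adjacent frames, and a non-overlapping subset is kept greedily per frame. All names (`iou`, `temporal_score`, `select_consistent`) are hypothetical, and the greedy heuristic only conveys the intuition; it is not Ultrack's actual selection algorithm.

```python
import numpy as np


def iou(a: np.ndarray, b: np.ndarray) -> float:
    """Intersection-over-union of two boolean masks."""
    union = np.logical_or(a, b).sum()
    return float(np.logical_and(a, b).sum()) / union if union else 0.0


def temporal_score(candidate: np.ndarray, neighbors: list) -> float:
    """Best IoU of a candidate segment against any segment in adjacent frames."""
    return max((iou(candidate, n) for n in neighbors), default=0.0)


def select_consistent(frames: list) -> list:
    """Per frame, keep a non-overlapping subset of candidate masks,
    preferring candidates with the strongest support in adjacent frames.

    frames: list (over time) of lists of boolean masks (candidate segments).
    """
    selected = []
    for t, candidates in enumerate(frames):
        if not candidates:
            selected.append([])
            continue
        neighbors = []
        if t > 0:
            neighbors.extend(frames[t - 1])
        if t + 1 < len(frames):
            neighbors.extend(frames[t + 1])
        # Rank candidates by temporal support, then greedily keep those
        # that do not overlap an already-selected segment.
        ranked = sorted(candidates,
                        key=lambda c: temporal_score(c, neighbors),
                        reverse=True)
        occupied = np.zeros(candidates[0].shape, dtype=bool)
        chosen = []
        for c in ranked:
            if not np.logical_and(c, occupied).any():
                chosen.append(c)
                occupied |= c
        selected.append(chosen)
    return selected


# Toy example: in the middle frame, two candidates conflict for the same
# cell, an accurate mask and an over-merged one; the accurate mask has
# stronger support in the adjacent frames and wins.
cell = np.zeros((8, 8), dtype=bool)
cell[2:5, 2:5] = True
merged = np.zeros((8, 8), dtype=bool)
merged[2:7, 2:7] = True
result = select_consistent([[cell], [cell, merged], [cell]])
assert len(result[1]) == 1 and np.array_equal(result[1][0], cell)
```

A greedy, frame-local rule like this is only a stand-in: because segment choices in one frame constrain those in neighboring frames, a principled formulation treats selection across the whole time-lapse as a single optimization problem, which is the spirit of the temporal-consistency selection the abstract describes.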