Abstract

Efficiently executing deep neural network inference on resource-constrained devices poses a significant challenge in IoT systems. One way to alleviate this load is branching: adding extra layers with classification exits to a pre-trained model so that inputs with high-confidence predictions can exit early, reducing inference cost. However, branching networks were not originally designed for IoT environments; they are susceptible to noisy and out-of-distribution (OOD) data, and they require additional training to perform well. The authors introduce BrevisNet, a novel branching methodology for building on-device branching models that are both resource-adaptive and noise-robust in IoT applications. The method combines the refined uncertainty estimation that Dirichlet distributions provide for classification predictions with the superior OOD detection of energy-based models. The authors propose a training approach and thresholding technique that improve the precision of branch predictions, offering robustness against noise and OOD inputs. The findings demonstrate that BrevisNet surpasses existing branching techniques in training efficiency, accuracy, overall performance, and robustness.
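To make the early-exit mechanism concrete, the following is a minimal sketch of energy-thresholded branching inference. It is an illustration under assumptions, not the authors' actual BrevisNet implementation: the function names (`energy_score`, `branched_inference`), the threshold value, and the representation of branches as logit-producing callables are all hypothetical; BrevisNet additionally uses Dirichlet-based uncertainty, which is omitted here.

```python
import numpy as np

def energy_score(logits, temperature=1.0):
    """Standard energy-based confidence score: E(x) = -T * logsumexp(logits / T).
    Lower energy indicates a more confident, in-distribution prediction."""
    z = np.asarray(logits, dtype=float) / temperature
    m = z.max()  # subtract the max for a numerically stable logsumexp
    return -temperature * (m + np.log(np.exp(z - m).sum()))

def branched_inference(branches, x, energy_threshold):
    """Run exit branches in order and stop at the first one whose energy
    falls below the (hypothetical, pre-calibrated) threshold.
    `branches` is a list of callables mapping an input to class logits."""
    for i, branch in enumerate(branches):
        logits = branch(x)
        if energy_score(logits) < energy_threshold:
            return i, int(np.argmax(logits))  # confident: exit early here
    # No branch was confident enough: fall back to the final exit's prediction.
    return len(branches) - 1, int(np.argmax(logits))
```

A sharply peaked logit vector yields a lower (more negative) energy than a flat one, so confident inputs exit at shallow branches while ambiguous, noisy, or OOD inputs propagate to deeper exits or are flagged.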