The accurate detection of foot-strike and toe-off is often critical in the assessment of running biomechanics. The gold standard method for step event detection requires force data, which are not always available. Although kinematics-based algorithms can also be used, their accuracy and generalisability are limited, often requiring corrections for speed or foot-strike pattern. The purpose of this study was to develop FootNet, a novel kinematics and deep learning-based algorithm for the detection of step events in treadmill running. Five treadmill running datasets were gathered and processed to obtain segment and joint kinematics, and to identify the contact phase within each gait cycle using force data. The proposed algorithm is based on a long short-term memory recurrent neural network and takes the distal tibia anteroposterior velocity, ankle dorsiflexion/plantar flexion angle and the anteroposterior and vertical velocities of the foot centre of mass as input features to predict the contact phase within a given gait cycle. The chosen model architecture underwent 5-fold cross-validation and the final model was tested on a held-out subset of participants from each dataset (30%). Non-parametric Bland-Altman analyses (bias and [95% limits of agreement]) and root mean squared error (RMSE) were used to compare FootNet against the force-based step event detection method. The associations between detection errors and running speed, foot-strike angle and incline were also investigated. FootNet outperformed previously published algorithms (foot-strike bias = 0 [-10, 7] ms, RMSE = 5 ms; toe-off bias = 0 [-10, 10] ms, RMSE = 6 ms; and contact time bias = 0 [-15, 15] ms, RMSE = 8 ms) and proved robust to different running speeds, foot-strike angles and inclines. We have made FootNet's source code publicly available for step event detection in treadmill running when force data are not available.
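To illustrate the general idea of a per-time-step contact classifier as described above, the following minimal sketch (not the authors' released implementation) builds an LSTM sequence labeller in TensorFlow/Keras over the four kinematic input channels. The sequence length, layer size, optimiser settings and the synthetic data are assumptions made for demonstration only; the actual FootNet architecture, training protocol and post-processing are defined in the published source code.

```python
import numpy as np
import tensorflow as tf

# Hypothetical dimensions (not taken from the paper): each gait cycle is
# resampled to a fixed number of time steps, with 4 kinematic channels:
# distal tibia AP velocity, ankle dorsi/plantar flexion angle, and the
# AP and vertical velocities of the foot centre of mass.
N_TIMESTEPS = 200
N_FEATURES = 4

def build_model(n_timesteps=N_TIMESTEPS, n_features=N_FEATURES):
    """LSTM sequence labeller: per-time-step probability of ground contact."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(n_timesteps, n_features)),
        tf.keras.layers.LSTM(64, return_sequences=True),  # layer size is an assumption
        tf.keras.layers.TimeDistributed(
            tf.keras.layers.Dense(1, activation="sigmoid")
        ),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Synthetic stand-in data: real inputs would be time-normalised gait cycles
    # of the four kinematic signals, with force-derived contact labels.
    x = np.random.randn(32, N_TIMESTEPS, N_FEATURES).astype("float32")
    y = np.zeros((32, N_TIMESTEPS, 1), dtype="float32")
    y[:, 40:120, :] = 1.0  # arbitrary "contact" window for demonstration

    model = build_model()
    model.fit(x, y, epochs=1, batch_size=8, verbose=0)

    # Foot-strike / toe-off would be read off as the first / last time step
    # where the predicted contact probability exceeds 0.5.
    contact_prob = model.predict(x[:1], verbose=0)[0, :, 0]
    contact_idx = np.where(contact_prob > 0.5)[0]
    if contact_idx.size:
        print("foot-strike index:", contact_idx[0],
              "toe-off index:", contact_idx[-1])
```

In this sketch, foot-strike and toe-off are recovered from the predicted contact phase by thresholding the per-time-step contact probability; the thresholding rule shown here is illustrative rather than the evaluated method.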