We report the spectral index of diffuse radio emission between 50 and 100 MHz from data collected with two implementations of the Experiment to Detect the Global EoR Signature (EDGES) low-band system. EDGES employs a wide-beam, zenith-pointing dipole antenna centred on a declination of $-26.7^\circ$. We measure the sky brightness temperature as a function of frequency, averaged over the EDGES beam, from 244 nights of data acquired between 14 September 2016 and 27 August 2017. We derive the spectral index, $\beta$, as a function of local sidereal time (LST) using night-time data and a two-parameter fitting equation. Ignoring ionospheric effects, we find $-2.59<\beta<-2.54$, with a total uncertainty of $\pm 0.011$, between 0 and 12 h LST. When the Galactic Centre is in the sky, the spectral index flattens, reaching $\beta = -2.46 \pm 0.011$ at 18.2 h. The measurements are stable throughout the observations, with night-to-night reproducibility of $\sigma_{\beta}<0.004$ except in the LST range of 7 to 12 h. Comparing our measurements with predictions from various global sky models, we find the closest match with the spectral index derived from the Guzm{\'a}n and Haslam sky maps, similar to the results found with the EDGES high-band instrument for 90--190 MHz. We also evaluated three-parameter fitting, with the result that the spectral index becomes more negative by $\sim$0.02 and has a maximum total uncertainty of 0.016, while the third parameter, the spectral index curvature, $\gamma$, is constrained to $-0.11<\gamma<-0.04$. Correcting for expected levels of night-time ionospheric absorption causes $\beta$ to become more negative by $0.008$--$0.016$, depending on LST.
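The abstract does not state the exact fitting equations, but spectral-index fits of this kind are commonly a power law $T(\nu) = T_0\,(\nu/\nu_0)^{\beta}$ (two parameters) optionally extended with a curvature term, $T(\nu) = T_0\,(\nu/\nu_0)^{\beta + \gamma\ln(\nu/\nu_0)}$ (three parameters). The sketch below, assuming those forms, a reference frequency of 75 MHz, and a hypothetical amplitude of 1500 K, recovers $\beta$ and $\gamma$ from a synthetic noiseless spectrum by least squares in log-log space; it is an illustration, not the paper's pipeline.

```python
import numpy as np

def fit_power_law(freq_mhz, temp_k, nu0=75.0, curvature=False):
    """Fit a (log-)power-law sky model by least squares in log-log space.

    Two-parameter form:   T(nu) = T0 * (nu/nu0)**beta
    Three-parameter form: T(nu) = T0 * (nu/nu0)**(beta + gamma*ln(nu/nu0))
    Returns (T0, beta) or (T0, beta, gamma).
    """
    x = np.log(np.asarray(freq_mhz, dtype=float) / nu0)
    y = np.log(np.asarray(temp_k, dtype=float))
    if curvature:
        # log T is quadratic in x: log T0 + beta*x + gamma*x**2
        gamma, beta, log_t0 = np.polyfit(x, y, 2)
        return np.exp(log_t0), beta, gamma
    beta, log_t0 = np.polyfit(x, y, 1)
    return np.exp(log_t0), beta

# Synthetic spectrum over the EDGES low band (50-100 MHz);
# beta = -2.55 and gamma = -0.08 lie in the ranges reported above.
nu = np.linspace(50.0, 100.0, 64)
t_sky = 1500.0 * (nu / 75.0) ** (-2.55 - 0.08 * np.log(nu / 75.0))

t0, beta, gamma = fit_power_law(nu, t_sky, curvature=True)
# For noiseless model data the fit recovers the input parameters.
```

In practice the fit would be applied per-LST-bin to beam-averaged night-time spectra, with ionospheric absorption corrections applied before or after the fit.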