Interval algorithms compute an interval-valued function in which the solution to a system of ordinary differential equations is guaranteed to lie. These algorithms are based on differential inequalities, finite-difference approximations with remainders, Taylor series, defect corrections, or contractive iterations.
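As a concrete illustration, the sketch below implements a first-order enclosure step of the Taylor-series-with-remainder type on the toy problem y' = -y, y(0) = 1. All names (`f_interval`, `rough_enclosure`, `interval_euler`) and the inflation strategy are illustrative assumptions, not a standard API; a production verified solver would additionally use directed (outward) rounding, which plain floating-point arithmetic here does not provide.

```python
# Sketch of a first-order interval enclosure method for y' = f(y),
# shown on f(y) = -y, y(0) = 1.  Mean-value form of one step:
#     y(t+h)  in  [y] + h * f(B),
# where B is an a priori ("rough") enclosure of the solution over the step.
# This is a teaching sketch, not a rigorously rounded implementation.

def f_interval(lo, hi):
    """Interval extension of f(y) = -y: the image of [lo, hi] is [-hi, -lo]."""
    return (-hi, -lo)

def rough_enclosure(ylo, yhi, h):
    """Find B with  [y] + [0,h]*f(B)  contained in  B, by iterative inflation."""
    blo, bhi = ylo, yhi
    for _ in range(50):
        flo, fhi = f_interval(blo, bhi)
        # candidate enclosure: [y] + [0,h] * [flo, fhi]  (h > 0)
        clo = ylo + min(0.0, h * flo)
        chi = yhi + max(0.0, h * fhi)
        if blo <= clo and chi <= bhi:      # containment verified
            return blo, bhi
        # inflate B slightly beyond the candidate and retry
        blo = min(blo, clo) - 1e-9
        bhi = max(bhi, chi) + 1e-9
    raise RuntimeError("no a priori enclosure found; reduce the step size h")

def interval_euler(y0lo, y0hi, h, steps):
    """Advance the enclosure [ylo, yhi] by `steps` steps of size h."""
    ylo, yhi = y0lo, y0hi
    for _ in range(steps):
        blo, bhi = rough_enclosure(ylo, yhi, h)
        flo, fhi = f_interval(blo, bhi)
        # mean-value step: new enclosure = [y] + h * f(B)
        ylo, yhi = ylo + h * flo, yhi + h * fhi
    return ylo, yhi

lo, hi = interval_euler(1.0, 1.0, h=0.1, steps=10)
print(lo, hi)  # encloses the true value exp(-1) ~ 0.3679
```

The interval necessarily widens as integration proceeds (the "wrapping" effect of first-order enclosures); higher-order Taylor methods and contractive iterations exist precisely to slow this growth.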