Abstract

A digital computer algorithm is developed for on-line time differentiation of sampled analog voltage signals. The derivative is obtained by employing a least mean squares technique. The recursive algorithm results in a considerable reduction in computer time compared with solving the normal equations anew each time a new data point is accepted. Implementation of the algorithm on a digital computer is discussed. Examples are simulated on a DEC PDP-8 computer.
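For illustration, the following is a minimal sketch (in Python, not the paper's implementation) of one way such a recursive least-squares differentiator can be organized: the slope of a straight-line fit over a sliding window of the most recent samples serves as the derivative estimate, and the running sums that enter the normal equations are updated incrementally as each new sample replaces the oldest one, rather than being recomputed from scratch. The class name, window-based formulation, and parameters are illustrative assumptions.

```python
from collections import deque

class RecursiveLSQDifferentiator:
    """Sliding-window least-squares slope used as a derivative estimate.

    The running sums S1 = sum(y_k) and Sk = sum(k * y_k) are updated
    recursively as each sample enters and the oldest leaves, so the
    normal equations never have to be re-solved from scratch.
    """

    def __init__(self, window: int, dt: float):
        self.n = window                          # samples per fit (window length)
        self.dt = dt                             # sampling interval in seconds
        self.buf = deque()                       # y_0 ... y_{n-1}, oldest first
        self.s1 = 0.0                            # running sum of y_k
        self.sk = 0.0                            # running sum of k * y_k
        n = window
        self.sum_k = n * (n - 1) / 2.0           # sum of indices 0 .. n-1
        self.denom = n * n * (n * n - 1) / 12.0  # n*sum(k^2) - (sum k)^2

    def update(self, y: float):
        """Accept one sample; return d/dt estimate, or None until the window fills."""
        if len(self.buf) == self.n:
            y_old = self.buf.popleft()
            self.s1 -= y_old      # drop the oldest sample (it sat at index 0)
            self.sk -= self.s1    # shifting the remaining indices down by one
        self.buf.append(y)
        k_new = len(self.buf) - 1
        self.s1 += y
        self.sk += k_new * y
        if len(self.buf) < self.n:
            return None
        # Slope of the least-squares straight line fitted to y_k at times k*dt.
        return (self.n * self.sk - self.sum_k * self.s1) / (self.denom * self.dt)
```

As a usage sketch, feeding the ramp 0, 1, 2, 3 (with dt = 1) into a three-sample window returns a slope estimate of 1.0 once the window is full and after each subsequent update, each update costing only a handful of additions and multiplications.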
