Abstract

The advent of energy-efficient, embedded Deep Learning accelerators has brought inference capabilities to Internet of Things devices in a pervasive manner. Once limited to passive data gathering, such devices are now able to take an active part in data processing and predictive tasks with acceptable performance. Shifting this computation to the edge enables the creation of interconnected environments with efficient, low-powered inference capabilities that do not depend on external services in the cloud. Despite significant recent advancements, the field of Edge-ML is still maturing. It is therefore important to develop a framework for evaluating the performance of off-the-shelf hardware and state-of-the-art Deep Learning models suited to low-powered devices, a framework that can also be applied to new devices and models as they become available. This paper describes such an evaluation framework, together with a broad study of different Edge-ML devices, comparing them in terms of performance, capabilities, limitations, and possible applications, with a focus on deploying state-of-the-art Object Detection models. The end objective is to enable low-latency, independent decision-making processes in both civilian and military contexts.
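
As an illustration of the kind of measurement such an evaluation framework performs, the sketch below times repeated object-detection inferences with the TensorFlow Lite Python interpreter. The model file name, input handling, and use of the `tflite_runtime` package are assumptions made for this example and are not taken from the paper's own tooling.

```python
# Minimal per-frame latency benchmark sketch for an edge object detector.
# Assumptions: a TFLite-converted detection model at "detector.tflite" and
# the tflite_runtime package installed on the edge device.
import time

import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path="detector.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()

# Synthetic input matching the model's expected shape and dtype.
shape = input_details[0]["shape"]
dtype = input_details[0]["dtype"]
dummy = np.random.randint(0, 256, size=shape).astype(dtype)

# Warm-up run so one-time allocation costs do not skew the measurement.
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

# Timed runs to estimate mean inference latency per frame.
runs = 100
start = time.perf_counter()
for _ in range(runs):
    interpreter.set_tensor(input_details[0]["index"], dummy)
    interpreter.invoke()
elapsed = time.perf_counter() - start

print(f"Mean inference latency: {1000 * elapsed / runs:.2f} ms")
```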
