Abstract

Designing optimal neural network (NN) architectures is a difficult and time-consuming task, especially when error resiliency and hardware efficiency must be considered simultaneously. In our paper, we extend neural architecture search (NAS) to optimize a NN's error resilience and hardware-related metrics in addition to classification accuracy. To this end, we consider the error sensitivity of a NN at the architecture level during NAS and additionally incorporate checksums into the network as an external error detection mechanism. With an additional computational overhead as low as 17% for the discovered architectures, checksums are an efficient method to effectively enhance the error resilience of NNs. Furthermore, the results show that cell-based NN architectures maintain their error resilience characteristics when transferred to other tasks.
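The paper does not spell out its checksum construction in the abstract, but a common way to attach checksums to a NN layer is algorithm-based fault tolerance (ABFT) for the underlying matrix multiplication. The sketch below (all names and the tolerance are illustrative assumptions, not the paper's scheme) precomputes a column-checksum vector for a linear layer y = W @ x and flags a fault when the output sum disagrees with the checksum prediction:

```python
import numpy as np

# Hypothetical ABFT-style checksum for a linear layer y = W @ x.
# Since sum(y) = (1^T W) @ x, a precomputed column-sum vector of W
# lets us verify the output with one extra dot product.
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 32))  # example layer weights
x = rng.standard_normal(32)        # example input activation

w_csum = W.sum(axis=0)             # checksum row: 1^T W

def checksum_fault(y, x, w_csum, rel_tol=1e-6):
    """Return True if the checksum test flags a computation error."""
    expected = w_csum @ x
    return abs(y.sum() - expected) > rel_tol * max(1.0, abs(expected))

y = W @ x
print(checksum_fault(y, x, w_csum))         # fault-free computation

y_faulty = y.copy()
y_faulty[7] += 1.0                          # simulate a single corrupted output
print(checksum_fault(y_faulty, x, w_csum))  # corrupted computation
```

The extra cost is one vector dot product and a sum per layer invocation, which is consistent in spirit with the low overhead the abstract reports, though the paper's actual mechanism and overhead accounting may differ.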
