Abstract

In some domains, neural networks have become the only practical way to solve problems. Tasks such as image and sound recognition and classification require substantial processing power and memory for both training and inference. Modern mobile devices can handle the first layers of a deep neural network, but they lack the resources to run the whole network. Since neural networks for mobile devices are trained separately on external resources, a method was developed for distributed operation of a neural network with vertical partitioning over sets of layers and synchronization of the training data. The model is split after its state is saved; the layers assigned to the mobile device are converted to the mobile framework's format and synchronized with the device after training on a distributed platform. Variables and coefficients are packaged separately, which significantly reduces the size of the neural network data file uploaded to the device. An algorithm for automatically selecting the partition point of the network was proposed; it is based on the amount of data transferred between layers and the load on the mobile device's resources. The approach makes it possible to use full-size deep neural networks with a mobile device. A performance experiment showed that an acceptable response can be obtained even over an unstable communication channel, without overloading either the channel or the device's resources.
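The abstract does not give the partition-point algorithm itself, only its two criteria: the amount of data transferred between layers and the load on the device. A minimal sketch of one plausible heuristic consistent with those criteria is shown below; the `Layer` type, the cost weights `alpha` and `beta`, and the `device_budget` constraint are all illustrative assumptions, not the paper's actual method.

```python
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    output_bytes: int   # size of the activation tensor sent to the next layer
    device_cost: float  # estimated compute cost if this layer runs on the device

def select_split_point(layers, device_budget, alpha=1.0, beta=1.0):
    """Return the index of the last layer to run on the mobile device
    (layers[:i+1] on device, the rest on the server), minimizing a weighted
    sum of inter-layer transfer size and cumulative on-device load, subject
    to a device capacity budget. Hypothetical heuristic, not the paper's."""
    best_i, best_cost = None, float("inf")
    cumulative = 0.0
    for i, layer in enumerate(layers):
        cumulative += layer.device_cost
        if cumulative > device_budget:
            break  # the device cannot host any further layers
        cost = alpha * layer.output_bytes + beta * cumulative
        if cost < best_cost:
            best_i, best_cost = i, cost
    return best_i

# Example: cutting after the second layer is cheapest here, because its
# activation is small and the third layer would exceed the device budget.
layers = [Layer("conv1", 1000, 5.0), Layer("conv2", 200, 10.0), Layer("fc", 50, 100.0)]
split = select_split_point(layers, device_budget=20.0)
```

With these numbers the heuristic cuts after `conv2`: its small activation keeps the transfer term low, while `fc` would blow past the device budget.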
