Aspect-based sentiment analysis (ABSA) is the task of identifying fine-grained opinion polarity towards a specific target in a sentence, a capability that enriches the interaction abilities of experts and intelligent systems. Most approaches to date capture semantic relations between the target and context words using RNNs (Recurrent Neural Networks) or pre-trained models (e.g., BERT). However, due to their computational complexity and size, these models are usually hosted in the cloud; enabling ABSA models to run on resource-constrained end-devices with quick response times remains challenging and not yet well studied. This paper presents the Distillation Network (DNet), a lightweight and efficient sentiment analysis model based on gated convolutional neural networks for on-device inference. By combining stacked gated convolutions with an attention mechanism, DNet progressively distills aspect-aware context information from unstructured text, achieving high accuracy with lower inference latency and a smaller model size. Experiments on the SemEval 2014 Task 4 and ACL14 Twitter datasets demonstrate that our approach achieves state-of-the-art performance. Furthermore, compared with a BERT-based model, DNet reduces the model size by more than a factor of 50 and improves responsiveness by a factor of 24.
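The gated-convolution-plus-attention idea described above can be sketched as follows. This is an illustrative NumPy toy, not the authors' exact DNet architecture: the layer sizes, the GTU-style tanh/sigmoid gating, and the way the aspect embedding conditions the gate and the attention pooling are all assumptions for demonstration.

```python
import numpy as np

def conv1d(x, w, b):
    """Valid 1D convolution: x is (seq_len, d_in), w is (k, d_in, d_out)."""
    k, d_in, d_out = w.shape
    out = np.zeros((x.shape[0] - k + 1, d_out))
    for t in range(out.shape[0]):
        # Contract the kernel window against the filter bank.
        out[t] = np.tensordot(x[t:t + k], w, axes=([0, 1], [0, 1])) + b
    return out

def gated_conv(x, w_a, b_a, w_g, b_g, aspect):
    """GTU-style gated convolution: a tanh feature path modulated by a
    sigmoid gate that is conditioned on the aspect embedding (assumed design)."""
    features = np.tanh(conv1d(x, w_a, b_a))
    gate = 1.0 / (1.0 + np.exp(-(conv1d(x, w_g, b_g) + aspect)))
    return features * gate

def attention_pool(h, aspect):
    """Dot-product attention against the aspect vector, then a weighted sum
    over time steps to produce a fixed-size sentence representation."""
    scores = h @ aspect
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ h

# Toy usage with hypothetical dimensions: 7 context tokens, width-3 kernels,
# 4-dimensional embeddings.
rng = np.random.default_rng(0)
seq_len, d, k = 7, 4, 3
x = rng.normal(size=(seq_len, d))          # context word embeddings
aspect = rng.normal(size=d)                # aspect/target embedding
w_a = rng.normal(size=(k, d, d)); b_a = np.zeros(d)
w_g = rng.normal(size=(k, d, d)); b_g = np.zeros(d)

h = gated_conv(x, w_a, b_a, w_g, b_g, aspect)  # (seq_len - k + 1, d)
v = attention_pool(h, aspect)                  # fixed-size, aspect-aware vector
```

In the real model such blocks would be stacked so each layer filters the context further under the aspect's guidance, which is what "progressively distilling aspect-aware context information" refers to; the gate keeps every activation bounded, and the final pooled vector `v` would feed a small softmax classifier over sentiment polarities.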