Deep neural network (DNN) techniques for aspect-based sentiment classification have been widely studied. The success of these methods depends largely on training data, which are often inadequate because of the effort involved in manually annotating large collections of opinionated texts. Attempts have been made to transfer knowledge from document-level to aspect-level sentiment tasks. However, the success of this approach also depends on the model, because aspect-level sentiment data, like other types of text, contain complex semantic features. In this paper, we present an attention-based deep learning technique that jointly learns from document-level and aspect-level sentiment data and transfers knowledge from the document-level data to aspect-level sentiment classification. It consists of a convolutional layer and a bidirectional long short-term memory (BiLSTM) layer. The first variant of our technique uses a convolutional neural network (CNN) to extract high-level semantic features; the output of the feature extraction is then fed into the BiLSTM layer, which captures the contextual feature representation of the texts. The second variant applies the BiLSTM layer directly to the input data. In both variants, the hidden representation is passed to an output layer with a softmax activation function for sentiment polarity classification. We evaluate our model on four standard benchmark datasets; the results show the effectiveness of our approach, with improvements over the baselines. We also conduct ablation studies to show the effect of different document-level weights on the learning techniques.
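The following is a minimal sketch of the first model variant described above: a convolutional layer that extracts local features from word embeddings, a BiLSTM that captures contextual representations, and a softmax output layer for polarity classification. The layer sizes, class names, and the single-task forward pass are illustrative assumptions; the attention mechanism and the joint document-level/aspect-level training described in the paper are omitted.

```python
# Hypothetical sketch of a CNN + BiLSTM sentiment classifier (not the authors'
# exact architecture); all dimensions and names are illustrative assumptions.
import torch
import torch.nn as nn


class CnnBiLstmClassifier(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=100, conv_channels=128,
                 kernel_size=3, hidden_dim=64, num_classes=3):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Convolution over the token dimension extracts local n-gram features.
        self.conv = nn.Conv1d(embed_dim, conv_channels, kernel_size, padding=1)
        # BiLSTM captures the contextual representation of the convolved features.
        self.bilstm = nn.LSTM(conv_channels, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        x = self.embedding(token_ids)            # (batch, seq_len, embed_dim)
        x = x.transpose(1, 2)                    # (batch, embed_dim, seq_len)
        x = torch.relu(self.conv(x))             # (batch, conv_channels, seq_len)
        x = x.transpose(1, 2)                    # (batch, seq_len, conv_channels)
        _, (h_n, _) = self.bilstm(x)             # h_n: (2, batch, hidden_dim)
        h = torch.cat([h_n[0], h_n[1]], dim=-1)  # final forward/backward states
        return torch.softmax(self.fc(h), dim=-1) # sentiment polarity distribution


if __name__ == "__main__":
    # Dummy batch of two token-id sequences of length 20.
    model = CnnBiLstmClassifier()
    dummy = torch.randint(0, 10000, (2, 20))
    print(model(dummy).shape)  # torch.Size([2, 3])
```

The second variant would simply drop the convolutional step and feed the embeddings directly into the BiLSTM, with the same softmax output layer on top.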