Aspect-based sentiment analysis (ABSA) is a challenging task because a sentence may contain multiple aspect terms with different sentiment polarities. Recently, pre-trained language models such as BERT have been widely used as context encoders in ABSA. Graph neural networks have also been employed to extract syntactic and semantic information from sentence parse trees, achieving strong results. However, dependency trees may introduce irrelevant dependencies for sentences with irregular syntax and complex structure. Moreover, previous methods have not fully exploited recent advances in pre-trained language models. We therefore propose a Dual Syntax aware Graph attention networks with Prompt (DSGP) model to address these issues. Our model uses prompt templates to unlock the potential of pre-trained models, taking the masked-token outputs of the templates as supplementary aspect feature representations. It also applies graph attention networks to both dependency trees and constituent trees to extract complementary kinds of syntactic information: the dependency tree captures syntactic relations between words, while the constituent tree provides the high-level phrase structure of the sentence. Finally, the outputs from the prompt and the parse trees are fused and fed into a standard classifier. Experimental results on four public datasets demonstrate the competitive performance of our model.
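To make the dual-tree idea concrete, the following is a minimal sketch, not the authors' implementation: a simplified graph-attention pass in which each token attends only to its neighbors in an adjacency matrix, run once over a (hypothetical) dependency-tree graph and once over a (hypothetical) constituent-tree graph, with the two views fused by a simple elementwise average. In the actual model, node features would come from a BERT-based prompt encoder and the adjacency matrices from real parsers; all names and values here are illustrative.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def graph_attention(h, adj):
    """One simplified graph-attention pass: node i attends only to
    nodes j with adj[i][j] == 1, using dot-product scores."""
    out = []
    for i, hi in enumerate(h):
        neigh = [j for j in range(len(h)) if adj[i][j]]
        weights = softmax([dot(hi, h[j]) for j in neigh])
        out.append([sum(w * h[j][d] for w, j in zip(weights, neigh))
                    for d in range(len(hi))])
    return out

# Toy 3-token sentence with 2-dimensional token features.
h = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
dep_adj = [[1, 1, 0], [1, 1, 1], [0, 1, 1]]  # dependency edges (hypothetical)
con_adj = [[1, 0, 1], [0, 1, 1], [1, 1, 1]]  # constituent edges (hypothetical)

h_dep = graph_attention(h, dep_adj)   # syntactic-relation view
h_con = graph_attention(h, con_adj)   # phrase-structure view
# Fuse the two syntactic views (here: elementwise average).
fused = [[(a + b) / 2 for a, b in zip(r1, r2)]
         for r1, r2 in zip(h_dep, h_con)]
```

In the paper's pipeline the fused representation would then be combined with the prompt's masked-token output before the final sentiment classifier; the averaging fusion above is only a stand-in for whatever fusion the model actually learns.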