Previous methods that incorporate knowledge graphs (KGs) into neural machine translation (NMT) adopt a static knowledge utilization strategy, which introduces many useless knowledge triples and makes the useful triples difficult for NMT to exploit. To address this problem, we propose a KG-guided NMT model with dynamically reinforce-selected triples. The proposed method dynamically selects different useful knowledge triples for different source sentences. Specifically, the model consists of two components: 1) a knowledge selector, which dynamically selects useful knowledge triples for a given source sentence, and 2) a knowledge-guided NMT model (KgNMT), which utilizes the selected triples to guide translation. Furthermore, to overcome the non-differentiability of the selection step and to guide the training procedure, we propose a policy gradient strategy that encourages the model to select useful triples and to improve the generation probability of the gold target sentence. Extensive experimental results show that the proposed method significantly outperforms baseline models in both translation quality and entity handling.
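The policy gradient strategy for triple selection can be illustrated with a minimal REINFORCE-style sketch. Everything here is a toy assumption, not the paper's implementation: each candidate triple is reduced to a single hypothetical relevance feature, the selection policy is an independent Bernoulli per triple, and the reward is a synthetic match score; in the actual model the reward would instead come from the generation probability of the gold target sentence under KgNMT.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy setup: one scalar "relevance" feature per candidate triple
# (a hypothetical stand-in for a sentence-triple similarity score).
# Triples 0 and 1 are useful for this source sentence; 2 and 3 are noise.
features = [1.0, 0.8, -0.9, -1.2]
useful = {0, 1}

w, b = 0.0, 0.0  # parameters of the Bernoulli selection policy
lr = 0.3

for step in range(1000):
    # Sample a selection mask from the policy (one Bernoulli per triple).
    probs = [sigmoid(w * f + b) for f in features]
    mask = [1 if random.random() < p else 0 for p in probs]

    # Synthetic reward: +1 for each correctly kept/dropped triple, -1 otherwise.
    # In the paper's setting the reward would be derived from the translation
    # probability of the gold target sentence.
    reward = sum(1 if (mask[i] == 1) == (i in useful) else -1
                 for i in range(len(features)))

    # REINFORCE update: the gradient of log p(mask) w.r.t. each logit
    # is (mask[i] - probs[i]); scale it by the shared reward.
    for i, f in enumerate(features):
        g = (mask[i] - probs[i]) * reward
        w += lr * g * f
        b += lr * g

final = [sigmoid(w * f + b) for f in features]
print([round(p, 2) for p in final])
```

After training, the policy assigns high selection probability to the useful triples and low probability to the noise triples, which is the qualitative behavior the dynamic selector is trained to exhibit.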