This paper proposes a new approach to unsupervised human motion style transfer, which transforms an unlabeled input motion into the style of another unlabeled motion sequence while preserving the content of the original motion. For the supervised version of this task, constructing a database with a large number of paired training samples is expensive and tedious for artists, so we focus on unsupervised methods. However, recent works must retrain a network for each new character or style, which is time-consuming and resource-intensive. To address this, we employ a meta network that directly maps a style motion sequence to the parameters of a transformation network; that is, a single feed-forward pass through the meta network produces the corresponding motion transformation network. The model transfers the styles of multiple motion clips to different motions and generates natural stylized motion in real time, without retraining for each new style. Moreover, our model can transfer style between motions performed on different skeletons. We also explore the manifold of transformation networks by interpolating between hidden states of the meta network. Experiments validate the flexibility and effectiveness of our data-driven model.
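To make the core idea concrete, the following is a minimal sketch, assuming PyTorch, of a meta network that emits the weights of a small transformation network in one forward pass. The names (MetaNet, transform), the two-layer generated network, and the 63-dimensional pose representation are illustrative assumptions, not the paper's actual architecture.

```python
# Illustrative sketch (assumed PyTorch): a meta network encodes a style clip and
# outputs the weights of a transformation network, so no per-style retraining is needed.
import torch
import torch.nn as nn

class MetaNet(nn.Module):
    """Maps a style motion sequence to the parameters of a transformation network."""
    def __init__(self, pose_dim=63, hidden=256, transform_hidden=128):
        super().__init__()
        self.encoder = nn.GRU(pose_dim, hidden, batch_first=True)
        # Shapes of the generated transformation network (two linear layers, hypothetical).
        self.shapes = [(transform_hidden, pose_dim), (transform_hidden,),
                       (pose_dim, transform_hidden), (pose_dim,)]
        n_params = sum(torch.Size(s).numel() for s in self.shapes)
        self.head = nn.Linear(hidden, n_params)

    def forward(self, style_seq):                 # style_seq: (B, T, pose_dim)
        _, h = self.encoder(style_seq)            # summary of the style clip
        flat = self.head(h[-1])                   # (B, n_params) generated weights
        params, i = [], 0
        for shape in self.shapes:                 # slice the flat vector into weight tensors
            n = torch.Size(shape).numel()
            params.append(flat[:, i:i + n].view(-1, *shape))
            i += n
        return params                             # [W1, b1, W2, b2] per batch element

def transform(content_seq, params):
    """Apply the generated transformation network to a content motion frame by frame."""
    W1, b1, W2, b2 = params
    h = torch.relu(torch.einsum('btd,bhd->bth', content_seq, W1) + b1.unsqueeze(1))
    return torch.einsum('bth,bph->btp', h, W2) + b2.unsqueeze(1)

# Usage: one feed-forward pass yields a style-specific transformation network.
meta = MetaNet()
style_clip = torch.randn(2, 120, 63)              # two style clips, 120 frames, 63-D poses
content_clip = torch.randn(2, 240, 63)
stylized = transform(content_clip, meta(style_clip))
print(stylized.shape)                             # torch.Size([2, 240, 63])
```

Because the transformation network is produced rather than trained, adding a new style only costs one encoder pass, which is what enables the real-time, retraining-free transfer described above.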