In this paper, we present an Image-to-Class Dynamic Time Warping (I2C-DTW) approach for recognizing both 3D static hand gestures and 3D hand trajectory gestures. Our contribution is twofold. First, we propose a technique that computes the image-to-class DTW distance instead of the image-to-image distance; by matching a sample against a gesture class as a whole rather than against individual exemplars, the image-to-class distance achieves better generalization. Second, we propose two compositional models: fingerlets for representing static gestures and strokelets for representing trajectory gestures. These compositional models make it possible to compute a DTW distance between a data sample and a gesture category. We evaluate static gesture recognition performance on several public 3D hand gesture datasets. To better evaluate trajectory gesture recognition, we collected a 3D hand trajectory gesture dataset, called UESTC-HTG, using a Kinect device. The experimental results show that the proposed I2C-DTW approach significantly improves recognition accuracy on both static and trajectory gestures.
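To make the image-to-class idea concrete, the sketch below contrasts image-to-image DTW (query aligned against a single training sample) with an image-to-class variant in which, at each alignment step, the query frame may match the nearest descriptor pooled over all training samples of the class. This is a minimal illustration under our own assumptions, not the paper's exact formulation: the function names (`dtw`, `i2i_dtw`, `i2c_dtw`) are ours, and we assume class sequences have been resampled to a common length.

```python
import numpy as np

def dtw(cost):
    """Dynamic time warping over a precomputed local-cost matrix.

    cost[i, j] is the cost of aligning query step i with reference step j.
    Returns the accumulated cost of the optimal warping path.
    """
    n, m = cost.shape
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = cost[i - 1, j - 1] + min(
                acc[i - 1, j],      # insertion
                acc[i, j - 1],      # deletion
                acc[i - 1, j - 1],  # match
            )
    return acc[n, m]

def i2i_dtw(query, reference):
    """Image-to-image DTW: align the query (N, D) against one sample (M, D)."""
    cost = np.linalg.norm(query[:, None, :] - reference[None, :, :], axis=-1)
    return dtw(cost)

def i2c_dtw(query, class_samples):
    """Illustrative image-to-class DTW.

    class_samples has shape (K, T, D): K training sequences of a class,
    resampled to a common length T (our simplifying assumption). The local
    cost at (i, j) is the distance from query frame i to the *nearest*
    descriptor at position j across all K samples, so the warping path
    aligns the query with the class as a whole, not with any one exemplar.
    """
    pooled = class_samples.transpose(1, 0, 2)          # (T, K, D)
    diffs = query[:, None, None, :] - pooled[None]     # (N, T, K, D)
    cost = np.linalg.norm(diffs, axis=-1).min(axis=2)  # (N, T)
    return dtw(cost)

# Nearest-class classification: assign the query to the class with the
# smallest image-to-class DTW distance (toy data for illustration only).
rng = np.random.default_rng(0)
classes = {c: rng.normal(size=(5, 20, 3)) for c in ("wave", "circle")}
query = rng.normal(size=(18, 3))
pred = min(classes, key=lambda c: i2c_dtw(query, classes[c]))
```

Pooling the nearest descriptor per position is one simple way to realize a sample-to-class distance; the paper's fingerlet and strokelet models provide a more principled compositional decomposition for this purpose.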