Electroencephalogram (EEG) based motor trajectory decoding for efficient control of brain–computer interface (BCI) systems has been an active area of research. Such systems include prostheses, rehabilitation devices, and human power-augmenting devices. In this work, three-dimensional (3D) hand kinematics are estimated from pre-movement EEG signals recorded during a grasp-and-lift task. Data from twelve subjects in the publicly available WAY-EEG-GAL database are used for this purpose. Multi-layer perceptron (MLP) and convolutional neural network–long short-term memory (CNN-LSTM) based deep learning frameworks are proposed that utilize the motor-neural information encoded in the EEG data preceding actual movement execution. Frequency-band features are analyzed for hand kinematics decoding using EEG data filtered in seven distinct ranges. The best-performing frequency band is selected for further analysis with different EEG window sizes and lag windows. Additionally, inter-subject hand trajectory decoding is analyzed using a leave-one-subject-out (LOSO) approach. The Pearson correlation coefficient, together with the reconstructed hand trajectory, is used to evaluate the decoding performance of the proposed neural decoders. This study explores the feasibility of inter-subject 3D hand trajectory decoding from EEG signals during a reach-and-grasp task. The proposed CNN-LSTM decoder achieves a grand correlation across the three axes of up to 0.730 and 0.627 in intra-subject and inter-subject settings, respectively, thus providing viable information for decoding hand position from pre-movement EEG signals in practical BCI applications.
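As an illustration of the evaluation metric described above, the sketch below computes the Pearson correlation coefficient between actual and decoded 3D hand trajectories independently for each axis and then averages the three values into a grand correlation. This is a minimal NumPy example under assumed array shapes (samples × 3 axes), not the authors' actual evaluation code; the trajectory data here is synthetic and purely hypothetical.

```python
import numpy as np

def axiswise_pearson(true_traj, pred_traj):
    """Pearson correlation between actual and decoded trajectories,
    computed independently for each of the three axes (x, y, z).

    true_traj, pred_traj: arrays of shape (n_samples, 3).
    Returns a length-3 array of correlation coefficients.
    """
    true_traj = np.asarray(true_traj, dtype=float)
    pred_traj = np.asarray(pred_traj, dtype=float)
    tc = true_traj - true_traj.mean(axis=0)   # center each axis
    pc = pred_traj - pred_traj.mean(axis=0)
    num = (tc * pc).sum(axis=0)               # covariance numerator
    den = np.sqrt((tc ** 2).sum(axis=0) * (pc ** 2).sum(axis=0))
    return num / den

# Hypothetical data: a smooth 3D trajectory and a noisy decoded estimate
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500)
true = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), t], axis=1)
pred = true + 0.1 * rng.standard_normal(true.shape)

r = axiswise_pearson(true, pred)   # per-axis correlations
grand_r = r.mean()                 # grand correlation across the three axes
```

A grand correlation computed this way lies in [-1, 1]; values such as the 0.730 (intra-subject) and 0.627 (inter-subject) reported above indicate the decoded trajectory tracks the true hand position well.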