This paper explores the feasibility of a contactless identification system based on hand features. The identification solution has been designed to be integrated with smart space applications and relies on a commercial 3D sensor (i.e., Leap Motion) to capture palm features. The first part of the paper is devoted to evaluating the significance of the different hand features and the performance of a set of classification algorithms. Twenty-one users contributed to building a test dataset; for each user, the morphology of each hand has been described by 52 features, which include bone lengths and widths, palm characteristics, and relative distances among the fingers, palm center, and wrist. To obtain consistent samples and guarantee the best performance of the Leap Motion device, the data collection system provides sweet-spot control: this functionality guides the user to place the hand in the best position and orientation with respect to the device. The selected classification strategies—nearest neighbor (NN), support vector machine, multilayer perceptron, logistic regression and tree algorithms—have been evaluated in Weka. We have found that relative hand-pose distances are more significant than purely morphological features. On this feature set, the highest correctly classified instances (CCI) rate is reached by the multilayer perceptron algorithm (>96 %), although all the evaluated classifiers provide a CCI rate above 90 %. The analysis also examines how these algorithms perform with a variable number of users in the database, and how sensitive each algorithm is to the number of training samples. Results show that several alternatives are accurate enough for non-critical, immediate-response applications. The second part of the paper focuses on the implementation of example applications that are integrated with a real-time hand-based identification system using NN.
In particular, the applications enable customization and gesture-based control of the smart space. A five-user study has provided insight into system performance and user experience. Results confirm the viability of using in-air hand shape recognition in smart space settings, although aspects that hinder faster and more accurate recognition still need to be addressed.
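The evaluation described above was carried out in Weka; as an illustrative sketch only (not the paper's code or data), the same comparison of classifier families on 52-dimensional hand-feature vectors could be reproduced with scikit-learn. The synthetic dataset below is a hypothetical stand-in for the 21-user feature set, and cross-validated accuracy plays the role of the CCI rate.

```python
# Illustrative sketch, not the paper's implementation: compare classifier
# families analogous to those evaluated in Weka (NN, SVM, MLP, logistic
# regression, decision tree) on synthetic stand-ins for the 52 hand features.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical stand-in data: 21 users ("classes"), 50 samples each,
# 52 features per sample (the real features are hand measurements).
X, y = make_classification(n_samples=21 * 50, n_features=52,
                           n_informative=30, n_classes=21,
                           n_clusters_per_class=1, random_state=0)

classifiers = {
    "NN": KNeighborsClassifier(n_neighbors=1),
    "SVM": SVC(),
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
    "LogReg": LogisticRegression(max_iter=1000),
    "Tree": DecisionTreeClassifier(random_state=0),
}

results = {}
for name, clf in classifiers.items():
    # Standardize features before each classifier, then 5-fold cross-validate;
    # mean accuracy is the analogue of the CCI rate reported in the paper.
    pipe = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    results[name] = scores.mean()
    print(f"{name}: {results[name]:.3f}")
```

A per-user sensitivity study like the one in the paper would repeat this loop while varying the number of classes and the number of training samples per class.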