Completely contactless and at-a-distance personal identification offers enhanced user convenience and improved hygiene, and has been highly sought during the COVID-19 pandemic. This paper proposes an accurate and generalizable deep neural network-based framework for ‘completely’ contactless finger knuckle identification. We design and introduce a new loss function that enables a fully convolutional network to more effectively learn knuckle features imaged under at-a-distance conditions. A ‘completely’ contactless system also requires efficient online finger knuckle detection capabilities. This paper, for the first time to the best of our knowledge, develops and introduces accurate capabilities to efficiently detect and segment finger knuckle patterns from images with complex backgrounds, as widely observed in real-world applications. We introduce an angular loss to accurately predict oriented knuckle patterns and incorporate it into our framework. Experimental results presented in this paper on five different public databases, using challenging protocols and cross-database performance evaluation, demonstrate superior performance and validate the effectiveness of the proposed framework for completely contactless applications.
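To give a concrete sense of what an angular loss for oriented pattern prediction can look like, the sketch below shows a minimal, hypothetical formulation: it penalizes the wrapped angular difference between a predicted and a ground-truth orientation. The abstract does not specify the paper's actual formulation, so the function name, the 1 - cos(Δθ) form, and the tensors used here are illustrative assumptions only.

```python
# Minimal, hypothetical sketch of an angular (orientation) loss, assuming the
# network regresses a knuckle-pattern orientation angle theta in radians.
# This is NOT the paper's exact loss; names and form are illustrative.
import torch


def angular_loss(theta_pred: torch.Tensor, theta_gt: torch.Tensor) -> torch.Tensor:
    """Penalize the angular gap between predicted and ground-truth orientations.

    Using 1 - cos(delta) makes the loss zero when the angles match and keeps it
    smooth across the 2*pi wrap-around (e.g. 359 deg vs. 1 deg stays small).
    """
    return (1.0 - torch.cos(theta_pred - theta_gt)).mean()


# Example usage with dummy orientation predictions for a batch of detections.
pred = torch.tensor([0.10, 3.05])
gt = torch.tensor([0.05, -3.10])  # -3.10 rad is close to 3.05 rad modulo 2*pi
print(angular_loss(pred, gt))
```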