The rapid advance of convolutional neural networks (CNNs) has driven progress in face recognition, and the recent emergence of margin-based loss functions has further improved recognition performance. However, these methods degrade sharply under large intra-class variations, including age, pose, illumination, resolution, and occlusion. Unlike most methods that target a specific variation, our proposed approach, HAMFace, addresses these problems uniformly from the perspective of hard positive examples. To mitigate intra-class variance, we argue that hard positive examples call for larger margins, which push them closer to their corresponding class centers. First, we design a hardness-adaptive margin function that adjusts the margin according to the hardness of each hard positive example. Then, to improve unconstrained face recognition under diverse intra-class variations, we introduce a novel loss function named Hardness Adaptive Margin (HAM) Softmax Loss, which allocates larger margins to hard positive examples during training based on their level of hardness. The proposed HAMFace is evaluated on nine challenging face recognition benchmarks and demonstrates its superiority over other state-of-the-art methods.