Summary

Software vulnerability detection is an important problem in software security. In recent years, deep learning has offered a novel approach to source code vulnerability detection. Owing to the similarities between programming languages and natural languages, many natural language processing techniques have been applied to vulnerability detection tasks. However, certain vulnerability detection problems, such as buffer overflow, involve numerical reasoning. For these problems, the model must not only capture long dependencies and the multiple relationships among code statements but also capture the magnitude of numerical literals in the program through high‐quality number embeddings. We therefore propose VDTransformer, a Transformer‐based method that improves source code embedding by integrating word and number embeddings. Furthermore, we employ Transformer encoders to build a hierarchical neural network that extracts semantic features from code and enables line‐level vulnerability detection. To evaluate the method, we construct a dataset named OverflowGen based on buffer overflow templates. Experimental comparisons on OverflowGen against a well‐known static vulnerability detector and two state‐of‐the‐art deep learning‐based methods confirm the effectiveness of VDTransformer and the importance of high‐quality number embeddings in vulnerability detection tasks involving numerical features.