Knowledge graph embedding represents entities and relations as low-dimensional continuous vectors. Recently, researchers have attempted to exploit the semantic connections among entities that participate in hierarchical relationships within the knowledge graph. However, existing knowledge embedding methods that model entity hierarchies face two challenges: (1) assigning the same embedding dimension to entities at different hierarchical levels can lead to overfitting and wastes storage and computational resources; (2) manually searching for embedding dimensions in a discrete space easily misses the optimal setting and increases training cost. We therefore propose HEAES, a knowledge embedding method that automatically learns entity embedding dimensions from the entity hierarchy. Specifically, we first propose a new modeling approach to capture the hierarchical relationships among entities and train the model on a triple classification task. We then adaptively learn a pruning threshold to trim the embedding vectors, automatically assigning different embedding dimensions to entities at different hierarchical levels. Experiments on the YAGO26K and DB111K datasets verify that introducing entity hierarchy information improves model performance.
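
To make the adaptive-pruning idea concrete, the following is a minimal PyTorch sketch, not the authors' implementation: it assumes a shared maximal embedding size, a precomputed hierarchy level per entity, and a learnable per-level threshold that soft-masks trailing dimensions so the effective dimension is learned rather than hand-set. The class name `HierAwareEmbedding` and the sigmoid soft-mask form are illustrative assumptions; the abstract only states that a pruning threshold is learned adaptively.

```python
# Sketch (assumptions, not the paper's released code): entities at different
# hierarchy levels share a maximal embedding size, and a learnable per-level
# threshold soft-prunes trailing dimensions.
import torch
import torch.nn as nn

class HierAwareEmbedding(nn.Module):
    def __init__(self, num_entities, max_dim, entity_level, num_levels):
        super().__init__()
        self.embed = nn.Embedding(num_entities, max_dim)
        # one learnable pruning threshold per hierarchy level (assumption)
        self.threshold = nn.Parameter(torch.zeros(num_levels))
        # fixed dimension indices 0..max_dim-1, used to build the mask
        self.register_buffer("pos", torch.arange(max_dim, dtype=torch.float))
        # precomputed hierarchy level of each entity (0 = most specific)
        self.register_buffer("entity_level", entity_level)

    def forward(self, entity_ids):
        vec = self.embed(entity_ids)                          # (B, max_dim)
        # map each entity's level to its threshold in (0, 1)
        t = torch.sigmoid(self.threshold[self.entity_level[entity_ids]])
        # soft mask: dimensions beyond t * max_dim are smoothly driven to 0,
        # so the effective dimension per level is learned during training
        cutoff = t.unsqueeze(-1) * self.pos.numel()
        mask = torch.sigmoid(cutoff - self.pos)               # (B, max_dim)
        return vec * mask

# usage: hierarchy levels for 5 toy entities, then embed a small batch
levels = torch.tensor([0, 0, 1, 2, 1])
emb = HierAwareEmbedding(num_entities=5, max_dim=32,
                         entity_level=levels, num_levels=3)
out = emb(torch.tensor([0, 3]))
print(out.shape)  # torch.Size([2, 32]), with trailing dims softly pruned
```

In this sketch the masked vectors could be fed to any scoring function for triple classification; after training, dimensions whose mask values fall below a cutoff could be dropped outright, yielding smaller stored embeddings for entities at more abstract hierarchy levels.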