In recent years, graph neural networks (GNNs) based on neighborhood aggregation schemes have emerged as a promising approach for a wide range of graph-based applications. To reduce the expert dependence and time cost of manually designing GNN architectures, graph neural architecture search (GNAS) has gained popularity. However, because mainstream GNAS methods design GNN architectures with a fixed depth, they cannot fully exploit the potential of GNN architectures for graph classification. Although a few GNAS methods have explored the importance of adaptive GNN depth, they do so on top of fixed GNN architectures and have not designed a general search space for graph classification, which limits the discovery of strong GNN architectures. In this paper, we propose Depth-Adaptive Graph Neural Architecture Search for Graph Classification (DAGC), which systematically constructs and explores the search space for graph classification rather than studying individual designs. By decoupling the graph classification process, DAGC builds a complete and flexible search space comprising GNN depth, aggregation function, and pooling operation components. Furthermore, DAGC adopts a learnable agent based on reinforcement learning to effectively guide the search for depth-adaptive GNN architectures. Extensive experiments on five real-world datasets demonstrate that DAGC outperforms state-of-the-art human-designed GNN architectures and GNAS methods. The code is available at: https://github.com/Zhen-Peng-Wu/DAGC.
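To make the described search procedure concrete, the sketch below shows one plausible reading of the abstract: a reinforcement-learning controller that samples GNN depth, aggregation function, and pooling operation from a joint search space and is updated with a policy-gradient signal. This is a minimal illustration only; the candidate lists, the `Controller` class, and the `evaluate` reward stub are assumptions, not the authors' implementation.

```python
# Minimal sketch (not the DAGC implementation) of RL-guided, depth-adaptive
# architecture search over the components named in the abstract.
import random
import torch
import torch.nn as nn

# Hypothetical search space: GNN depth, aggregation function, pooling operation.
SEARCH_SPACE = {
    "depth":       [2, 3, 4, 5, 6],
    "aggregation": ["gcn", "gat", "gin", "sage"],
    "pooling":     ["sum", "mean", "max", "topk"],
}

class Controller(nn.Module):
    """Learnable agent: one categorical policy per architecture component."""
    def __init__(self, space):
        super().__init__()
        self.space = space
        self.logits = nn.ParameterDict(
            {k: nn.Parameter(torch.zeros(len(v))) for k, v in space.items()}
        )

    def sample(self):
        """Sample one architecture and return it with its log-probability."""
        arch, log_prob = {}, 0.0
        for key, choices in self.space.items():
            dist = torch.distributions.Categorical(logits=self.logits[key])
            idx = dist.sample()
            arch[key] = choices[idx.item()]
            log_prob = log_prob + dist.log_prob(idx)
        return arch, log_prob

def evaluate(arch):
    """Placeholder reward: in practice, train the sampled GNN on a graph
    classification dataset and return its validation accuracy."""
    return random.random()

controller = Controller(SEARCH_SPACE)
optimizer = torch.optim.Adam(controller.parameters(), lr=0.01)
baseline = 0.0

for step in range(200):                       # search iterations
    arch, log_prob = controller.sample()
    reward = evaluate(arch)
    baseline = 0.9 * baseline + 0.1 * reward  # moving-average baseline
    loss = -(reward - baseline) * log_prob    # REINFORCE policy gradient
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```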