3D shape learning is an important research topic in computer vision, in which datasets play a critical role. However, most existing 3D datasets use voxels, point clouds, meshes, or B-reps, which are neither parametric nor feature-based. Thus, they cannot support the generation of real-world engineering computer-aided design (CAD) models with complicated shape features. Furthermore, they capture only the resulting 3D geometry, without the human-computer interaction (HCI) history. This work is the first to provide a fully parametric, feature-based CAD dataset with a selection mechanism to support HCI in 3D learning. First, unlike existing datasets, which are mainly composed of simple features (typically sketch and extrude), we provide complicated engineering features such as fillet, chamfer, mirror, pocket, groove, and revolve. Second, instead of a monotonous combination of features, we introduce a selection mechanism that mimics how a human focuses on and selects a particular topological entity. The proposed mechanism establishes relationships among complicated engineering features, which fully express the design intent and design knowledge of human CAD engineers. Therefore, it can handle advanced 3D features for real-world engineering shapes. Experiments show that the proposed dataset outperforms existing CAD datasets on both reconstruction and generation tasks. In quantitative experiments, the proposed dataset yields better prediction accuracy than other parametric datasets. Furthermore, CAD models generated from the proposed dataset comply with the semantics of human CAD engineers and can be edited and redesigned in mainstream industrial CAD software.