Network Architecture Search (NAS) automates hyperparameter selection in deep learning models, making it a promising tool for building intelligent fault diagnosis models. Applying NAS to this domain, however, faces two main challenges: the excessive time required to search over the full data domain, and the inefficiency of the common random search strategy, which rarely yields high-performance subnetworks. This study addresses these challenges through three key contributions: it defines a library of hyperparameters tailored to intelligent fault diagnosis tasks; it constructs sub-data-domain model clusters to reduce the computational cost of NAS; and it introduces a greedy-algorithm-based hyperparameter search strategy that improves the efficiency of finding the optimal diagnostic subnetwork and minimizes its prediction error. On gear and bearing fault datasets, the proposed method reduces both search time and subnetwork prediction error compared with conventional NAS. Moreover, it outperforms manually tuned deep learning models by optimizing multiple sets of hyperparameters simultaneously and automatically generating diagnostic subnetworks with higher accuracy.
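The abstract does not detail the greedy search strategy; a minimal sketch of one common greedy (coordinate-wise) approach over a hyperparameter library is shown below. The library contents, the `evaluate` function, and all names are illustrative assumptions, not the authors' actual implementation:

```python
# Illustrative sketch of a greedy (coordinate-wise) hyperparameter search.
# The hyperparameter library and the evaluation surrogate are assumptions.

HYPERPARAM_LIBRARY = {
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "kernel_size":   [3, 5, 7],
    "num_filters":   [16, 32, 64],
}

def evaluate(config):
    """Stand-in for training a subnetwork on a sub-data-domain cluster
    and returning its validation error (lower is better)."""
    # Toy surrogate: error counts how many settings differ from a
    # known-good configuration; a real evaluation would train a model.
    best = {"learning_rate": 1e-3, "kernel_size": 5, "num_filters": 32}
    return sum(0.0 if config[k] == best[k] else 1.0 for k in config)

def greedy_search(library):
    # Start from the first entry of each hyperparameter list.
    config = {k: v[0] for k, v in library.items()}
    # Tune one hyperparameter at a time: keep the value with the lowest
    # error before moving on to the next hyperparameter.
    for name, candidates in library.items():
        _, best_val = min((evaluate({**config, name: c}), c)
                          for c in candidates)
        config[name] = best_val
    return config

best_config = greedy_search(HYPERPARAM_LIBRARY)
print(best_config, evaluate(best_config))
```

Under these assumptions the greedy pass evaluates only 3 + 3 + 3 = 9 configurations instead of the 27 an exhaustive search would need, which is the kind of efficiency gain over random or exhaustive search the abstract refers to.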