Abstract

Hyperparameter optimization in convolutional neural networks (CNNs) plays a vital role in ensuring the effectiveness of the models. However, as modern CNNs grow deeper, this task becomes very challenging because the search space is high-dimensional and each evaluation is computationally expensive. Given these difficulties, this study proposes a surrogate-assisted highly cooperative hyperparameter optimization (SHCHO) algorithm for large-scale chain-styled CNNs. SHCHO tackles the original hyperparameter optimization problem by cooperatively optimizing its subproblems. Specifically, it first decomposes the whole CNN into several overlapping sub-CNNs, which conforms to the inherent overlapping interaction structure among hyperparameters and significantly reduces the search space. The resulting hyperparameter optimization subproblems on these sub-CNNs are then coevolved collaboratively and competitively, facilitating a proper hyperparameter configuration for the whole CNN. In addition, SHCHO employs a computationally efficient surrogate technique to assist in each subproblem optimization, greatly reducing the expensive evaluation cost. Extensive experimental results on three widely used image classification datasets indicate that SHCHO shows competitive performance compared with several state-of-the-art algorithms in terms of both hyperparameter superiority and computational cost.
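
The abstract describes the general recipe of overlapping decomposition plus surrogate-assisted subproblem search. The sketch below is a minimal, hypothetical illustration of that recipe, not the authors' SHCHO implementation: the toy objective `expensive_eval`, the fixed grouping of dimensions, and the Gaussian-process pre-screening step are all assumptions made for illustration.

```python
# Hypothetical sketch: surrogate-assisted cooperative optimization over
# overlapping hyperparameter groups. Not the authors' SHCHO code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

# Toy stand-in for the "whole-CNN" objective (lower is better). In practice
# this would be a full training-and-validation run, which is what makes
# each evaluation expensive.
def expensive_eval(x):
    return float(np.sum((x - 0.3) ** 2) + 0.1 * np.sin(10 * x).sum())

DIM = 8                                    # total number of hyperparameters
# Overlapping decomposition: consecutive groups share boundary dimensions,
# mimicking overlapping sub-CNNs along a chain-styled architecture.
groups = [np.arange(0, 4), np.arange(3, 6), np.arange(5, 8)]

best = rng.uniform(0, 1, DIM)              # current best full configuration
best_f = expensive_eval(best)
archive_X, archive_y = [best.copy()], [best_f]

for generation in range(20):
    for idx in groups:                     # optimize each subproblem in turn
        # Fit a cheap surrogate on previously evaluated configurations,
        # restricted to this group's dimensions (the rest stays fixed).
        gp = GaussianProcessRegressor(normalize_y=True)
        gp.fit(np.array(archive_X)[:, idx], np.array(archive_y))

        # Pre-screen many candidate sub-configurations with the surrogate,
        # then spend the expensive evaluation only on the most promising one.
        cands = rng.uniform(0, 1, (200, len(idx)))
        promising = cands[np.argmin(gp.predict(cands))]

        trial = best.copy()
        trial[idx] = promising
        f = expensive_eval(trial)
        archive_X.append(trial.copy()); archive_y.append(f)
        if f < best_f:                     # cooperative update of the shared best
            best, best_f = trial, f

print("best objective:", best_f)
```

The key point the sketch tries to convey is that the surrogate filters candidate sub-configurations cheaply, so the expensive full evaluation is spent on only one candidate per subproblem per generation, while the overlapping groups keep interacting hyperparameters from being optimized in isolation.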
