The performance of Artificial Neural Network (ANN) models in tasks such as classification and object detection is dictated by many hyperparameters. This study proposes an Automated, Rapid, Convergent Hyperparameter Optimizer (ARCH) based on Design of Experiments (DOE). Before ARCH can be built, two contentious issues in applying DOE to Hyperparameter Optimization (HO) problems must be resolved: design choice and model selection. To tackle these issues, a “meta-experiment” with three meta-factors is devised to analyze the effects of four types of experimental designs, two model-selection approaches (optimizing a linear regression model and picking the best sample), and 11 classification datasets; the datasets are included as a factor to generalize the findings. The results of the meta-experiment support the practical value of smaller experimental designs: they achieve classification accuracies comparable to those of larger designs in less time. The findings also show that the HO problem is nonlinear, so picking the best run from the experiments is more effective than optimizing a linear regression model. Additionally, the meta-experiment shows that including the dataset as a random effect is significant in HO problems. Finally, ARCH is benchmarked against Bayesian optimization and genetic algorithms on CIFAR-10; the comparison demonstrates that ARCH has an advantage when computational resources are limited.