The dendritic neural model (DNM) is computationally faster than other machine-learning techniques because its architecture can be implemented with logic circuits and its calculations can be performed entirely in binary form. To further improve computational speed, a straightforward approach is to generate a more concise architecture for the DNM. This architecture search is in fact a large-scale multiobjective optimization problem (LSMOP), in which a large number of parameters must be tuned to optimize accuracy and structural complexity simultaneously. However, an irregular Pareto front, objective discontinuity, and population degeneration severely limit the performance of conventional multiobjective evolutionary algorithms (MOEAs) on this problem. Therefore, this study proposes a novel competitive decomposition-based MOEA, which decomposes the original problem into several constrained subproblems, with neighboring subproblems sharing overlapping regions in the objective space. Solutions in the overlapping regions participate in environmental selection for the neighboring subproblems, thereby propagating selection pressure throughout the entire population. Experimental results demonstrate that the proposed algorithm exhibits stronger optimization ability than state-of-the-art MOEAs. Furthermore, both the DNM itself and its hardware implementation achieve highly competitive classification performance when trained by the proposed algorithm, compared with numerous widely used machine-learning approaches.
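To make the decomposition-with-overlap idea concrete, the following minimal Python sketch partitions a bi-objective population (f1 = classification error, f2 = structural complexity) into K subproblems along f1, extends each subproblem's interval by an overlap width delta, and performs a per-subproblem environmental selection. The function names, the interval-based partitioning, and the choice of delta are illustrative assumptions for this sketch, not the paper's exact formulation.

```python
import numpy as np


def assign_subproblems(F, K, delta):
    """Assign each solution (rows of F = [f1, f2]) to every subproblem
    whose f1-interval, extended by delta on both sides, contains it.
    The extension creates the overlapping regions between neighbors."""
    f1 = F[:, 0]
    edges = np.linspace(f1.min(), f1.max(), K + 1)
    members = [[] for _ in range(K)]
    for i, v in enumerate(f1):
        for k in range(K):
            if edges[k] - delta <= v <= edges[k + 1] + delta:
                members[k].append(i)
    return members


def environmental_selection(F, members, n_keep):
    """Within each constrained subproblem, keep the n_keep solutions
    with the lowest f2. A solution lying in an overlap competes in
    both neighboring subproblems and survives if either keeps it."""
    survivors = set()
    for idx in members:
        ranked = sorted(idx, key=lambda i: F[i, 1])
        survivors.update(ranked[:n_keep])
    return sorted(survivors)


rng = np.random.default_rng(0)
F = rng.random((40, 2))  # toy population of 40 objective vectors
members = assign_subproblems(F, K=5, delta=0.05)
kept = environmental_selection(F, members, n_keep=4)
print(f"{len(kept)} survivors out of {len(F)}")
```

Because a boundary solution can survive through either of its two neighboring subproblems, selection decisions made in one subproblem influence its neighbors; this coupling is the mechanism by which, as described above, selection pressure propagates through the entire population even when the Pareto front is irregular or discontinuous.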