Abstract
We cast the training of minimal artificial neural network architectures as a problem of global optimization, and study the simulated annealing (SA) global optimization heuristic under a 'best-so-far' model. Our testbed consists of separated-aperture radar data for subsoil mine detection. In previous analyses, we have found that the traditional SA 'cooling' paradigm can be suboptimal for small instances of combinatorial global optimization problems. Here, we demonstrate that traditional cooling is also suboptimal for training minimal neural networks for mine detection. Related issues include (i) how to find minimal network architectures; (ii) tradeoffs between minimality and trainability; (iii) whether multistart/parallel implementations of SA can be superior to a single long SA run; and (iv) adaptive annealing strategies based on the best-so-far objective.
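To make the 'best-so-far' model concrete: under this view, the output of an SA run is the best solution encountered at any point during the run, not the final state of the Markov chain. The sketch below is a minimal, hypothetical illustration of SA with best-so-far tracking and a traditional geometric cooling schedule on a toy objective; the objective, neighbor move, and parameter values are illustrative assumptions, not those used in the paper.

```python
import math
import random


def simulated_annealing(objective, init, neighbor, t0=1.0, alpha=0.95, steps=2000):
    """Minimize `objective` by SA, returning the best-so-far solution.

    Toy sketch: geometric cooling (the 'traditional' paradigm) with
    Metropolis acceptance; all parameters here are illustrative.
    """
    current = init
    cur_cost = objective(current)
    # Best-so-far bookkeeping: report the best state ever visited,
    # not the state the chain happens to end in.
    best, best_cost = current, cur_cost
    t = t0
    for _ in range(steps):
        cand = neighbor(current)
        cand_cost = objective(cand)
        delta = cand_cost - cur_cost
        # Accept improvements always; accept uphill moves with
        # probability exp(-delta / t), which shrinks as t cools.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current, cur_cost = cand, cand_cost
            if cur_cost < best_cost:
                best, best_cost = current, cur_cost
        t *= alpha  # geometric cooling schedule
    return best, best_cost


# Toy usage: a 1-D multimodal function with many local minima.
random.seed(0)
f = lambda x: x * x + 10 * math.sin(x)
x_best, f_best = simulated_annealing(
    f, init=5.0, neighbor=lambda x: x + random.uniform(-1.0, 1.0)
)
```

An adaptive strategy of the kind mentioned in item (iv) would monitor `best_cost` over time and adjust the temperature (or restart) when the best-so-far objective stagnates, rather than following a fixed cooling schedule.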