Abstract
Onboard image analysis enables real‐time autonomous capabilities for unmanned platforms including aerial, ground, and aquatic drones. Performing classification on embedded systems, rather than transmitting data offboard, allows rapid perception and decision‐making critical for time‐sensitive applications such as search and rescue, hazardous environment exploration, and military operations. To fully capitalize on these systems’ potential, specialized deep learning solutions are needed that balance accuracy and computational efficiency for time‐sensitive inference. This article introduces the widened attention‐enhanced atrous convolution‐based efficient network (WACEfNet), a new convolutional neural network designed specifically for real‐time visual classification on resource‐constrained embedded devices. WACEfNet builds on EfficientNet and integrates innovative width‐wise feature processing, atrous convolutions, and attention modules to improve representational power without excessive overhead. Extensive benchmarking confirms state‐of‐the‐art performance from WACEfNet for aerial imaging applications while remaining suitable for embedded deployment. The improvements in accuracy and speed demonstrate the potential of customized deep learning advancements to unlock new capabilities for unmanned aerial vehicles and related embedded systems with tight size, weight, and power constraints. This research offers an optimized framework, combining widened residual learning and attention mechanisms, to meet the unique demands of high‐fidelity real‐time analytics across a variety of embedded perception paradigms.
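As background for the atrous (dilated) convolutions the abstract mentions, the sketch below illustrates the core idea in plain Python for the 1‑D case: inserting gaps of size `dilation` between kernel taps enlarges the receptive field without adding parameters. This is an illustrative toy, not the authors' implementation; the function name `atrous_conv1d` is hypothetical.

```python
def atrous_conv1d(signal, kernel, dilation=1):
    """Valid-mode 1-D convolution (cross-correlation) with a dilated kernel.

    With k taps and dilation d, the effective receptive field is
    (k - 1) * d + 1 samples, yet only k weights are learned/stored.
    """
    k = len(kernel)
    span = (k - 1) * dilation + 1  # effective receptive field
    return [
        sum(kernel[j] * signal[i + j * dilation] for j in range(k))
        for i in range(len(signal) - span + 1)
    ]


ramp = list(range(10))
# Standard convolution (dilation 1): each tap sees adjacent samples.
print(atrous_conv1d(ramp, [1, 0, -1], dilation=1))  # -> [-2, -2, -2, -2, -2, -2, -2, -2]
# Atrous convolution (dilation 2): same 3 weights, receptive field of 5.
print(atrous_conv1d(ramp, [1, 0, -1], dilation=2))  # -> [-4, -4, -4, -4, -4, -4]
```

In 2‑D image backbones the same trick lets a network aggregate wider spatial context at constant parameter cost, which is why it suits the size, weight, and power budgets discussed above.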