Abstract

Purpose

This paper focuses on lane detection for unmanned mobile robots. Because a mobile robot cannot afford to spend much time on detection, quickly detecting the lane in complex environments with poor illumination and shadows is the central challenge.

Design/methodology/approach

A new learning framework named multiscale ELM is proposed, integrating an extreme learning machine (ELM) with an inception structure. It exploits two complementary advantages: ELM converges quickly, and convolutional layers extract local features at different scales. The proposed architecture has two main components: self-taught feature extraction by an ELM with a convolutional layer, and bottom-up information classification based on the feature constraint. To overcome poor performance under complex conditions such as shadows and illumination changes, the paper mainly addresses the following problems: local feature learning, where the fully connected layer is replaced by a convolutional layer to extract local features; feature extraction at different scales, where the integration of ELM and the inception structure not only speeds up parameter learning but also achieves spatial interactivity across scales; and the validity of the training database, for which a method of constructing a training data set is proposed.

Findings

Experimental results on various data sets show that the proposed algorithm effectively improves performance under complex conditions. In a real environment, tests on the robot platform BIT-NAZA show that the proposed algorithm achieves better performance and reliability.

Originality/value

This research provides a theoretical and engineering basis for lane detection on unmanned robots.
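The two ideas the abstract combines — inception-style branches that extract features at several scales, and an ELM whose output weights are solved in closed form rather than by backpropagation — can be sketched minimally as follows. This is an illustrative assumption-laden toy, not the paper's implementation: the 1-D random convolutions, kernel sizes, filter counts, ridge term, and synthetic "lane vs. background" data are all made up for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_conv_features(X, kernel_sizes=(3, 5, 7), n_filters=8):
    """Fixed random 1-D convolutions at several kernel sizes, standing in
    for the inception-style multiscale branches (illustrative only)."""
    feats = []
    for k in kernel_sizes:
        W = rng.standard_normal((n_filters, k))
        for w in W:
            # valid convolution of each sample with a random kernel,
            # followed by a nonlinearity, as in ELM-style random features
            conv = np.array([np.convolve(x, w, mode="valid") for x in X])
            feats.append(np.tanh(conv))
    return np.concatenate(feats, axis=1)  # concatenate all branch outputs

def elm_fit(H, T, ridge=1e-3):
    """Closed-form ELM readout: solve (H^T H + ridge*I) beta = H^T T,
    i.e. ridge regression instead of iterative weight updates."""
    return np.linalg.solve(H.T @ H + ridge * np.eye(H.shape[1]), H.T @ T)

# purely synthetic rows: "lane" if the middle of the signal is bright
X = rng.standard_normal((200, 32))
y = (X[:, 8:24].mean(axis=1) > 0).astype(float).reshape(-1, 1)

H = random_conv_features(X)      # multiscale random features
beta = elm_fit(H, y)             # one linear solve trains the readout
pred = (H @ beta > 0.5).astype(float)
acc = (pred == y).mean()
```

The sketch shows why the combination is fast: the convolutional branches are never trained, so the only learned parameters are `beta`, obtained from a single linear solve.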

Full Text
Paper version not known
