Abstract

This paper proposes a new error back-propagation learning algorithm that enhances the sensitivity of the input–output mapping by applying a high-pass filter characteristic to the conventional error back-propagation algorithm, so that small variations in the input features can be reliably indicated. To sensitively discriminate novel-class data whose characteristics differ only slightly from those of the normal-class data, the cost function of the proposed neural network algorithm is modified to further increase the input–output sensitivity, and weight update rules that minimize this cost function are derived using a gradient descent method. The proposed algorithm is applied to an auto-associative multilayer perceptron neural network, and its performance is evaluated on two real-world applications: a laser spot detection-based computer interface system that detects a laser spot against complex backgrounds, and an automatic inspection system for the reliable detection of Mura defects that occur during the manufacture of flat-panel liquid crystal displays. Compared with the conventional error back-propagation learning algorithm, the proposed algorithm detects small input feature variations more effectively by increasing the input–output mapping sensitivity.
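The abstract does not give the exact form of the modified cost function, but the idea it describes can be sketched as follows: an auto-associative (autoencoder-style) multilayer perceptron trained by gradient descent on a cost that combines reconstruction error with a term rewarding input–output sensitivity. Everything below is an illustrative assumption, not the paper's actual formulation: the sensitivity weight `lam`, the finite-difference sensitivity estimate, the layer sizes, and the use of numerical gradients (rather than the paper's derived weight update rules) are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def unpack(theta, n_in, n_hid):
    """Slice a flat parameter vector into the two layers of an
    auto-associative network (n_in -> n_hid -> n_in)."""
    a = n_hid * n_in
    W1 = theta[:a].reshape(n_hid, n_in)
    b1 = theta[a:a + n_hid]
    W2 = theta[a + n_hid:a + n_hid + a].reshape(n_in, n_hid)
    b2 = theta[a + n_hid + a:]
    return W1, b1, W2, b2

def forward(theta, n_in, n_hid, x):
    W1, b1, W2, b2 = unpack(theta, n_in, n_hid)
    return W2 @ np.tanh(W1 @ x + b1) + b2

def cost(theta, X, n_in, n_hid, lam=0.02, eps=1e-3):
    """Assumed cost: reconstruction error minus a sensitivity reward.
    Minimizing it both reconstructs the normal-class data and *increases*
    how strongly the output responds to small input perturbations."""
    J = 0.0
    I = np.eye(n_in)
    for x in X:
        y = forward(theta, n_in, n_hid, x)
        J += np.sum((x - y) ** 2)  # reconstruction error
        # Finite-difference estimate of the squared input-output
        # Jacobian norm, used here as the sensitivity measure.
        sens = sum(np.sum((forward(theta, n_in, n_hid, x + eps * e) - y) ** 2)
                   for e in I) / eps ** 2
        J -= lam * sens
    return J / len(X)

def num_grad(f, theta, h=1e-5):
    """Central-difference gradient; stands in for the derived
    weight update rules to keep the sketch short."""
    g = np.zeros_like(theta)
    for i in range(theta.size):
        tp = theta.copy(); tp[i] += h
        tm = theta.copy(); tm[i] -= h
        g[i] = (f(tp) - f(tm)) / (2 * h)
    return g

# Toy "normal class" data and plain gradient descent with backtracking,
# so each accepted step strictly decreases the modified cost.
n_in, n_hid = 4, 3
X = rng.normal(size=(6, n_in))
theta = rng.normal(0, 0.5, 2 * n_in * n_hid + n_hid + n_in)
f = lambda t: cost(t, X, n_in, n_hid)
J0 = f(theta)
for _ in range(40):
    g = num_grad(f, theta)
    lr = 0.1
    while lr > 1e-6 and f(theta - lr * g) >= f(theta):
        lr *= 0.5
    if f(theta - lr * g) < f(theta):
        theta -= lr * g
J1 = f(theta)
print(J1 < J0)  # the modified cost decreases under gradient descent
```

In a novelty-detection setting like the paper's, one would then flag a test sample as a novel class when its reconstruction error exceeds a threshold fitted on the normal-class training data; the sensitivity term is what makes that error respond to small feature deviations.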
