Abstract

The purpose of this work is to show the advantages of moving from ordinary binary neurons to more complex neurons with ternary output quantization. As an example, a neural-network combination of five classical statistical criteria is considered: Geary (1935), David-Hartley-Pearson (1954), Shapiro-Wilk (1965), maximum deviation from the center (1965), and Ali-Chergo-Revis (1992). A forecast of combining these criteria with others is given, built with a confidence probability of 0.99. Using binary artificial neurons would require 280 statistical criteria, whereas switching to artificial neurons with ternary quantizers should reduce the number of neurons to 9 for small samples of 16 experiments. An exponential decrease in the required number of neurons is observed.
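To illustrate the distinction the abstract draws, the following is a minimal sketch (not the paper's implementation) of a neuron that fuses the scores of several statistical criteria, with either binary or ternary output quantization. The thresholds, weights, and scores are illustrative assumptions, not values from the paper; the ternary quantizer's extra "undecided" state is what allows fewer neurons to carry the same decision information.

```python
def binary_quantize(z, threshold=0.0):
    """Binary neuron output: accept (1) vs. reject (0)."""
    return 1 if z >= threshold else 0

def ternary_quantize(z, low=-0.5, high=0.5):
    """Ternary neuron output: -1 (reject), 0 (undecided), +1 (accept)."""
    if z >= high:
        return 1
    if z <= low:
        return -1
    return 0

def fuse(criterion_scores, weights, quantizer):
    """Weighted sum of per-criterion scores, then output quantization."""
    z = sum(w * s for w, s in zip(weights, criterion_scores))
    return quantizer(z)

# Hypothetical example: five criteria voting on one small sample,
# each producing a normalized score in [-1, 1].
scores = [0.8, -0.2, 0.4, 0.1, -0.6]
weights = [0.2] * 5                      # equal weighting (assumption)

print(fuse(scores, weights, binary_quantize))   # -> 1 (forced decision)
print(fuse(scores, weights, ternary_quantize))  # -> 0 (undecided)
```

With the same weighted sum (here 0.1), the binary neuron must commit to a decision, while the ternary neuron can report "undecided" near the threshold; a log of three output states per neuron is why, in the abstract's forecast, 9 ternary neurons can replace a combination requiring 280 binary-quantized criteria.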
