Abstract

The purpose of this work is to show the advantages of moving from ordinary binary neurons to more complex neurons with ternary output quantization. As an example, a neural-network combination of five classical statistical criteria is considered: Geary (1935), David-Hartley-Pearson (1954), Shapiro-Wilk (1965), maximum deviation from the center (1965), and Ali-Chergo-Revis (1992). A forecast of combining these criteria with others is given, built with a confidence probability of 0.99. Using binary artificial neurons would require combining 280 statistical criteria. Switching to artificial neurons with ternary quantizers should reduce the number of neurons to 9 for small samples of 16 experiments. An exponential decrease in the required number of neurons is observed.
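The contrast between binary and ternary output quantization can be illustrated with a minimal sketch. This is not the paper's construction: the use of the Shapiro-Wilk p-value and the threshold values are illustrative assumptions, standing in for whatever test statistics and decision thresholds the authors actually use.

```python
# Minimal sketch (assumptions, not the paper's method): a "statistical neuron"
# built on one normality criterion, with a binary vs. a ternary output quantizer.
import numpy as np
from scipy import stats

def binary_neuron(sample, threshold=0.05):
    """Binary quantizer: output 1 if normality is rejected, else 0."""
    _, p_value = stats.shapiro(sample)          # Shapiro-Wilk as an example criterion
    return 1 if p_value < threshold else 0

def ternary_neuron(sample, t_low=0.01, t_high=0.10):
    """Ternary quantizer: -1 (reject), +1 (accept), 0 (refuse to decide).
    Thresholds t_low and t_high are hypothetical placeholder values."""
    _, p_value = stats.shapiro(sample)
    if p_value < t_low:
        return -1                               # normality rejected
    if p_value > t_high:
        return +1                               # normality accepted
    return 0                                    # uncertainty zone: third output state

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    small_sample = rng.normal(size=16)          # small sample of 16 experiments
    print(binary_neuron(small_sample), ternary_neuron(small_sample))
```

The third output state gives each neuron an explicit "refuse to decide" zone, which is the informal reason why fewer ternary neurons than binary ones can reach the same confidence level when their outputs are combined.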
