Abstract

Neural networks are widely used for machine learning and data mining tasks, and because data mining problems involve large volumes of data, sampling is essential to the success of such tasks. Radial basis function networks are a representative neural network algorithm known for good prediction accuracy in many applications, but, as with other data mining algorithms, there is no established way to determine a proper sample size for them, so the choice of sample size tends to be arbitrary. As the sample size grows, the error rate improves, but only slowly; at the same time, we cannot simply use ever-larger samples, because accuracy fluctuates depending on the particular sample drawn. This paper proposes a progressive resampling technique to cope with this situation. The proposal is validated by experiments with very promising results.
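The abstract does not specify the details of the technique, but the idea it describes, growing the sample progressively while smoothing out sample-to-sample fluctuation before deciding whether accuracy has stopped improving, can be sketched as below. This is a minimal illustration, not the authors' method: the simple RBF network (random centers, least-squares output weights), the geometric growth schedule, the averaging over resamples, and the stopping tolerance are all assumptions made for the example.

```python
import numpy as np

def rbf_design_matrix(X, centers, gamma=1.0):
    """Gaussian RBF activations of each sample against each center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def fit_rbf(X, y, n_centers=20, gamma=1.0, rng=None):
    """Toy RBF network: random centers, least-squares output weights."""
    rng = rng if rng is not None else np.random.default_rng(0)
    idx = rng.choice(len(X), size=min(n_centers, len(X)), replace=False)
    centers = X[idx]
    w, *_ = np.linalg.lstsq(rbf_design_matrix(X, centers, gamma), y, rcond=None)
    return centers, w

def predict_rbf(X, centers, w, gamma=1.0):
    return rbf_design_matrix(X, centers, gamma) @ w

def progressive_resampling(X, y, X_val, y_val, start=100, growth=2.0,
                           n_resamples=5, tol=1e-3, rng=None):
    """Grow the training sample geometrically. At each size, average the
    validation error over several independent resamples to damp the
    fluctuation that comes from any single sample, and stop once the
    averaged error no longer improves by at least tol."""
    rng = rng if rng is not None else np.random.default_rng(0)
    size, prev_err = start, np.inf
    while size <= len(X):
        errs = []
        for _ in range(n_resamples):
            idx = rng.choice(len(X), size=size, replace=False)
            centers, w = fit_rbf(X[idx], y[idx], rng=rng)
            pred = predict_rbf(X_val, centers, w)
            errs.append(np.mean((pred - y_val) ** 2))
        err = float(np.mean(errs))
        if prev_err - err < tol:
            return size, err  # improvement has leveled off at this size
        prev_err, size = err, int(size * growth)
    return len(X), prev_err   # exhausted the data without leveling off
```

Averaging over several resamples at each candidate size is what makes the stopping decision robust to the accuracy fluctuation the abstract mentions; a single noisy sample could otherwise trigger a premature stop or an unnecessary doubling.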
