Abstract

Traditional probability estimation often demands large amounts of data for problems of industrial scale. Neural networks have been used as an effective alternative for estimating input-output probabilities. In this paper, the certainty-factor-based neural network (CFNet) is explored for probability estimation in discrete domains. A new analysis presented here shows that the basis functions learned by the CFNet can bear precise semantics for dependencies. In a simulation study, the CFNet outperforms both the backpropagation network and a system based on the Rademacher-Walsh expansion. In real-data experiments on splice junction and breast cancer data sets, the CFNet outperforms other neural networks and symbolic systems.
