Abstract

Among the many k-winners-take-all (kWTA) models, the dual neural network kWTA (DNN-kWTA) model requires significantly fewer connections. However, in analog realizations, noise is inevitable and affects the correctness of the kWTA operation. Most existing results focus on the effect of additive noise; this brief studies the effect of time-varying multiplicative input noise. Two scenarios are considered. The first is the bounded-noise case, in which only the noise range is known. The second is the general noise distribution case, in which we either know the noise distribution or have noise samples. For each scenario, we first prove the convergence of the DNN-kWTA model under multiplicative input noise and then provide an efficient method to determine whether a noise-affected DNN-kWTA network performs the correct kWTA process for a given set of inputs. With these two methods, we can efficiently measure the probability that the network performs the correct kWTA process. In addition, for uniformly distributed inputs, we derive two closed-form expressions, one for each scenario, for estimating the probability of correct operation. Finally, we conduct simulations to verify our theoretical results.
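To make the problem setting concrete, the sketch below illustrates (in Python) what "correct kWTA operation under multiplicative input noise" means. It is not the paper's analytical method: the function names, the noise model `u_i * (1 + eps_i)` with `eps_i` uniform on `[-delta, delta]`, and the Monte Carlo estimator are all illustrative assumptions standing in for the paper's bounded-noise scenario and closed-form expressions.

```python
import random

def kwta_winners(inputs, k):
    # Ideal kWTA outcome: indices of the k largest inputs.
    order = sorted(range(len(inputs)), key=lambda i: inputs[i], reverse=True)
    return set(order[:k])

def correct_prob_monte_carlo(inputs, k, delta, trials=10000, seed=0):
    # Estimate the probability that kWTA on noise-corrupted inputs
    # u_i * (1 + eps_i), with eps_i drawn uniformly from [-delta, delta]
    # (an assumed bounded multiplicative noise model), selects the same
    # winners as the noise-free network.
    rng = random.Random(seed)
    ideal = kwta_winners(inputs, k)
    hits = 0
    for _ in range(trials):
        noisy = [u * (1 + rng.uniform(-delta, delta)) for u in inputs]
        if kwta_winners(noisy, k) == ideal:
            hits += 1
    return hits / trials
```

For well-separated positive inputs and a small noise bound, the estimate is 1: the noise cannot reorder the inputs, so the noisy network always selects the correct winners. As the noise bound grows relative to the gap between the k-th and (k+1)-th largest inputs, the estimate drops below 1, which is the quantity the paper's methods compute efficiently rather than by sampling.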
