The demand for products that are more streamlined, more capable, and longer-lasting on battery power is rising. To meet this demand, more components are being integrated onto smaller, more complex chips. The result is higher total power consumption, driven by the power dissipated by dynamic and static currents in integrated circuits (ICs). Estimating power dissipation at an early stage is therefore essential for effective power planning and for the precise placement of power pads and straps by floorplan engineers, and estimation accuracy improves as more information about the design attributes becomes available. Radial basis function neural networks (RBFNNs) offer a coherent framework for a variety of applications, including function approximation, regularization, noisy interpolation, classification, and density estimation. RBFNN training is also faster than training multi-layer perceptron networks. RBFNN learning typically consists of an unsupervised phase for determining the centers and widths of the Gaussian basis functions, followed by a linear supervised phase for computing the weights. This study investigates several learning techniques for estimating the centers, widths, and synaptic weights of RBFNNs, examining RBF networks as a classical family of supervised learning algorithms. Two popular regularization techniques are considered: one based on centers found with k-means clustering and one based on the squared norm of the network coefficients. It is shown that each of these RBF techniques can be rewritten as a data-dependent kernel. Owing to their adaptability and shorter training time compared with multi-layer perceptron networks, RBFNNs are a compelling alternative to conventional neural network models. The study presents a theoretical analysis of these techniques along with experimental results, indicating competitive performance and some advantages over traditional kernel methods in terms of adaptability (the ability to take unlabeled data into account) and computational complexity. Recent results on using soft k-means features for image recognition and other tasks are also discussed.
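As a rough illustration of the two-phase RBFNN training described above (not the paper's implementation), the sketch below fixes the Gaussian centers with k-means, sets a shared width from the mean inter-center distance (an illustrative heuristic), and then solves a ridge-regularized least-squares problem for the output weights, which corresponds to penalizing the squared norm of the network coefficients. The function names, the `lam` penalty, and the use of scikit-learn's `KMeans` are assumptions for the example only.

```python
import numpy as np
from sklearn.cluster import KMeans

def fit_rbf(X, y, n_centers=20, lam=1e-3, seed=0):
    """Two-phase RBFNN fit: unsupervised centers/widths, then linear weights.
    Illustrative sketch only, not the method evaluated in the paper."""
    # Phase 1 (unsupervised): centers from k-means; a common heuristic sets
    # the width to the mean distance between distinct centers.
    km = KMeans(n_clusters=n_centers, n_init=10, random_state=seed).fit(X)
    centers = km.cluster_centers_
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    sigma = d[d > 0].mean()  # single shared width (simplifying assumption)

    # Gaussian design matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 sigma^2))
    sq = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    Phi = np.exp(-sq / (2.0 * sigma ** 2))

    # Phase 2 (supervised): ridge-regularized linear solve for the weights,
    # i.e. minimize ||Phi w - y||^2 + lam * ||w||^2.
    w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(n_centers), Phi.T @ y)
    return centers, sigma, w

def predict_rbf(X, centers, sigma, w):
    sq = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2)) @ w
```

Because the fitted model is linear in the basis expansion Phi with a squared-norm penalty on w, this formulation makes explicit the connection to kernel ridge regression with a data-dependent kernel that the abstract refers to.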