Abstract

Probabilistic and neural approaches, through their incorporation of nonlinearities and compression of states, enable a broader sampling of the phase space. This approach is very effective for a broad set of complex problems encountered in conventional computation. In these pattern-oriented tasks, a fluctuation in the size of the data is akin to a thermal fluctuation. A thermodynamic view therefore applies naturally to this style of information processing, and from this reasoning one may estimate a variety of interesting consequences for computing: (a) energy efficiencies, (b) the complexity of tasks that can be tackled, (c) inaccuracies in inference, and (d) limitations arising from the incompleteness of inputs and models. We employ toy-model examples to reflect on these themes and establish the following. (i) A dissipation minimum can be predicted from the average information discarded, under the constraints of minimizing energy and maximizing information preservation and entropy. Analogous to the $k_B T \ln 2$ cost of randomizing a bit, the $\sim\!-70\;\mathrm{mV}$ base and $\sim\!+40\;\mathrm{mV}$ peak spike potentials then follow as a natural consequence of the constraints of a biological neural environment. Non-biological, that is, physical, implementations can be analyzed by a similar approach in noisy, variability-prone thermodynamic settings. (ii) In drawing inferences, resorting to Occam's razor, the statistical equivalent of choosing the simplest and fewest axioms in developing a theory, conflicts with Mencken's rule (for every complex problem there is an answer that is clear, simple, and wrong) as a reflection of dimensionality reduction. (iii) Between these two factors, one can bound the inference error in terms of the average information discarded and the average information filled in, and (iv) this lets one predict the upper limits of the information-processing rate under constraints. These observations point to what may be achievable with neural and probabilistic computation when physically implemented as a statistical information-mechanic engine that avoids computation via deterministic linear algebra.
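To make the $k_B T \ln 2$ comparison concrete, the following is a minimal numeric sketch (illustrative only, not from the paper): it takes an assumed physiological temperature of 310 K and the $-70\;\mathrm{mV}$/$+40\;\mathrm{mV}$ figures quoted above, computes the Landauer bound for randomizing one bit, and expresses the spike voltage swing in units of the thermal voltage $k_B T / e$.

```python
# Illustrative sketch (assumptions: T = 310 K; spike potentials as quoted
# in the abstract). Compares the Landauer bound k_B*T*ln(2) for randomizing
# one bit against the neural spike voltage swing from -70 mV to +40 mV.
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
e   = 1.602176634e-19   # elementary charge, C
T   = 310.0             # assumed physiological temperature, K

landauer_J  = k_B * T * math.log(2)   # minimum dissipation per bit, J
landauer_eV = landauer_J / e          # same bound expressed in eV

v_thermal = k_B * T / e               # thermal voltage k_B*T/e, V
swing     = 0.040 - (-0.070)          # spike swing from base to peak, V

print(f"Landauer bound at {T:.0f} K : {landauer_J:.3e} J ({landauer_eV*1e3:.1f} meV)")
print(f"Thermal voltage k_B*T/e    : {v_thermal*1e3:.1f} mV")
print(f"Spike swing -70 -> +40 mV  : {swing*1e3:.0f} mV (~{swing/v_thermal:.1f} k_B*T/e)")
```

Under these assumptions the Landauer bound comes out near $18.5\;\mathrm{meV}$ per bit and the $110\;\mathrm{mV}$ spike swing is only about four thermal voltages, which is one way to read the abstract's claim that the biological operating point sits a small multiple above the $k_B T \ln 2$ floor.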
