Abstract

A loss function is proposed for solving box-constrained inverse problems. Given causality mechanisms between inputs and outputs as smooth functions, an inverse problem requires adjusting the input levels so that the output levels come as close as possible to the target values; box-constrained refers to the requirement that all output levels remain within their respective permissible intervals. A feasible solution is assumed to be known; in practice this is often the status quo. We propose a loss function that avoids activation of the constraints. A practical advantage of this approach over the usual weighted least squares is that permissible output intervals are required in place of target importance weights, which facilitates data acquisition. The proposed loss function is smooth and strictly convex with closed-form gradient and Hessian, permitting Newton-family algorithms. The author has been unable to locate the Gibbs distribution corresponding to the loss function in the literature. The loss function is closely related to the generalized matching law in psychology.
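The abstract does not reproduce the loss itself. Purely as an illustration of the kind of object described (a smooth, strictly convex loss with closed-form gradient and Hessian that diverges at the interval endpoints, so minimization from a feasible start never activates the box constraints), the following Python sketch uses a hypothetical log-barrier term and a linear forward model. None of the names or formulas below are taken from the paper; they are assumptions for illustration only.

```python
import numpy as np

# Hypothetical per-output barrier term (NOT the paper's construction): for a
# target t strictly inside the permissible interval (l, u),
#     phi(y) = -(u - t)*log(u - y) - (t - l)*log(y - l)
# is smooth and strictly convex on (l, u), diverges at both endpoints (so a
# descent method started from a feasible point never activates the box
# constraints), and attains its minimum exactly at y = t.

def loss_grad_hess(y, t, l, u):
    """Value, gradient, and diagonal Hessian of the barrier loss, per output."""
    val = -(u - t) * np.log(u - y) - (t - l) * np.log(y - l)
    grad = (u - t) / (u - y) - (t - l) / (y - l)
    hess = (u - t) / (u - y) ** 2 + (t - l) / (y - l) ** 2
    return val.sum(), grad, hess

def newton_solve(A, b, t, l, u, x0, iters=50, tol=1e-10):
    """Damped Newton iteration for a linear forward model y = A @ x + b,
    assuming A has full column rank and x0 maps strictly inside the box."""
    x = x0.astype(float).copy()
    for _ in range(iters):
        y = A @ x + b
        _, g, h = loss_grad_hess(y, t, l, u)
        grad_x = A.T @ g                     # chain rule: d(loss)/dx
        hess_x = A.T @ (h[:, None] * A)      # exact Hessian for linear f
        step = np.linalg.solve(hess_x, grad_x)
        alpha = 1.0                          # backtrack to stay feasible
        while True:
            y_new = A @ (x - alpha * step) + b
            if np.all(y_new > l) and np.all(y_new < u):
                break
            alpha *= 0.5
        x -= alpha * step
        if alpha * np.linalg.norm(step) < tol:
            break
    return x

# Tiny usage example: two inputs, three outputs, status quo x0 = 0.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 2))
b = np.zeros(3)
l, u = -5.0 * np.ones(3), 5.0 * np.ones(3)   # permissible output intervals
t = np.array([1.0, -2.0, 0.5])               # targets, strictly inside (l, u)
x_star = newton_solve(A, b, t, l, u, x0=np.zeros(2))
print(x_star, A @ x_star + b)
```

With a nonlinear smooth forward model, the same scheme applies with the Jacobian of f in place of A, plus the second-order term of f in the Hessian.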
