Abstract
This paper presents a deterministic solution to an approximated classification-error-based objective function. In the formulation, a quadratic function is proposed to approximate the step function used for error counting, yielding a smooth count of classification errors. The resulting solution is shown to be related to weighted least-squares, into which a robust tuning process can be incorporated. The tuning traverses between the least-squares estimate and the approximated total-error-rate estimate to cater for various situations of unbalanced attribute distributions. Adopting a linear parametric classifier model, the proposed classification-error-based learning formulation is empirically shown to be superior to one using the original least-squares-error cost function. Finally, the performance of the proposed formulation is shown to be comparable to that of other classification-error-based and state-of-the-art classifiers without sacrificing computational simplicity.
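To make the closed-form character of the approach concrete, the following is a minimal sketch of a two-class, linear-classifier version of the idea: the step-function error count is replaced by a quadratic penalty around class-specific shifted targets, and per-class normalization turns the objective into a weighted least-squares problem with a closed-form solution. The threshold and offset symbols (tau, eta), the per-class 1/m weighting, and the ridge term are illustrative assumptions not taken from the abstract, and the tuning that traverses between the plain least-squares and total-error-rate estimates is omitted.

```python
import numpy as np

# Sketch (under assumptions): a linear classifier g(x) = x^T alpha is trained by
# minimizing a quadratic approximation to the total error rate. The shifted targets
# (tau -/+ eta) and the 1/m_class normalization are assumed details for illustration.

def fit_approx_ter(X_pos, X_neg, tau=0.5, eta=0.1, reg=1e-6):
    """Closed-form weighted least-squares solution of a quadratic error-rate proxy.

    X_pos, X_neg : (m_pos, d) and (m_neg, d) feature matrices (bias column included).
    tau, eta     : decision threshold and offset defining the shifted class targets.
    reg          : small ridge term for numerical stability (an added assumption).
    """
    m_pos, d = X_pos.shape
    m_neg, _ = X_neg.shape

    # Dividing each class's quadratic penalty by its sample count makes the
    # objective approximate an error *rate* per class rather than a raw count,
    # which helps under unbalanced class distributions.
    A = (X_neg.T @ X_neg) / m_neg + (X_pos.T @ X_pos) / m_pos + reg * np.eye(d)
    b = (X_neg.T @ np.full(m_neg, tau - eta)) / m_neg \
      + (X_pos.T @ np.full(m_pos, tau + eta)) / m_pos
    return np.linalg.solve(A, b)

def predict(X, alpha, tau=0.5):
    # Classify as positive when the linear score reaches the threshold tau.
    return (X @ alpha >= tau).astype(int)

# Usage example on synthetic two-class data (bias column appended).
rng = np.random.default_rng(0)
X_pos = np.hstack([rng.normal(1.0, 1.0, (100, 2)), np.ones((100, 1))])
X_neg = np.hstack([rng.normal(-1.0, 1.0, (120, 2)), np.ones((120, 1))])
alpha = fit_approx_ter(X_pos, X_neg)
print(predict(X_pos, alpha).mean(), predict(X_neg, alpha).mean())
```

Because training reduces to a single linear solve, such a formulation keeps the computational simplicity of ordinary least-squares while directly targeting an error-rate-style objective, which is consistent with the deterministic, closed-form solution described in the abstract.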