Abstract

Learning in a Deep Neural Network (DNN) identifies the geometric structure of the data; hence there is a need for topological measures that quantify this geometric structure. Based on the transformations in a DNN, a metric is introduced, together with conditions under which it generates a base for a suitable topology. The locally indiscrete nature of this topology is applied, via the Tychonoff separation axioms, to quantify the generalization gap in DNN approximations. This yields a new measure, T2 complexity, which quantifies a model's ability to distinguish classes irrespective of class imbalance and of the decision threshold applied. The efficiency of the new measure on a real dataset, for both binary and multiclass classification problems, is compared with the commonly used classification metrics: Accuracy, Precision, Recall, F1 score, Matthews Correlation Coefficient, AUC-ROC score, and AUC-PR score. The experimental analysis shows that the proposed measure gives a necessary condition for an optimal classification model. The study shows that T2 complexity can be included as a performance metric for analyzing a model's ability to distinguish classes in any binary or multiclass classification problem where the existing measures give misleading results due to class imbalance.
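The measure's name refers to the T2 (Hausdorff) axiom among the Tychonoff separation axioms cited above: a topological space (X, τ) is T2 if for every pair of distinct points x, y ∈ X there exist disjoint open sets U, V ∈ τ with x ∈ U and y ∈ V. The abstract does not state the formula for T2 complexity, so it cannot be reproduced here; what can be illustrated is the comparison it describes. The sketch below (an illustration using scikit-learn, not the paper's code; the dataset and classifier are invented for the example) computes the listed baseline metrics for a majority-class classifier on a 5% positive binary problem:

    # Illustrative only: the baseline metrics the abstract compares against,
    # evaluated on a deliberately imbalanced binary problem. The paper's
    # T2 complexity measure is NOT implemented here, since the abstract
    # does not specify its formula.
    import numpy as np
    from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                                 f1_score, matthews_corrcoef, roc_auc_score,
                                 average_precision_score)

    rng = np.random.default_rng(0)
    n_pos, n_neg = 50, 950                            # 5% positives
    y_true = np.concatenate([np.ones(n_pos), np.zeros(n_neg)]).astype(int)

    # A degenerate classifier: every score falls below the 0.5 threshold,
    # so it always predicts the majority (negative) class.
    y_score = rng.uniform(0.0, 0.4, size=y_true.size)
    y_pred = (y_score >= 0.5).astype(int)

    print("Accuracy :", accuracy_score(y_true, y_pred))              # ~0.95
    print("Precision:", precision_score(y_true, y_pred, zero_division=0))
    print("Recall   :", recall_score(y_true, y_pred))                # 0.0
    print("F1 score :", f1_score(y_true, y_pred, zero_division=0))   # 0.0
    print("MCC      :", matthews_corrcoef(y_true, y_pred))           # 0.0
    print("AUC-ROC  :", roc_auc_score(y_true, y_score))              # ~0.5
    print("AUC-PR   :", average_precision_score(y_true, y_score))    # ~0.05

Here Accuracy alone suggests a strong model (about 0.95) even though no positive example is ever identified, while Recall, F1, and MCC collapse to zero: exactly the kind of threshold- and imbalance-sensitive disagreement among existing metrics that motivates a separation-based measure.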

