Abstract

This paper addresses feature selection for gene expression profiles, where fuzzy neighborhood rough sets provide an effective framework. Extracting key features from gene expression profiles suffers from two potential drawbacks: (1) it may retain many redundant features, thereby reducing classification accuracy, and (2) it may ignore relevant information. To address these problems, this paper proposes a feature selection model based on fuzzy neighborhood joint entropy, which exploits the nonnegativity of fuzzy neighborhood joint entropy to evaluate the importance of a candidate feature gene. On the basis of this model, a new feature gene selection algorithm, the fuzzy neighborhood joint entropy (FNJE) algorithm, is proposed. First, fuzzy neighborhood granules and the fuzzy decision are combined with the joint-entropy uncertainty measure to construct the fuzzy neighborhood joint entropy model. Second, a feature significance degree is introduced as the criterion for evaluating each candidate feature gene. Third, the algorithm uses this significance to guide selection, reducing redundancy among the selected features and improving classification accuracy. A series of experiments conducted on UCI and gene expression datasets shows that the proposed algorithm selects fewer feature genes while achieving higher classification accuracy.
Compared with the other four algorithms, the proposed algorithm improves accuracy by 0.4%–10.42% and 1.16%–15.18% and reaches maximum accuracy values of 90.52% and 87.02% with the linear-SVM and KNN classifiers, respectively.
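The greedy loop described in the abstract (evaluate each candidate feature's importance as the nonnegative increase in fuzzy neighborhood joint entropy, keep the best candidate, and stop when no candidate adds information) can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the granule and entropy definitions here are simplified assumptions (Chebyshev distance, a linearly decaying similarity, a crisp decision relation), and all function names are our own.

```python
import numpy as np

def fuzzy_similarity(X, delta):
    """Fuzzy neighborhood relation: similarity decays linearly with the
    Chebyshev distance over the given features, reaching 0 beyond delta.
    (Illustrative choice; the paper may use a different relation.)"""
    if X.shape[1] == 0:                       # empty subset: everything is similar
        return np.ones((X.shape[0], X.shape[0]))
    d = np.max(np.abs(X[:, None, :] - X[None, :, :]), axis=2)
    return np.clip(1.0 - d / delta, 0.0, 1.0)

def fuzzy_joint_entropy(R_feat, R_dec):
    """Fuzzy neighborhood joint entropy of a feature subset and the decision:
    average -log2 of the relative cardinality of each joint granule."""
    n = R_feat.shape[0]
    card = np.minimum(R_feat, R_dec).sum(axis=1) / n
    return -np.mean(np.log2(np.maximum(card, 1e-12)))

def fnje_select(X, y, delta=0.2, eps=1e-6):
    """Greedy forward selection. A candidate's significance is the
    (nonnegative) increase in joint entropy it yields; stopping when no
    candidate still adds information keeps redundant features out."""
    n, m = X.shape
    R_dec = (y[:, None] == y[None, :]).astype(float)   # crisp decision relation
    selected, remaining = [], list(range(m))
    H_cur = fuzzy_joint_entropy(fuzzy_similarity(X[:, selected], delta), R_dec)
    while remaining:
        sig, best = max(
            (fuzzy_joint_entropy(
                fuzzy_similarity(X[:, selected + [f]], delta), R_dec) - H_cur, f)
            for f in remaining
        )
        if sig < eps:                          # no candidate adds information
            break
        selected.append(best)
        remaining.remove(best)
        H_cur += sig
    return selected
```

Because adding a feature can only shrink the fuzzy granules, the joint entropy is nondecreasing, so each candidate's significance is nonnegative; a redundant feature (one whose granules are already implied by the selected subset) scores near zero and is never picked.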
