Feature selection techniques aim to find a relevant subset of the original features that facilitates clustering, classification, and retrieval. Feature selection is an important research topic in pattern recognition and machine learning. Its methods fall into two classes: supervised and unsupervised. Current research concentrates mostly on supervised methods; few efficient unsupervised feature selection methods have been developed, because no label information is available and the selected features are therefore difficult to evaluate. This paper proposes an unsupervised feature selection method based on extended entropy. The information loss derived from extended entropy is used to measure the correlation between features. The method ensures that each selected feature carries a large amount of individual information while sharing little redundant information with the features already selected. Finally, the efficiency of the proposed method is illustrated on several practical datasets.
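The selection criterion described in the abstract (prefer features with high individual information, penalize redundancy with already-selected features) can be sketched as a greedy procedure. The paper's exact extended-entropy and information-loss definitions are not given in this record, so the sketch below substitutes plain Shannon entropy for individual information and average mutual information for redundancy; all function names are illustrative, not the authors' API.

```python
import numpy as np
from collections import Counter

def entropy(x):
    """Shannon entropy (in bits) of a discrete sequence."""
    counts = np.array(list(Counter(x).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_info(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from samples."""
    joint = list(zip(x, y))
    return entropy(x) + entropy(y) - entropy(joint)

def greedy_entropy_selection(X, k):
    """Greedily pick k columns of X: high individual entropy,
    low average mutual information with columns already selected."""
    n_features = X.shape[1]
    selected = []
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            relevance = entropy(X[:, j])          # individual information
            redundancy = (np.mean([mutual_info(X[:, j], X[:, s])
                                   for s in selected])
                          if selected else 0.0)   # overlap with chosen set
            score = relevance - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

# Toy illustration: feature 1 duplicates feature 0, so the criterion
# skips it in favor of the independent (if less informative) feature 2.
rng = np.random.default_rng(0)
a = rng.integers(0, 4, 200)        # informative feature
b = a.copy()                       # fully redundant copy of a
c = rng.integers(0, 2, 200)        # independent, lower-entropy feature
X = np.column_stack([a, b, c])
sel = greedy_entropy_selection(X, 2)
```

The greedy trade-off (relevance minus redundancy) is in the spirit of minimum-redundancy maximum-relevance selection; the paper's contribution is the extended-entropy measure itself, which this sketch does not reproduce.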
Published: 2019-04-26
Classification: Knowledge and Information Engineering, unsupervised feature selection, extended entropy, information loss, correlation value, 68T01
@article{cai2019_1_223,
     author = {Zhanquan Sun and Feng Li and Huifen Huang},
     title = {Study on Unsupervised Feature Selection Method Based on Extended Entropy},
     journal = {Computing and Informatics},
     volume = {37},
     number = {6},
     year = {2019},
     language = {en},
     url = {http://dml.mathdoc.fr/item/cai2019_1_223}
}
Zhanquan Sun (University of Shanghai for Science and Technology, Shanghai); Feng Li (Department of History, College of Liberal Arts, Shanghai University, Shanghai); Huifen Huang (Shandong Yingcai University, Shandong). Study on Unsupervised Feature Selection Method Based on Extended Entropy. Computing and Informatics, Volume 37 (2019), no. 6. http://gdmltest.u-ga.fr/item/cai2019_1_223/