ISSN 0253-2778

CN 34-1054/N

Open Access  JUSTC Original Paper

Unsupervised feature selection method based on adaptive locality preserving projection

Cite this:
https://doi.org/10.3969/j.issn.0253-2778.2018.04.004
  • Received Date: 25 May 2017
  • Revised Date: 24 June 2017
  • Publish Date: 30 April 2018
  • Abstract: Unsupervised feature selection methods based on spectral graphs construct their graphs in the original high-dimensional data space, which makes them easily disturbed by noise and redundant features. To overcome these deficiencies, an unsupervised feature selection method based on adaptive locality preserving projection is proposed. A global linear regression function is used to build the feature selection model, and adaptive locality preserving projection is adopted to improve the model's accuracy. An l2,1-norm constraint is then added to improve the distinguishability of different features and to avoid noise interference. Comparisons with several state-of-the-art feature selection methods demonstrate the effectiveness of the proposed method.
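The abstract above mentions an l2,1-norm constraint used to distinguish informative features and suppress noise. As a minimal, purely illustrative sketch (not the paper's actual model; the projection matrix W is random and the helper names are made up for the example), the NumPy snippet below shows how the l2,1-norm is computed and how features are commonly ranked by the l2 norms of the rows of a learned projection matrix in l2,1-regularized feature selection.

```python
import numpy as np

def l21_norm(W):
    """l2,1-norm of W: the sum of the Euclidean (l2) norms of its rows."""
    return np.linalg.norm(W, axis=1).sum()

def feature_scores(W):
    """Score the i-th feature by the l2 norm of the i-th row of W."""
    return np.linalg.norm(W, axis=1)

# Hypothetical usage: in l2,1-regularized methods W would be learned by
# minimizing a regression / locality-preserving objective; here it is random
# purely to show the scoring and selection step.
rng = np.random.default_rng(0)
W = rng.standard_normal((100, 10))        # 100 features projected to 10 dimensions
scores = feature_scores(W)
selected = np.argsort(scores)[::-1][:20]  # indices of the 20 highest-scoring features
print("l2,1-norm of W:", l21_norm(W))
print("selected feature indices:", selected)
```

Because the l2,1 penalty sums row-wise l2 norms, it drives entire rows of W toward zero, so the features whose rows retain large norms are the ones selected; this row-sparsity behavior is the standard motivation for l2,1-based feature selection (see, e.g., refs [1], [7], [12]).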
References

[1] NIE F, CAI X, HUANG H, et al. Efficient and robust feature selection via joint l2,1-norms minimization[C]// Advances in Neural Information Processing Systems. Chicago: IEEE Press, 2010: 1813-1821.
[2] YANG Y, MA Z, HAUPTMANN A G, et al. Feature selection for multimedia analysis by sharing information among multiple tasks[J]. IEEE Transactions on Multimedia, 2013, 15(3): 661-669.
[3] CHANG X, NIE F, YANG Y, et al. A convex formulation for semi-supervised multi-label feature selection[C]// 28th AAAI Conference on Artificial Intelligence. Québec, Canada: AAAI Press, 2014: 1171-1177.
[4] WANG X D, YAN F, XIE Y, et al. A feature selection method based on sparse graph representation[J]. Computer Engineering & Science, 2015, 37(12): 2372-2378. (王晓栋, 严菲, 谢勇, 等. 基于稀疏图表示的特征选择方法研究[J]. 计算机工程与科学, 2015, 37(12): 2372-2378.)
[5] REN Y, ZHANG G, YU G, et al. Local and global structure preserving based feature selection[J]. Neurocomputing, 2012, 89(10): 147-157.
[6] HE X, CAI D, NIYOGI P. Laplacian score for feature selection[J]. Advances in Neural Information Processing Systems, 2005, 18: 507-514.
[7] YANG Y, SHEN H T, MA Z, et al. l2,1-norm regularized discriminative feature selection for unsupervised learning[C]// International Joint Conference on Artificial Intelligence. Barcelona, Spain: AAAI Press, 2011: 1589-1594.
[8] LI Z, YANG Y, LIU J, et al. Unsupervised feature selection using nonnegative spectral analysis[C]// 26th AAAI Conference on Artificial Intelligence. New York: AAAI Press, 2012: 1026-1032.
[9] ZHAO Z, LIU H. Spectral feature selection for supervised and unsupervised learning[C]// Proceedings of the 24th International Conference on Machine Learning. Corvallis, USA: ACM Press, 2007: 1151-1157.
[10] CAI D, ZHANG C, HE X. Unsupervised feature selection for multi-cluster data[C]// ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Washington: ACM Press, 2010: 333-342.
[11] WANG X D, ZHANG X, ZENG Z, et al. Unsupervised spectral feature selection with l2-norm graph[J]. Neurocomputing, 2016, 200(C): 47-54.
[12] HOU C, NIE F, LI X, et al. Joint embedding learning and sparse regression: A framework for unsupervised feature selection[J]. IEEE Transactions on Cybernetics, 2014, 44(6): 793-804.
[13] ZHANG Z, BAI L, LIANG Y, et al. Unsupervised feature selection by graph optimization[C]// Image Analysis and Processing - ICIAP 2015. Cham: Springer, 2015.
[14] HE X, NIYOGI P. Locality preserving projections (LPP)[J]. Advances in Neural Information Processing Systems, 2002, 16(1): 186-197.
[15] NIE F, WANG X, HUANG H. Clustering and projected clustering with adaptive neighbors[C]// ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Québec, Canada: ACM Press, 2014: 977-986.
[16] LYONS M J, BUDYNEK J, AKAMATSU S. Automatic classification of single facial images[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1999, 21(12): 1357-1362.