ISSN 0253-2778

CN 34-1054/N

Open Access JUSTC Original Paper

Dimensionality reduction method of graph kernel based on KLDA

Cite this: https://doi.org/10.3969/j.issn.0253-2778.2016.09.006
  • Received Date: 01 March 2016
  • Accepted Date: 17 September 2016
  • Revised Date: 17 September 2016
  • Publish Date: 30 September 2016
  • Graph structures are highly expressive and flexible, and the identification and classification of graph-structured data fall within the scope of structural pattern recognition. The common research approach is to map graph-structured data to vectors in a vector space and then analyze those vectors with traditional machine learning algorithms; data representation and analysis based on graph structures has therefore become an active research topic in machine learning. In this work the classical graph kernel method was extended: kernel linear discriminant analysis (KLDA) was employed to reduce the dimensionality of the high-dimensional feature space induced by the graph kernel, yielding a low-dimensional feature space corresponding to the original graph-structured data, and traditional machine learning algorithms were then applied to the resulting representations. The effectiveness of the proposed method is verified by experimental results on standard data sets; a minimal code sketch of the KLDA step is given below.
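To make the pipeline concrete, the following is a minimal sketch, not the authors' implementation: it assumes a symmetric graph kernel (Gram) matrix has already been computed elsewhere (for example by a shortest-path or Weisfeiler-Lehman graph kernel), solves the multi-class kernel Fisher discriminant eigenproblem on that matrix, projects each graph into the resulting low-dimensional space, and feeds the projections to an ordinary classifier (k-nearest neighbours here). A toy RBF Gram matrix on random vectors stands in for the graph kernel so the example runs end to end; the function name klda_fit, the ridge parameter reg, and the classifier choice are illustrative assumptions rather than details taken from the paper.

import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import KNeighborsClassifier


def klda_fit(K, y, reg=1e-3):
    """Kernel Fisher discriminant analysis on a precomputed kernel matrix.

    K   : (n, n) symmetric kernel (Gram) matrix over the training graphs.
    y   : (n,) integer class labels.
    reg : ridge term added to the within-class scatter for numerical stability.
    Returns A, an (n, C-1) matrix of expansion coefficients; the embedding of
    the training graphs is K @ A.
    """
    n = K.shape[0]
    classes = np.unique(y)
    m_all = K.mean(axis=1)                 # mean kernel column over all samples

    M = np.zeros((n, n))                   # between-class scatter (coefficient space)
    N = np.zeros((n, n))                   # within-class scatter (coefficient space)
    for c in classes:
        idx = np.where(y == c)[0]
        n_c = idx.size
        K_c = K[:, idx]                    # kernel columns of class c, shape (n, n_c)
        m_c = K_c.mean(axis=1)
        d = (m_c - m_all)[:, None]
        M += n_c * (d @ d.T)
        H = np.eye(n_c) - np.full((n_c, n_c), 1.0 / n_c)   # center within the class
        N += K_c @ H @ K_c.T
    N += reg * np.eye(n)

    # Generalized eigenproblem M a = lambda N a; keep the C-1 leading directions.
    _, V = eigh(M, N)
    return V[:, ::-1][:, : len(classes) - 1]


# Toy stand-in for a graph kernel matrix: an RBF Gram matrix on random vectors.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (30, 5)), rng.normal(2.0, 1.0, (30, 5))])
y = np.array([0] * 30 + [1] * 30)
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq_dists)

A = klda_fit(K, y)
Z = K @ A                                  # low-dimensional embedding of each "graph"

clf = KNeighborsClassifier(n_neighbors=3).fit(Z, y)
print("training accuracy:", clf.score(Z, y))

The ridge term added to the within-class scatter is the usual regularization in kernel Fisher discriminant analysis: expressed over kernel expansion coefficients, the scatter matrix is rank-deficient, so a small multiple of the identity keeps the generalized eigenproblem well conditioned.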



    Article Metrics

    Article views (30), PDF downloads (83)