ISSN 0253-2778

CN 34-1054/N

Open Access JUSTC Original Paper

Split and merge algorithm for Gaussian mixture model based on KS test

Cite this: https://doi.org/10.3969/j.issn.0253-2778.2018.06.006
  • Received Date: 20 September 2017
  • Accepted Date: 10 April 2018
  • Revised Date: 10 April 2018
  • Publish Date: 30 June 2018
  • The Gaussian mixture model is a linear combination of a finite number of independent Gaussian models. Estimating the number of components is an important research problem. One class of algorithms, based on the minimum description length, determines the number of components by splitting and merging components during the iterations. Traditional algorithms use the entropy ratio, KL divergence, and model similarity as split and merge criteria. However, the entropy ratio and KL divergence may cause excessive splitting because of their excessive sensitivity to sparse or concave models, and model similarity may cause excessive merging because it cannot assess how well the merged model fits a Gaussian. During the iterations, such excessive splitting and merging can cause oscillations. To address these problems, a split and merge algorithm for the Gaussian mixture model based on the KS (Kolmogorov-Smirnov) test is proposed: the entropy ratio and the KS test serve as split criteria, and model similarity and the KS test serve as merge criteria. Experiments on seven datasets show that the algorithm prevents excessive splitting and merging.
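
As a rough illustration of the KS-test gating idea in the abstract, the sketch below (Python; the helper ks_gaussian_fit, the significance level alpha, and the chi-squared reference for squared Mahalanobis distances are this sketch's assumptions, not the paper's exact procedure) tests whether the points assigned to one component are plausibly drawn from a single Gaussian:

    import numpy as np
    from scipy import stats

    def ks_gaussian_fit(points, mean, cov, alpha=0.05):
        # Under the hypothesis that `points` come from N(mean, cov), their
        # squared Mahalanobis distances follow a chi-squared distribution
        # with d degrees of freedom; the KS test compares the empirical
        # distances against that reference distribution.
        d = points.shape[1]
        diff = points - mean
        sq_maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
        _, p_value = stats.kstest(sq_maha, 'chi2', args=(d,))
        return p_value >= alpha, p_value  # True: passes the Gaussian fit

    # One Gaussian cluster should pass; a component stretched over two
    # separated clusters should fail, flagging it as a split candidate.
    rng = np.random.default_rng(0)
    blob = rng.multivariate_normal([0, 0], np.eye(2), size=500)
    two = np.vstack([blob, rng.multivariate_normal([6, 6], np.eye(2), size=500)])
    for name, x in [('single blob', blob), ('two blobs', two)]:
        passes, p = ks_gaussian_fit(x, x.mean(axis=0), np.cov(x.T))
        print(name, passes, round(p, 4))

Under the scheme the abstract describes, such a check could veto an entropy-ratio split when the component already fits a Gaussian, and veto a similarity-based merge when the pooled points of the two components fail it. Note that estimating mean and cov from the same sample biases the KS p-value (a Lilliefors-type correction would be more rigorous); the sketch only illustrates the gating idea.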
  • [1]
    BISHOP C M. Pattern recognition[J]. Machine Learning, 2006, 128: 1-58.
    [2]
    MURPHY K P. Machine Learning: A Probabilistic Perspective[M]. Cambridge, MA: The MIT Press, 2012.
    [3]
    BLEI D M, JORDAN M I. Variational inference for Dirichlet process mixtures[J]. Bayesian Analysis, 2006, 1(1): 121-143.
    [4]
    SHIFFRIN R M, CHANDRAMOULI S H, GRNWALD P D. Bayes factors, relations to minimum description length, and overlapping model classes[J]. Journal of Mathematical Psychology, 2016, 72: 56-77.
    [5]
    RISSANEN J. Minimum description length principle[M]. John Wiley & Sons, Inc., 1985.
    [6]
    COVES T F, HRUSCHKA E R. Splitting and merging Gaussian mixture model components: An evolutionary approach[C]// 2011 10th International Conference on Machine Learning and Applications and Workshops. Piscataway, NY, USA: IEEE Press, 2011, 1: 106-111.
    [7]
    SCRUCCA L. Identifying connected components in Gaussian finite mixture models for clustering[J]. Computational Statistics & Data Analysis, 2016, 93:5-17.
    [8]
    LI Y, LI L. A greedy merge learning algorithm for Gaussian mixture model[C]// International Symposium on Intelligent Information Technology Application. Piscataway, NY, USA: IEEE Press, 2009:506-509.
    [9]
    LI Y, LI L. A novel split and merge EM algorithm for Gaussian mixture model[C]// 2009 Fifth International Conference on Natural Computation. Piscataway, NY, USA: IEEE Press, 2009, 6: 479-483.
    [10]
    COVES T F, HRUSCHKA E R, GHOSH J. Evolving Gaussian mixture models with splitting and merging mutation operators.[J]. Evolutionary Computation, 2016, 24(2):293.
    [11]
    ZHU R, WANG L, ZHAI C, et al. High-dimensional variance-reduced stochastic gradient expectation-maximization algorithm[J]. Proceedings of Machine Learning Research, 2017, 70: 4180-4188.
    [12]
    ASUNCION A, NEWMAN D H. UCI machine learning repository[EB/OL]. [2017-08-20]. http://archive.ics.uci.edu/ml/index.php.
    [13]
    COVER T M, THOMAS J A. Elements of Information Theory[M]. John Wiley & Sons, Inc., 1991.
    [14]
    MASSEY JR F J. The Kolmogorov-Smirnov test for goodness of fit[J]. Journal of the American Statistical Association, 1951, 46(253): 68-78.
    [15]
    LI R, PERNECZKY R, DRZEZGA A, et al. Survival analysis, the infinite Gaussian mixture model, FDG-PET and non-imaging data in the prediction of progression from mild cognitive impairment[J]. arXiv preprint arXiv:1512.03955, 2015.
    [16]
    LECUN Y, CORTES C, BURGES C. THE MNIST DATABASE of handwritten digits[DB/OL]. [2017-08-20]. http://yann.lecun.com/exdb/mnist/.)

