ISSN 0253-2778

CN 34-1054/N

Open Access JUSTC Original Paper

A focal-loss-based CGAN ensemble classification method for imbalanced data scenarios

Cite this:
https://doi.org/10.3969/j.issn.0253-2778.2020.07.014
  • Received Date: 25 May 2020
  • Accepted Date: 27 June 2020
  • Revised Date: 27 June 2020
  • Publish Date: 31 July 2020
  • For imbalanced data scenarios, an ensemble classification method, CGAN-focal-loss, was investigated; it combines conditional generative adversarial networks (CGAN) with gradient boosting decision trees (GBDT). The method first reduces the imbalance ratio by generating minority-class samples with CGAN, and then further improves classification performance by combining the weight balancing of the focal loss with the GBDT algorithm to increase the focus on minority-class samples. The properties of the method were investigated and several theoretical results were obtained: it was proved that, under certain conditions, the empirical conditional distribution generated by CGAN converges to the conditional distribution of the corresponding population; that the empirical risk of the focal-loss-based CGAN method converges to the expected risk; and that the estimator of the method converges to the function that minimizes the expected risk. The experimental results demonstrate the good performance of the CGAN-focal-loss method.
  • [1]
    AKBANI R, KWEK S, JAPKOWICZ N. Applying support vector machine to imbalanced datasets[C]// Machine Learning: ECML 2004. Berlin: Springer, 2004: 39-50.
    [2]
    MAZUROWSKI M A, HABAS P A, ZURADA J M, et al. Training neural network classifiers for medical decision making: The effects of imbalanced datasets on classification performance[J]. Neural Networks, 2008, 21(2-3): 427-436.
    [3]
    TAVALLAEE M, STAKHANOVA N, GHORBANI A A. Toward credible evaluation of anomaly-based intrusion-detection methods[J]. IEEE Transactions on Systems, Man, and Cybernetics, Part C, 2010, 40(5): 516-524.
    [4]
    BERMEJO P, GÓMEZ J A, PUERTA J M. Improving the performance of Naive Bayes multinomial in e-mail foldering by introducing distribution-based balance of datasets[J]. Expert Systems with Applications, 2011, 38(3): 2072-2080.
    [5]
    WEI W, LI J, CAO L, et al. Effective detection of sophisticated online banking fraud on extremely imbalanced data[J]. World Wide Web, 2013, 16: 449-475.
    [6]
    KERDPRASOP K, KERDPRASOP N. A data mining approach to automate fault detection model development in the semiconductor manufacturing process[J]. International Journal of Mechanics, 2011, 5(4): 336-344.
    [7]
    CHAWLA N V, JAPKOWICZ N, KOTCZ A. Editorial: Special issue on learning from imbalanced data sets[J]. ACM SIGKDD Explorations Newsletter, 2004, 6(1): 1-6.
    [8]
    DOUZAS G, BACAO F. Effective data generation for imbalanced learning using conditional generative adversarial networks[J]. Expert Systems with Applications, 2018, 91: 464-471.
    [9]
    CHAWLA N V. Data mining for imbalanced datasets: An overview[C]// Data Mining and Knowledge Discovery Handbook. Berlin: Springer, 2009: 875-886.
    [10]
    CHAWLA N V, BOWYER K W, HALL L O, et al. SMOTE: Synthetic minority over-sampling technique[J]. Journal of Artificial Intelligence Research, 2002, 16(1): 321-357.
    [11]
    HAN H, WANG W Y, MAO B H. Borderline-SMOTE: A new over-sampling method in imbalanced data sets learning[C]// Advances in Intelligent Computing: ICIC 2005. Berlin: Springer, 2005: 878-887.
    [12]
    HE H, BAI Y, GARCIA E A, et al. ADASYN: Adaptive synthetic sampling approach for imbalanced learning[C]// 2008 IEEE International Joint Conference on Neural Networks. IEEE, 2008: 1322-1328.
    [13]
    BATISTA G E, PRATI R C, MONARD M C. A study of the behavior of several methods for balancing machine learning training data[J]. ACM SIGKDD Explorations Newsletter, 2004, 6(1): 20-29.
    [14]
    FAN W, STOLFO S J, ZHANG J, et al. AdaCost: Misclassification cost-sensitive boosting[C]// Proceedings of the 16th International Conference on Machine Learning. Bled, Slovenia: Morgan Kaufmann, 1999: 97-105.
    [15]
    WOZNIAK M. Hybrid Classifiers: Methods of Data, Knowledge, and Classifier Combination[M]. Berlin: Springer, 2013.
    [16]
    MARIANI G, SCHEIDEGGER F, ISTRATE R, et al. BAGAN: Data augmentation with balancing GAN[DB/OL]. [2020-05-01]. https://arxiv.org/abs/1803.09655.
    [17]
    RADFORD A, METZ L, CHINTALA S. Unsupervised representation learning with deep convolutional generative adversarial networks[DB/OL]. [2020-05-01]. https://arxiv.org/abs/1511.06434v1.
    [18]
    GAO Y, JIAO Y, WANG Y, et al. Deep generative learning via variational gradient flow[DB/OL]. [2020-05-01]. https://arxiv.org/abs/1901.08469.
    [19]
    ZHANG Y. Deep generative model for multi-class imbalanced learning[DB]// Open Access Master’s Theses, 2018: Paper 1277.
    [20]
    LIN T Y, GOYAL P, GIRSHICK R, et al. Focal loss for dense object detection[C]// 2017 IEEE International Conference on Computer Vision (ICCV). IEEE, 2017: 2980-2988.
    [21]
    FAWCETT T. An introduction to ROC analysis[J]. Pattern Recognition Letters, 2006, 27(8): 861-874.
    [22]
    ZHAO H X, SHI H B, WU J, et al. Research on imbalanced learning based on conditional generative adversarial networks[J/OL]. Control and Decision, 2019: https://doi.org/10.13195/j.kzyjc.2019.0522.
    [23]
    MO Z, GAI Y R, FAN G L. Credit card fraud classification based on the GAN-AdaBoost-DT imbalanced classification algorithm[J]. Journal of Computer Applications, 2019, 39(2): 618-622.
    [24]
    LI Y J, GUO H X, LI Y N, et al. A boosting-based ensemble learning algorithm for the classification of imbalanced data[J]. Systems Engineering - Theory & Practice, 2016, 36(1): 189-199.
    [25]
    GOODFELLOW I, POUGET-ABADIE J, MIRZA M, et al. Generative adversarial nets[J]. Advances in Neural Information Processing Systems, 2014, 27: 2672-2680.
    [26]
    SALANT S W, SWITZER S, REYNOLDS R J. Losses from horizontal merger: The effects of an exogenous change in industry structure on Cournot-Nash equilibrium[J]. The Quarterly Journal of Economics, 1983, 98(2): 185-199.
    [27]
    MIRZA M, OSINDERO S. Conditional generative adversarial nets[C]// Proceedings of the Neural Information Processing Systems Deep Learning Workshop, 2014.
    [28]
    SRIVASTAVA N, HINTON G, KRIZHEVSKY A, et al. Dropout: A simple way to prevent neural networks from overfitting[J]. Journal of Machine Learning Research, 2014, 15(1): 1929-1958.
    [29]
    BERGSTRA J, YAMINS D, COX D D. Hyperopt: A Python library for optimizing the hyperparameters of machine learning algorithms[C]// Proceedings of the 12th Python in Science Conference, Austin, TX, 2013.
    [30]
    LI H. Statistical Learning Methods[M]. Beijing: Tsinghua University Press, 2012.
    [31]
    KINGMA D P, BA J. Adam: A method for stochastic optimization[C]// The 3rd International Conference on Learning Representations, San Diego, CA, 2015.
    [32]
    BIAU G, CADRE B, SANGNIER M, et al. Some theoretical properties of GANs[DB/OL]. [2020-05-01]. https://arxiv.org/abs/1803.07819.
    [33]
    TSYBAKOV A B. Introduction to Nonparametric Estimation[M]. Berlin: Springer, 2008.
    [34]
    VAN DER VAART A W, WELLNER J A. Weak Convergence and Empirical Processes[M]. Berlin: Springer, 2000.
    [35]
    GINÉ E, NICKL R. Mathematical Foundations of Infinite-Dimensional Statistical Models[M]. Cambridge: Cambridge University Press, 2015.
    [36]
    FRIEDMAN J H. Greedy function approximation: A gradient boosting machine[J]. Annals of Statistics, 2001, 29(5): 1189-1232.

