ISSN 0253-2778

CN 34-1054/N

Open Access JUSTC Original Paper

DEA based production planning considering attainability and management goals with undesirable outputs

Cite this:
https://doi.org/10.3969/j.issn.0253-2778.2020.07.011
  • Received Date: 20 May 2020
  • Accepted Date: 18 June 2020
  • Rev Recd Date: 18 June 2020
  • Publish Date: 31 July 2020
  • The problem of production planning in a centralized decision-making environment usually involves the participation of all subunits, each of which contributes to the total production. When making a production plan, the central decision-maker needs to determine the input and output quantities of each subunit based on the changes in product demand and available resources that are known or predictable for the next quarter. On the one hand, because the level of production technology is relatively stable in the short term, it is unrealistic to increase production greatly, so the attainability of the production plan must be considered: the closer the next quarter's plan is to the current quarter's production status, the easier it is to achieve. On the other hand, in previous production planning research the central decision-maker did not formulate targeted management goals for each subunit, so the resulting plans could not fully reflect managers' expectations; management goals should therefore be taken into account, with the production plan required to be as close to them as possible. In addition, the outputs of the production process can be divided into desirable and undesirable outputs. From these perspectives, a production planning method based on data envelopment analysis (DEA) for decision-making units with undesirable outputs is proposed that considers both the attainability of the production plan and management goals set in advance in a centralized decision-making environment. The proposed model is illustrated with real data from 32 paper mills along the Huaihe River in Anhui Province, China.
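The abstract describes planning each subunit's inputs, desirable outputs, and undesirable outputs relative to its current production status via a DEA model. As a rough illustration of the underlying idea, the sketch below projects one subunit onto the frontier spanned by convex combinations of the observed units, keeping its input no larger and its desirable output no smaller than current levels while minimizing the planned undesirable output. The toy data, the single-input/single-output setting, and this particular objective are illustrative assumptions for the sketch, not the paper's exact formulation, which additionally encodes attainability and management-goal deviations.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data for 4 subunits: one input, one desirable and one undesirable output
# (hypothetical numbers, chosen so that subunit 4 is dominated).
X = np.array([2.0, 4.0, 6.0, 5.0])  # inputs
Y = np.array([1.0, 3.0, 4.0, 3.0])  # desirable outputs
B = np.array([1.0, 2.0, 3.0, 4.0])  # undesirable outputs

def plan_for_unit(o):
    """Plan for subunit o as a convex combination of the observed units
    (variable returns to scale): input no larger and desirable output no
    smaller than the current levels, minimising the undesirable output."""
    n = len(X)
    c = B                       # objective: minimise B @ lam
    A_ub = np.vstack([X, -Y])   # X @ lam <= X[o]  and  Y @ lam >= Y[o]
    b_ub = np.array([X[o], -Y[o]])
    A_eq = np.ones((1, n))      # sum(lam) == 1  (convexity)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * n)
    lam = res.x
    return X @ lam, Y @ lam, B @ lam  # planned input, good output, bad output

# Subunit 4 (index 3) currently uses input 5 for outputs (3, 4); it is
# dominated by subunit 2, so the plan lowers its input and bad output.
x_p, y_p, b_p = plan_for_unit(3)
```

In the full model of the paper, the objective would instead penalize deviations of the plan from the current production status (attainability) and from the pre-set management goals, but the feasible region — convex combinations of observed units — follows the same DEA pattern.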

