ISSN 0253-2778

CN 34-1054/N

Open Access | JUSTC | Information Science

A cognitive diagnostic framework for computer science education based on probability graph model

Funds: The Key Research Project for Teaching of Anhui Province (2019jyxm0001); Research Project for Teaching of Anhui Province (2020jyxm2304).
Cite this: https://doi.org/10.52396/JUST-2020-0007
More Information
  • Author Bio:

    Hu Xinying is currently a PhD student in the Department of Computer Software and Theory at the University of Science and Technology of China, under the supervision of Prof. Sun Guangzhong. Her research focuses on educational data mining.

    He Yu is currently a PhD student at the University of Science and Technology of China, under the supervision of Prof. Sun Guangzhong. Her research mainly focuses on educational data mining.

  • Corresponding author: Sun Guangzhong received his PhD degree in Computer Software and Theory from the University of Science and Technology of China, where he is currently a professor. His research interests include high performance computing, algorithm optimization, and big data processing. E-mail: gzsun@ustc.edu.cn
  • Publish Date: 31 January 2021
  • Abstract: A new cognitive diagnostic framework was proposed to evaluate students' theoretical and practical abilities in computer science education. Based on a probability graph model, students' coding ability was introduced, and their theoretical and practical abilities were modeled jointly. A parallel optimization algorithm was also proposed to train the model efficiently. Experimental results on multiple data sets show that the proposed model achieves significant improvements in MAE and RMSE over competing methods, providing more accurate and comprehensive analysis results for computer science education.
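    For context, MAE (mean absolute error) and RMSE (root mean squared error) are the standard score-prediction metrics referred to above. A minimal statement of their definitions follows; the notation is ours rather than the paper's: $r_{ij}$ is the observed score of student $i$ on exercise $j$, $\hat{r}_{ij}$ is the score predicted by the diagnostic model, and $N$ is the number of observed student-exercise pairs.

    % Standard definitions of the reported metrics (notation assumed for illustration):
    \mathrm{MAE} = \frac{1}{N} \sum_{(i,j)} \left| r_{ij} - \hat{r}_{ij} \right|,
    \qquad
    \mathrm{RMSE} = \sqrt{ \frac{1}{N} \sum_{(i,j)} \left( r_{ij} - \hat{r}_{ij} \right)^{2} }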
  • [1]
    Kulkarni C E, Bernstein M S, Klemmer S R. PeerStudio: Rapid peer feedback emphasizes revision and improves performance. Proceedings of the Second ACM Conference on Learning @ Scale. Vancouver, Canada: ACM, 2015: 75-84.
    [2]
    Leighton J, Gierl M. Cognitive Diagnostic Assessment for Education: Theory and Applications. Cambridge, UK: Cambridge University Press, 2007.
    [3]
    Dibello L V, Roussos L A, Stout W. A review of cognitively diagnostic assessment and a summary of psychometric models. Handbook of Statistics, 2006, 26: 979-1030.
    [4]
    Haertel E. An application of latent class models to assessment data. Applied Psychological Measurement, 1984, 8(3): 333-346.
    [5]
    Junker B W, Sijtsma K. Cognitive assessment models with few assumptions, and connections with nonparametric item response theory. Applied Psychological Measurement, 2001, 25(3): 258-272.
    [6]
    De La Torre J. The generalized DINA model framework. Psychometrika, 2011, 76(2): 179-199.
    [7]
    De La Torre J, Douglas J A. Higher-order latent trait models for cognitive diagnosis. Psychometrika, 2004, 69(3): 333-353.
    [8]
    Embretson S E, Reise S P. Item Response Theory. New York: Psychology Press, 2013.
    [9]
    Wu R, Liu Q, Liu Y, et al. Cognitive modelling for predicting examinee performance. Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence. Buenos Aires, Argentina: ACM, 2015: 1017-1024.
    [10]
    Gu J, Wang Y, Heffernan N T. Personalizing knowledge tracing: Should we individualize slip, guess, prior or learn rate? International Conference on Intelligent Tutoring Systems. Springer, 2014: 647-648.
    [11]
    Leony D, Pardo A, De La Fuente Valentín L, et al. GLASS: A learning analytics visualization tool. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge. Vancouver, Canada: ACM, 2012: 162-163.
    [12]
    Toscher A, Jahrer M. Collaborative filtering applied to educational data mining. KDD Cup, 2010.
    [13]
    Thai-Nghe N, Drumond L, Krohn-Grimberghe A, et al. Recommender system for predicting student performance. Procedia Computer Science, 2010, 1(2): 2811-2819.
    [14]
    Díez J, Luaces Ó, Alonso-Betanzos A, et al. Peer assessment in MOOCs using preference learning via matrix factorization. NIPS Workshop on Data Driven Education. 2013.
    [15]
    Desmarais M C. Mapping question items to skills with nonnegative matrix factorization. ACM SIGKDD Explorations Newsletter, 2012, 13(2): 30-36.
    [16]
    Sun Y, Ye S, Inoue S, et al. Alternating recursive method for Q-matrix learning. Proceedings of the 7th International Conference on Educational Data Mining. London: ACM, 2014: 14-19.
    [17]
    Thai-Nghe N, Schmidt-Thieme L. Multi-relational factorization models for student modeling in intelligent tutoring systems. Seventh International Conference on Knowledge and Systems Engineering. Ho Chi Minh City, Vietnam: IEEE, 2015: 61-66.
    [18]
    Cen H, Koedinger K, Junker B. Learning factors analysis–a general method for cognitive model evaluation and improvement. International Conference on Intelligent Tutoring Systems. Springer, 2006: 164-175.
    [19]
    Baker R S J D, Corbett A T, Aleven V. More accurate student modeling through contextual estimation of slip and guess probabilities in Bayesian knowledge tracing. International Conference On Intelligent Tutoring Systems. Springer, 2008: 406-415.
    [20]
    Mnih A, Salakhutdinov R R. Probabilistic matrix factorization. Advances in Neural Information Processing Systems. 2008: 1257-1264.
    [21]
    Liu Q, Wu R, Chen E, et al. Fuzzy cognitive diagnosis for modelling examinee performance. ACM Transactions on Intelligent Systems and Technology, 2018, 9(4): 1-26.

    Article Metrics

    Article views: 417; PDF downloads: 494