Recurrent Neural Network-FitNets: Improving Early Prediction of Student Performance by Time-Series Knowledge Distillation

Research output: Contribution to journal › Article › Peer-reviewed

1 Citation (Scopus)

Abstract

This study improves the early prediction of student performance with RNN-FitNets, which applies knowledge distillation (KD) along the time-series direction of a recurrent neural network (RNN) model. RNN-FitNets replaces the teacher model in KD with “an RNN model with a long-term time series, into which the features from the entire course are inputted,” and the student model in KD with “an RNN model with a short-term time series, into which only the features from the early stages are inputted.” As a result, the early-stage RNN model is trained to output the same results as the more accurate later-stage RNN model. The experiment compared RNN-FitNets with a normal RNN model on a dataset of 296 university students. The results showed that RNN-FitNets can improve early prediction. Moreover, SHAP values were employed to explain the contribution of the input features to the prediction results of RNN-FitNets. It was shown that RNN-FitNets can take the future effects of the input features into account from the early stages of the course.
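To illustrate the kind of time-series distillation the abstract describes, the following is a minimal sketch assuming a PyTorch implementation: a teacher RNN that sees the full course of weekly features and a student RNN that sees only the early weeks, with the student trained against both the true labels and the teacher's soft predictions. All names, layer sizes, and hyperparameters are illustrative assumptions, and the sketch uses output-level distillation; the actual RNN-FitNets may instead match intermediate hidden states in the FitNets style.

```python
# Hypothetical sketch of time-series knowledge distillation (not the paper's code).
import torch
import torch.nn as nn

class GradePredictor(nn.Module):
    """Simple RNN that predicts performance from a sequence of weekly features."""
    def __init__(self, n_features, hidden_size=32, n_classes=5):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_classes)

    def forward(self, x):                      # x: (batch, weeks, n_features)
        _, h = self.rnn(x)                     # h: (1, batch, hidden_size)
        return self.head(h.squeeze(0))         # logits: (batch, n_classes)

def distillation_step(teacher, student, full_seq, early_seq, labels,
                      optimizer, alpha=0.5, temperature=2.0):
    """One training step: the student (early weeks only) is pushed toward
    the teacher's soft predictions (all weeks) plus the ground-truth labels."""
    with torch.no_grad():
        teacher_logits = teacher(full_seq)     # teacher sees the whole course
    student_logits = student(early_seq)        # student sees early weeks only

    hard_loss = nn.functional.cross_entropy(student_logits, labels)
    soft_loss = nn.functional.kl_div(
        nn.functional.log_softmax(student_logits / temperature, dim=1),
        nn.functional.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

    loss = alpha * hard_loss + (1 - alpha) * soft_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch, `full_seq` would hold all weeks of a course and `early_seq` only the first few weeks for the same students, so the student learns in the early stage to approximate predictions that normally require later-stage data.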

Original language: English
Journal: Journal of Educational Computing Research
DOI
Publication status: In press - 2022

All Science Journal Classification (ASJC) codes

  • Education
  • Computer Science Applications
