Performance Prediction and Importance Analysis Using Transformer

Akiyoshi Satake, Hironobu Fujiyoshi, Takayoshi Yamashita, Tsubasa Hirakawa, Atsushi Shimada

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The growth of online education has made it easier to capture learner activity, and detailed feedback to learners is expected to lead to better performance. Predicting learner performance is an important step toward providing such feedback. Methods based on classical machine learning and on RNNs that take time-series information into account have been proposed. In this paper, we propose a Transformer-based performance prediction method that aims to improve accuracy and to extract important activities. The proposed method achieves more accurate performance prediction than conventional methods. In addition, by analyzing the rationale of the Transformer, we found that NEXT, SEARCH_JUMP, and LINK_CLICK are important behaviors.
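
The abstract gives no implementation details, so the following is only a minimal illustrative sketch (PyTorch) of a Transformer encoder that classifies a learner's activity-log sequence into a performance class. It is not the authors' method: the activity vocabulary, model sizes, mean pooling, and two-class output are assumptions made here for illustration.

```python
import torch
import torch.nn as nn

class ActivityPerformancePredictor(nn.Module):
    """Transformer encoder that classifies a padded sequence of activity ids."""
    def __init__(self, vocab_size, d_model=64, nhead=4, num_layers=2,
                 num_classes=2, max_len=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, tokens, pad_mask=None):
        # tokens: (batch, seq_len) integer ids of logged learner activities
        positions = torch.arange(tokens.size(1), device=tokens.device)
        x = self.embed(tokens) + self.pos(positions)
        h = self.encoder(x, src_key_padding_mask=pad_mask)
        if pad_mask is not None:
            # mean-pool over non-padded positions before classification
            keep = (~pad_mask).unsqueeze(-1).float()
            pooled = (h * keep).sum(dim=1) / keep.sum(dim=1).clamp(min=1.0)
        else:
            pooled = h.mean(dim=1)
        return self.head(pooled)

# Hypothetical activity vocabulary; NEXT, SEARCH_JUMP, and LINK_CLICK are the
# behaviors the abstract reports as important.
vocab = {"<pad>": 0, "NEXT": 1, "PREV": 2, "SEARCH_JUMP": 3,
         "LINK_CLICK": 4, "BOOKMARK": 5}
model = ActivityPerformancePredictor(vocab_size=len(vocab))
batch = torch.tensor([[1, 1, 3, 4, 2, 0, 0]])  # one padded activity sequence
pad_mask = batch.eq(0)                         # True where padding
logits = model(batch, pad_mask)                # shape: (1, num_classes)
```

The abstract attributes the importance analysis to the Transformer's rationale; inspecting attention weights is one common way to perform such an analysis, though the authors' exact procedure is not reproduced here.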

Original language: English
Title of host publication: 29th International Conference on Computers in Education Conference, ICCE 2021 - Proceedings
Editors: Maria Mercedes T. Rodrigo, Sridhar Iyer, Antonija Mitrovic, Hercy N. H. Cheng, Dan Kohen-Vacs, Camillia Matuk, Agnieszka Palalas, Ramkumar Rajendran, Kazuhisa Seta, Jingyun Wang
Publisher: Asia-Pacific Society for Computers in Education
Pages: 538-543
Number of pages: 6
ISBN (Electronic): 9789869721486
Publication status: Published - Nov 22 2021
Event: 29th International Conference on Computers in Education Conference, ICCE 2021 - Virtual, Online
Duration: Nov 22 2021 - Nov 26 2021

Publication series

Name: 29th International Conference on Computers in Education Conference, ICCE 2021 - Proceedings
Volume: 2

Conference

Conference: 29th International Conference on Computers in Education Conference, ICCE 2021
City: Virtual, Online
Period: 11/22/21 - 11/26/21

All Science Journal Classification (ASJC) codes

  • Computer Science (miscellaneous)
  • Education
