Towards an Appropriate Query, Key, and Value Computation for Knowledge Tracing

Youngduck Choi, Youngnam Lee, Junghyun Cho, Jineon Baek, Byung-soo Kim, Yeongmin Cha, Dongmin Shin, Chan Bae, Jaewe Heo

2020-02-14 · Collaborative Filtering · Knowledge Tracing

Abstract

Knowledge tracing, the act of modeling a student's knowledge through learning activities, is an extensively studied problem in the field of computer-aided education. Although models with attention mechanisms have outperformed traditional approaches such as Bayesian knowledge tracing and collaborative filtering, they share two limitations. First, the models rely on shallow attention layers and fail to capture complex relations among exercises and responses over time. Second, different combinations of queries, keys, and values for the self-attention layer in knowledge tracing have not been extensively explored. The usual practice of using exercises as queries and interactions (exercise-response pairs) as keys/values lacks empirical support. In this paper, we propose a novel Transformer-based model for knowledge tracing, SAINT: Separated Self-AttentIve Neural Knowledge Tracing. SAINT has an encoder-decoder structure in which the exercise and response embedding sequences separately enter the encoder and the decoder, respectively, allowing attention layers to be stacked multiple times. To the best of our knowledge, this is the first work to suggest an encoder-decoder model for knowledge tracing that applies deep self-attentive layers to exercises and responses separately. Empirical evaluations on a large-scale knowledge tracing dataset show that SAINT achieves state-of-the-art performance in knowledge tracing, improving AUC by 1.8% over the current state-of-the-art models.
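The separated encoder-decoder design described in the abstract can be illustrated with a short sketch. Below is a minimal, hypothetical PyTorch rendering of the idea (exercise embeddings entering the encoder, response embeddings entering the decoder, with stacked self-attention layers and causal masks); the class name `SAINTSketch` and all hyperparameters are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of a SAINT-style encoder-decoder knowledge-tracing model.
# Assumes PyTorch; hyperparameters and names are illustrative placeholders.
import torch
import torch.nn as nn


class SAINTSketch(nn.Module):
    def __init__(self, n_exercises, d_model=256, n_heads=8, n_layers=4, max_len=100):
        super().__init__()
        # Exercise IDs feed the encoder; binary responses feed the decoder.
        self.exercise_emb = nn.Embedding(n_exercises, d_model)
        self.response_emb = nn.Embedding(2, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=n_heads,
            num_encoder_layers=n_layers, num_decoder_layers=n_layers,
            batch_first=True,
        )
        self.out = nn.Linear(d_model, 1)  # probability of a correct response

    def forward(self, exercises, responses):
        # exercises, responses: (batch, seq_len) integer tensors.
        # In practice the response stream is shifted (with a start token) so
        # that step t does not see its own label; omitted here for brevity.
        seq_len = exercises.size(1)
        pos = torch.arange(seq_len, device=exercises.device)
        enc_in = self.exercise_emb(exercises) + self.pos_emb(pos)
        dec_in = self.response_emb(responses) + self.pos_emb(pos)
        # Causal masks keep each position from attending to future steps.
        mask = self.transformer.generate_square_subsequent_mask(seq_len).to(exercises.device)
        h = self.transformer(enc_in, dec_in, src_mask=mask, tgt_mask=mask, memory_mask=mask)
        return torch.sigmoid(self.out(h)).squeeze(-1)
```

Keeping the exercise and response streams in separate encoder and decoder stacks, rather than interleaving them in a single shallow attention layer, is the design choice the paper credits for capturing deeper exercise-response relations.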

Results

Task              | Dataset | Metric  | Value  | Model
Knowledge Tracing | EdNet   | AUC     | 0.7811 | SAINT
Knowledge Tracing | EdNet   | Acc (%) | 73.68  | SAINT
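As a rough illustration of how the table's numbers are typically produced, the snippet below computes AUC and accuracy from per-interaction correctness labels and predicted probabilities using scikit-learn; the toy arrays are made-up placeholders, not the paper's EdNet predictions.

```python
# Sketch of metric computation for knowledge tracing, assuming scikit-learn.
import numpy as np
from sklearn.metrics import roc_auc_score, accuracy_score

y_true = np.array([1, 0, 1, 1, 0])            # observed responses (1 = correct)
y_prob = np.array([0.9, 0.3, 0.6, 0.8, 0.4])  # model-predicted probabilities

auc = roc_auc_score(y_true, y_prob)
acc = accuracy_score(y_true, (y_prob >= 0.5).astype(int)) * 100  # in percent
print(f"AUC={auc:.4f}  Acc={acc:.2f}%")
```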

Related Papers

SGCL: Unifying Self-Supervised and Supervised Learning for Graph Recommendation (2025-07-17)
Personalized Exercise Recommendation with Semantically-Grounded Knowledge Tracing (2025-07-15)
NLGCL: Naturally Existing Neighbor Layers Graph Contrastive Learning for Recommendation (2025-07-10)
From ID-based to ID-free: Rethinking ID Effectiveness in Multimodal Collaborative Filtering Recommendation (2025-07-08)
RAG-VisualRec: An Open Resource for Vision- and Text-Enhanced Retrieval-Augmented Generation in Recommendation (2025-06-25)
CogGen: A Learner-Centered Generative AI Architecture for Intelligent Tutoring with Programming Video (2025-06-25)
LLM2Rec: Large Language Models Are Powerful Embedding Models for Sequential Recommendation (2025-06-16)
CORONA: A Coarse-to-Fine Framework for Graph-based Recommendation with Large Language Models (2025-06-14)