YE Junyao, SU Jingyong, WANG Yaowei, XU Yong. A Long-term Memory Prediction Model for Language Learning via LSTM[J]. Journal of Chinese Information Processing, 2022, 36(12): 133-138, 148.
A Long-term Memory Prediction Model for Language Learning via LSTM
YE Junyao1, SU Jingyong1,2, WANG Yaowei2, XU Yong1
1. School of Computer Science and Technology, Harbin Institute of Technology (Shenzhen), Shenzhen, Guangdong 518055, China; 2. Institute of Vision Intelligence, Peng Cheng Laboratory, Shenzhen, Guangdong 518055, China
Abstract: Spaced repetition is a common mnemonic method in language learning. To determine review intervals that achieve a desired memory effect, it is necessary to predict learners' long-term memory. This paper proposes a long-term memory prediction model for language learning via LSTM. We extract statistical features and sequence features from learners' memory behavior history, use an LSTM to model the memory behavior sequence, and apply half-life regression to predict the probability that a foreign-language learner recalls a word. Evaluated on 9 billion real memory behavior records, the sequence features prove more informative than the statistical features. Compared with state-of-the-art models, the proposed LSTM-HLR model reduces prediction error by 50%.
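The half-life regression formulation referenced in the abstract can be sketched as follows. This is a minimal illustration, assuming the standard HLR recall model in which recall probability decays exponentially (base 2) with the time elapsed since the last review, scaled by the item's estimated half-life; the function name and parameters are illustrative, not the paper's implementation.

```python
import math

def recall_probability(delta_days: float, half_life_days: float) -> float:
    """Half-life regression recall model: the probability of recalling an
    item is p = 2 ** (-delta / h), where delta is the time since the last
    review and h is the item's current half-life. At delta = h the
    probability is exactly 0.5, and at delta = 0 it is 1.0."""
    return 2.0 ** (-delta_days / half_life_days)

# With a half-life of 7 days, recall probability after 7 days is 0.5.
print(recall_probability(7.0, 7.0))   # → 0.5
print(recall_probability(0.0, 7.0))   # → 1.0
```

In the LSTM-HLR setting described above, the half-life itself would be predicted from the learner's memory behavior sequence rather than fixed; this sketch only shows how a predicted half-life converts into a recall probability.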