Abstract
Sequential recommendation algorithms based on the self-attention mechanism have shown a strong ability to capture global features of user interaction sequences and are widely used. However, only a subset of key behaviors in an interaction sequence plays a decisive role in the evolution of the user's future behavior; the remaining redundant, noisy behaviors degrade recommendation accuracy. Moreover, a single-scale self-attention mechanism struggles to capture user behavior at different granularities. This paper proposes a multi-scale self-attention sequential recommendation algorithm based on behavior pathways. At different granularities, it dynamically captures the behavior evolution patterns that are decisive for the final recommendation and masks redundant, non-key behaviors, improving the user experience of the recommender system. The model is compared with methods of the same type on three public datasets; the experimental results show that the proposed algorithm improves over the baseline methods on different evaluation metrics, verifying its effectiveness.
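The abstract's core idea (attending over the interaction sequence at several granularities while a learned gate keeps only the key "pathway" behaviors) can be illustrated with a minimal PyTorch sketch. The class name, the use of a straight-through Gumbel-softmax for the hard gate, the window-based scales, and all dimensions are illustrative assumptions, not the paper's exact architecture:

```python
# Hypothetical sketch: multi-scale self-attention over a user's interaction
# sequence, with a learned binary gate that keeps only "pathway" (key)
# behaviors and zeros out the rest. Not the paper's actual implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PathwayMultiScaleAttention(nn.Module):
    """Toy multi-scale self-attention with a hard behavior-pathway gate."""

    def __init__(self, d_model=64, n_heads=2, scales=(1, 4, 8), tau=1.0):
        super().__init__()
        self.scales = scales  # local window sizes = attention granularities
        self.tau = tau        # Gumbel-softmax temperature
        self.gate = nn.Linear(d_model, 2)  # per-position keep/drop logits
        self.attn = nn.ModuleList(
            [nn.MultiheadAttention(d_model, n_heads, batch_first=True)
             for _ in scales]
        )
        self.fuse = nn.Linear(d_model * len(scales), d_model)

    @staticmethod
    def _local_mask(seq_len, window, device):
        # Boolean mask restricting each position to a local window,
        # which yields one granularity of the multi-scale attention.
        idx = torch.arange(seq_len, device=device)
        dist = (idx[None, :] - idx[:, None]).abs()
        return dist > window  # True = attention is blocked

    def forward(self, x):
        # x: (batch, seq_len, d_model) embedded interaction sequence
        _, n, _ = x.shape
        # Straight-through Gumbel-softmax gate; column 0 acts as "keep".
        keep = F.gumbel_softmax(self.gate(x), tau=self.tau, hard=True)[..., 0]
        x = x * keep.unsqueeze(-1)  # zero out redundant, non-key behaviors
        outs = []
        for window, attn in zip(self.scales, self.attn):
            mask = self._local_mask(n, window, x.device)
            out, _ = attn(x, x, x, attn_mask=mask)
            outs.append(out)
        # Fuse the granularities into a single representation per position.
        return self.fuse(torch.cat(outs, dim=-1))


if __name__ == "__main__":
    model = PathwayMultiScaleAttention()
    seq = torch.randn(8, 50, 64)   # e.g. 8 users, 50 interactions each
    print(model(seq).shape)        # torch.Size([8, 50, 64])
```

The straight-through Gumbel-softmax keeps the keep/drop decision differentiable during training while producing hard masks, which is one common way to realize the "mask redundant behaviors" step described above.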
Key words
sequence recommendation / self-attention mechanism / behavior pathway
Funding
National Natural Science Foundation of China (82160347)