Content of 信息检索与问答 (Information Retrieval and Question Answering) in our journal

  • Information Retrieval and Question Answering
    XIANG Junyi, HU Huijun, LIU Maofu, MAO Ruibin
    2022, 36(3): 109-119.
    To improve the semantic retrieval capability of search engines, this paper proposes a semantic relevancy model for news entities and text. A corpus of 10,000 financial news articles, in which the semantic relatedness between entities in the headline and the body text is manually annotated, has been constructed. The BERTCA (Bidirectional Encoder Representation from Transformers Co-Attention) semantic relevancy computing model is then trained on this corpus. Through the co-attention mechanism, the model captures the semantic matching between an entity and the text: it can not only calculate the degree of correlation between them, but also decide whether they are related according to the semantic relevancy. The experimental results show that the accuracy of the proposed model surpasses 95%, which is better than the state-of-the-art models. (An illustrative code sketch of this co-attention scoring appears after this list.)
  • Information Retrieval and Question Answering
    LI Jianhong, HUANG Yafan, WANG Chengjun, DING Yunxia, ZHENG Wenjun, LI Jianhua, QIAN Fulan, ZHAO Xin
    2022, 36(3): 120-127.
    To further improve recommendation algorithms such as matrix factorization, this paper introduces a Deep Attention Matrix Factorization (DeepAMF) method. First, multi-layer perceptron technology is applied to obtain better feature representations, and the relational information of the original input is captured through a dot-product operation; this component is named Deep Matrix Factorization (DeepMF). Then, a multi-layer attention network is exploited to obtain the user's preference for the item, and a dot-product operation is again applied before the output to obtain the related information of the feature representations; this attention component, combined with DeepMF, constitutes the proposed model. Experiments on four public data sets demonstrate the effectiveness of the proposed algorithm. (An illustrative code sketch of these components appears after this list.)
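
The following is a minimal sketch of how a BERTCA-style co-attention relevancy scorer, as described in the first abstract above, could be assembled in PyTorch with Hugging Face transformers. The module name, the shared encoder, the scaled dot-product affinity, the mean pooling, and the two-way classification head are illustrative assumptions, not the authors' implementation.

```python
# A hedged sketch of a co-attention relevancy scorer between a headline entity
# and a news text; layer choices and pooling are assumptions for illustration.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class CoAttentionRelevancy(nn.Module):
    def __init__(self, bert_name="bert-base-chinese", hidden=768):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)  # shared encoder (assumption)
        self.scale = hidden ** 0.5
        self.classifier = nn.Linear(2 * hidden, 2)        # related / unrelated

    def forward(self, entity_inputs, text_inputs):
        # Token-level representations of the headline entity and the news body.
        e = self.bert(**entity_inputs).last_hidden_state   # (B, Le, H)
        t = self.bert(**text_inputs).last_hidden_state     # (B, Lt, H)

        # Co-attention: each side attends over the other side's tokens.
        affinity = torch.bmm(e, t.transpose(1, 2)) / self.scale           # (B, Le, Lt)
        e_ctx = torch.bmm(affinity.softmax(dim=-1), t)                    # entity attends to text
        t_ctx = torch.bmm(affinity.transpose(1, 2).softmax(dim=-1), e)    # text attends to entity

        # Mean-pool both attended views and score the semantic relevancy.
        pooled = torch.cat([e_ctx.mean(dim=1), t_ctx.mean(dim=1)], dim=-1)
        return self.classifier(pooled).softmax(dim=-1)[:, 1]  # probability of being related

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = CoAttentionRelevancy()
entity = tokenizer(["某银行"], return_tensors="pt", padding=True)
text = tokenizer(["某银行今日发布三季度财报"], return_tensors="pt", padding=True)
print(model(entity, text))  # relevancy score in (0, 1)
```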
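
Below is a similarly hedged sketch of the components described in the second abstract: MLPs for deeper user and item representations, a dot product on the original embeddings for relational information (the DeepMF part), an attention network for the user's preference over the item, and a dot product before the output. The embedding sizes, the attention network, and the fusion of the two scores are assumptions for illustration, not the published DeepAMF architecture.

```python
# A hedged sketch of DeepMF plus an attention component; sizes and the output
# fusion are illustrative assumptions, not the authors' published model.
import torch
import torch.nn as nn

class DeepAMFSketch(nn.Module):
    def __init__(self, n_users, n_items, dim=64):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)
        # DeepMF part: MLPs learn richer user/item feature representations.
        self.user_mlp = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.item_mlp = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        # Multi-layer attention producing the user's preference weight for the item.
        self.attn = nn.Sequential(nn.Linear(2 * dim, dim), nn.Tanh(), nn.Linear(dim, 1))
        self.out = nn.Linear(2, 1)  # fuse the two dot-product scores

    def forward(self, users, items):
        u_raw, i_raw = self.user_emb(users), self.item_emb(items)
        # Dot product on the original input embeddings: relational information.
        raw_score = (u_raw * i_raw).sum(-1, keepdim=True)
        u, i = self.user_mlp(u_raw), self.item_mlp(i_raw)
        # Attention weight: how strongly this user prefers this item's features.
        alpha = torch.sigmoid(self.attn(torch.cat([u, i], dim=-1)))
        # Dot product before the output on the attended deep features.
        deep_score = (u * (alpha * i)).sum(-1, keepdim=True)
        return torch.sigmoid(self.out(torch.cat([raw_score, deep_score], dim=-1))).squeeze(-1)

model = DeepAMFSketch(n_users=1000, n_items=5000)
users = torch.tensor([3, 17])
items = torch.tensor([42, 8])
print(model(users, items))  # predicted preference scores in (0, 1)
```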