Leveraging Title-Abstract Attentive Semantics for Paper Recommendation

Authors

  • Guibing Guo, Northeastern University
  • Bowei Chen, Northeastern University
  • Xiaoyan Zhang, Shenzhen University
  • Zhirong Liu, Huawei Noah's Ark Research Lab
  • Zhenhua Dong, Huawei Noah's Ark Research Lab
  • Xiuqiang He, Huawei Noah's Ark Research Lab

DOI:

https://doi.org/10.1609/aaai.v34i01.5335

Abstract

Paper recommendation aims to provide users with personalized papers of interest. However, most existing approaches treat the title and abstract equally as input when learning the representation of a paper, ignoring their semantic relationship. In this paper, we regard the abstract as a sequence of sentences and propose a two-level attentive neural network to capture: (1) the importance of each word within a sentence, reflecting whether it is semantically close to the words in the title; and (2) the importance of each sentence in the abstract relative to the title, which is often a good summary of the abstract. Specifically, we propose a Long Short-Term Memory (LSTM) network with attention to learn sentence representations, and integrate a Gated Recurrent Unit (GRU) network with a memory network to learn long-term sequential sentence patterns of interacted papers for both user and item (paper) modeling. We conduct extensive experiments on two real datasets and show that our approach outperforms state-of-the-art approaches in terms of accuracy.
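As a rough illustration of the word-level idea described above (not the authors' implementation, which uses learned LSTM states and trained attention parameters), title-guided attention can be sketched in plain Python: each word in an abstract sentence is scored by its similarity to a summary of the title's word vectors, and the softmax-weighted sum of word vectors forms the sentence representation. The mean-pooled title summary and dot-product scoring are simplifying assumptions.

```python
import math


def softmax(scores):
    # Numerically stable softmax over a list of scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]


def dot(u, v):
    return sum(a * b for a, b in zip(u, v))


def title_guided_sentence_repr(sentence_vecs, title_vecs):
    """Title-guided word-level attention (illustrative sketch).

    sentence_vecs: list of word-embedding vectors for one abstract sentence
    title_vecs:    list of word-embedding vectors for the paper title
    Returns a sentence vector weighted toward words that are
    semantically close to the title.
    """
    d = len(title_vecs[0])
    # Title summary: mean of title word embeddings (an assumption;
    # the paper may aggregate the title differently).
    title_summary = [sum(w[i] for w in title_vecs) / len(title_vecs)
                     for i in range(d)]
    # Attention weight per word: softmax of similarity to the title.
    weights = softmax([dot(w, title_summary) for w in sentence_vecs])
    # Sentence representation: attention-weighted sum of word vectors.
    return [sum(a * w[i] for a, w in zip(weights, sentence_vecs))
            for i in range(d)]
```

The same scoring idea extends one level up: sentence representations produced this way can themselves be attention-weighted against the title to build the abstract (document) representation.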

Published

2020-04-03

How to Cite

Guo, G., Chen, B., Zhang, X., Liu, Z., Dong, Z., & He, X. (2020). Leveraging Title-Abstract Attentive Semantics for Paper Recommendation. Proceedings of the AAAI Conference on Artificial Intelligence, 34(01), 67-74. https://doi.org/10.1609/aaai.v34i01.5335

Section

AAAI Technical Track: AI and the Web