Position-Aware Neural Attentive Graph Networks for Multi-hop Question Answering


Recently, graph neural network (GNN) based multi-hop question answering (QA) has been studied extensively, since graph representations can explicitly express rich dependencies in language. However, graph representations suffer from the loss of sequential information and the difficulty of representing global semantic information specific to downstream tasks. In this work, we propose a \textit{query-attention mechanism} that enhances a GNN-based QA system by exploiting both global and local contextual information. We also explore injecting positional information into the graph to compensate for the lost sequential information. We evaluate our approach with Entity Relational-Graph Convolutional Networks \cite{decao2019questionansweringRGCN} on a subset of the WikiHop dataset. We identify a \textit{position bias} in the dataset, and our experiments with an ablation study confirm that the proposed modules improve the baseline's generalization accuracy by 1.43%.
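The two proposed ingredients can be illustrated with a minimal sketch. The function names, the dot-product form of the attention, and the sinusoidal positional scheme are illustrative assumptions, not the report's exact formulation: query-attention here scores each entity node (local context) against a pooled query vector (global context) and fuses the query-aware summary back into every node, while positional injection adds an encoding of each mention's position in the document to its node features.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def query_attention(query, nodes):
    """Hypothetical query-attention: weight each entity node by its
    relevance to the global query vector, then fuse the query-aware
    summary back into every node representation."""
    scores = nodes @ query / np.sqrt(query.shape[-1])   # (num_nodes,)
    weights = softmax(scores)                           # sum to 1
    pooled = weights @ nodes                            # global summary
    return nodes + pooled                               # local + global fusion

def add_positional_encoding(nodes, positions, dim):
    """Inject each mention's document position into its node features;
    sinusoidal encodings are one possible choice of scheme."""
    pe = np.zeros((len(positions), dim))
    div = np.exp(-np.log(10000.0) * np.arange(0, dim, 2) / dim)
    pe[:, 0::2] = np.sin(np.outer(positions, div))
    pe[:, 1::2] = np.cos(np.outer(positions, div))
    return nodes + pe
```

In this sketch both operations preserve the node-feature shape, so they can be dropped in before or between R-GCN layers without changing the rest of the architecture.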

Technical Report
Yifu Qiu
PhD student in Natural Language Processing
