2017 Poster Sessions: Get To The Point: Summarization with Pointer-Generator Networks

Student Name: Abigail See
Advisor: Chris Manning
Research Areas: Artificial Intelligence
Neural sequence-to-sequence models provide a promising new approach to text summarization. Unlike the majority of past approaches, which have been extractive (restricted to selecting and rearranging passages from the original text), the neural approach is abstractive (able to freely generate text). However, the summaries produced by neural networks tend to contain factual inaccuracies. We propose a hybrid abstractive-extractive architecture, the pointer-generator model, which can copy words from the source text via pointing and produce new words via generating. We apply our model to the CNN/Daily Mail summarization task and find that our hybrid network produces summaries containing fewer factual inaccuracies. We outperform the current abstractive state of the art by at least 2 ROUGE points.
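The copy-versus-generate mixture described above can be sketched in a few lines. This is an illustrative toy example, not the paper's implementation: at each decoder step, a soft switch p_gen blends the vocabulary (generation) distribution with a copy distribution built from the attention weights over source positions. All names and numbers here (the tiny vocabulary, the particular probabilities) are hypothetical.

```python
import numpy as np

# Toy sketch of a pointer-generator output distribution:
#   P_final(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum of attention weights
#                over source positions whose token is w.
# Everything below is illustrative, not the paper's actual configuration.

vocab = ["<unk>", "the", "cat", "sat", "mat"]
word2id = {w: i for i, w in enumerate(vocab)}

# A short source sequence, as ids into the vocabulary.
source_ids = [word2id["the"], word2id["cat"], word2id["sat"]]

# Hypothetical decoder quantities at a single timestep:
p_vocab = np.array([0.05, 0.40, 0.10, 0.05, 0.40])  # generation distribution
attention = np.array([0.2, 0.7, 0.1])               # over source positions
p_gen = 0.6                                         # soft copy/generate switch in [0, 1]

def final_distribution(p_vocab, attention, source_ids, p_gen):
    """Mix the generation distribution with the copy (attention) distribution."""
    p_copy = np.zeros_like(p_vocab)
    for pos, idx in enumerate(source_ids):
        # A word that appears at several source positions accumulates mass.
        p_copy[idx] += attention[pos]
    return p_gen * p_vocab + (1 - p_gen) * p_copy

p_final = final_distribution(p_vocab, attention, source_ids, p_gen)
```

Because both component distributions sum to one, the mixture does too; note how attending strongly to a source word ("cat" here) raises its final probability above what generation alone would assign, which is what lets the model reproduce rare or out-of-vocabulary source words faithfully.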

Abi is a second-year PhD student in the Natural Language Processing group, advised by Chris Manning. She is interested in deep learning for language tasks such as machine translation and summarization, and spends time thinking about how to make deep learning more interpretable. Abi obtained her undergraduate degree in Mathematics from Cambridge University and has interned at Microsoft Research and Google Brain.