ConciseNet: An End-to-End Abstractive Model for Topic Generation

Published in IJEAST, 2019

Recent neural approaches to title generation rely on end-to-end deep learning systems built on the sequence-to-sequence model. Such approaches have yielded good results but remain constrained in practice: the fixed input size is often very small relative to the source text, and enlarging it demands substantial compute to train and use. Our approach amalgamates an extractive and an abstractive method to get the best of both worlds, using the TextRank algorithm for the extractive stage and a reasonably small seq2seq architecture for the abstractive stage. Tested on the Amazon Fine Food Reviews dataset, our approach gives good results while using less compute power. We evaluate with the prevailing metrics of ROUGE and cosine similarity; manual inspection shows that the majority of our generated topics are grammatically correct.
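The extractive stage described above can be sketched in plain Python. This is a minimal, self-contained illustration (not the paper's implementation): it scores sentences with a PageRank-style power iteration over a word-overlap similarity graph, as in the standard TextRank formulation, and returns the top-ranked sentences. The function names, the similarity measure, and all parameter values here are illustrative assumptions.

```python
import math
import re

def sentence_similarity(a, b):
    """Word-overlap similarity as in standard TextRank:
    |shared words| / (log|a| + log|b|)."""
    wa, wb = set(a.split()), set(b.split())
    if len(wa) < 2 or len(wb) < 2:
        return 0.0
    return len(wa & wb) / (math.log(len(wa)) + math.log(len(wb)))

def textrank_extract(text, top_k=2, damping=0.85, iters=50):
    """Rank sentences by power iteration over the similarity graph
    and return the top_k sentences in document order."""
    sents = [s.strip().lower()
             for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    n = len(sents)
    if n <= top_k:
        return sents
    # Weighted, undirected similarity graph (zero diagonal).
    sim = [[sentence_similarity(sents[i], sents[j]) if i != j else 0.0
            for j in range(n)] for i in range(n)]
    scores = [1.0 / n] * n
    for _ in range(iters):
        new = []
        for i in range(n):
            rank = 0.0
            for j in range(n):
                out = sum(sim[j])  # total outgoing edge weight of j
                if sim[j][i] > 0 and out > 0:
                    rank += sim[j][i] / out * scores[j]
            new.append((1 - damping) / n + damping * rank)
        scores = new
    top = sorted(range(n), key=lambda i: scores[i], reverse=True)[:top_k]
    return [sents[i] for i in sorted(top)]
```

In the pipeline the paper outlines, the sentences returned here would then be concatenated and fed to the small seq2seq encoder-decoder (omitted above, since it requires a trained model) to produce the final abstractive topic.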