Indian Institute of Science | India
Shreedhar Kodate | Tushar Shinde
Precis/summary writing is an interesting task for humans: after gathering information of all kinds, we compress it into an abstractive summary, a "small text". This small text can then be distributed as a quick source of the important points, and it can also be expanded back into a wealth of detail depending on different requirements. Neural sequence-to-sequence models, combined with techniques such as recurrent networks, pointer-generator networks, coverage vectors, intra-attention, and multi-sentence summary generation, have advanced the state of the art in abstractive summarization. In this project we intend to tweak various existing models to understand how these techniques act as different dimensions of summarization and whether they overlap. We would also like to find out whether there is a "dominating" technique that can take the state of the art to the next level. We will propose a new model, BKA, which is intuitively similar to what humans do as part of their day-to-day learning.
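As background for the techniques listed above, the core idea of a pointer-generator model is to mix a generation distribution over the vocabulary with a copy distribution over the source tokens (given by attention), gated by a scalar p_gen. A minimal NumPy sketch of that mixing step follows; the function name and array shapes are illustrative assumptions, not code from any of the models discussed.

```python
import numpy as np

def pointer_generator_dist(p_vocab, attention, p_gen, src_token_ids):
    """Sketch of the pointer-generator output distribution.

    p_vocab       -- (vocab_size,) generation distribution from the decoder
    attention     -- (src_len,) attention weights over source positions
    p_gen         -- scalar in [0, 1], probability of generating vs. copying
    src_token_ids -- (src_len,) vocabulary id of each source token
    """
    # Start with the scaled generation distribution.
    final = p_gen * p_vocab
    # Scatter-add the copy probability mass onto the source tokens' ids.
    np.add.at(final, src_token_ids, (1.0 - p_gen) * attention)
    return final
```

Because both input distributions sum to one, the mixed output also sums to one, and a source token's probability rises with the attention it receives even if the generator assigns it little mass.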