
Abstract — Abstractive text summarization is more challenging than extractive summarization because it paraphrases the entire content of the text instead of selecting existing sentences. However, it produces a more natural summary with higher inter-sentence cohesion. Recurrent Neural Networks (RNNs) have been applied successfully to abstractive summarization of English and Chinese texts. The Bidirectional Gated Recurrent Unit (BiGRU) RNN architecture is used so that the generated summaries are influenced by the words surrounding each position in the source text. In this research, this method is applied to Bahasa Indonesia to improve on text summarizers that are commonly built with extractive methods and suffer from low inter-sentence cohesion. An evaluation on a dataset of Indonesian journal documents shows that the proposed model summarizes the overall content of the test documents into summaries with high similarity to the provided abstracts, indicating that the model understands the source text well enough to generate a summary.
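The bidirectional encoding mentioned above can be illustrated with a small sketch: a forward GRU reads the sequence left to right, a backward GRU reads it right to left, and their hidden states are concatenated so each position carries context from both sides. This is a minimal NumPy illustration of the idea, not the authors' implementation; the weight names, gate conventions, and dimensions are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell with update gate z, reset gate r, and candidate state.
    Weight shapes and initialization are illustrative, not from the paper."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hidden_size)
        shp_x = (hidden_size, input_size)
        shp_h = (hidden_size, hidden_size)
        self.Wz, self.Uz = rng.uniform(-scale, scale, shp_x), rng.uniform(-scale, scale, shp_h)
        self.Wr, self.Ur = rng.uniform(-scale, scale, shp_x), rng.uniform(-scale, scale, shp_h)
        self.Wh, self.Uh = rng.uniform(-scale, scale, shp_x), rng.uniform(-scale, scale, shp_h)

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)          # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)          # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))  # candidate state
        return (1 - z) * h + z * h_tilde                # interpolate old/new state

def bigru_encode(xs, fwd, bwd):
    """Run forward and backward GRUs over the sequence and concatenate the
    per-step hidden states, so each position sees words on both sides."""
    hidden = fwd.Uz.shape[0]
    h = np.zeros(hidden)
    fwd_states = []
    for x in xs:                      # left-to-right pass
        h = fwd.step(x, h)
        fwd_states.append(h)
    h = np.zeros(hidden)
    bwd_states = []
    for x in reversed(xs):            # right-to-left pass
        h = bwd.step(x, h)
        bwd_states.append(h)
    bwd_states.reverse()              # realign with the original word order
    return [np.concatenate([f, b]) for f, b in zip(fwd_states, bwd_states)]

# Demo: encode a toy sequence of 5 "word embeddings" of dimension 3.
H, D = 4, 3
rng = np.random.default_rng(1)
sequence = [rng.standard_normal(D) for _ in range(5)]
states = bigru_encode(sequence, GRUCell(D, H, seed=0), GRUCell(D, H, seed=1))
```

In a full summarizer these concatenated states would feed an attention-equipped decoder that generates the summary word by word; here only the encoding step is sketched.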
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator. | 45 |
| Popularity | Reflects the "current" impact/attention (the "hype") of the article in the research community at large, based on the underlying citation network. | Top 10% |
| Influence | Reflects the overall/total impact of the article in the research community at large, based on the underlying citation network (diachronically). | Top 10% |
| Impulse | Reflects the initial momentum of the article directly after its publication, based on the underlying citation network. | Top 10% |
