Abstractive Summarization of News Articles Using BART
(Preprint)
This paper explores abstractive text summarization using the transformer-based BART-Large model to generate concise, newly constructed summaries of news articles. The model is fine-tuned on two benchmark datasets with distinct summarization requirements, allowing evaluation of its flexibility across different abstraction levels and output lengths. The results demonstrate that fine-tuned large language models can produce fluent and contextually meaningful summaries, highlighting their potential for real-world applications such as news aggregation and personalized information services.
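For readers interested in the basic setup, the sketch below shows abstractive summarization with a BART checkpoint via the Hugging Face `transformers` library. The `facebook/bart-large-cnn` checkpoint and the decoding parameters are illustrative assumptions, not the paper's exact fine-tuned configuration or hyperparameters.

```python
# Minimal sketch: generate an abstractive summary with a BART model.
# The checkpoint and generation settings are assumptions for illustration;
# the paper's fine-tuned models and decoding setup may differ.
from transformers import BartTokenizer, BartForConditionalGeneration

model_name = "facebook/bart-large-cnn"  # assumed checkpoint
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

article = (
    "The city council approved a new transit plan on Tuesday, allocating "
    "funds for expanded bus routes and a light-rail feasibility study."
)

inputs = tokenizer(article, max_length=1024, truncation=True, return_tensors="pt")
summary_ids = model.generate(
    inputs["input_ids"],
    num_beams=4,        # beam search, a common default for summarization
    max_length=60,      # cap on summary length; varies with the target dataset
    min_length=10,
    length_penalty=2.0,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

In practice, the maximum summary length and length penalty are tuned per dataset, since benchmarks differ in how long and how abstractive their reference summaries are.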
Recommended citation: K C, M. B., Kaundinya, A. S., Adhikari, P., & Adhikari, S. (2025). "Abstractive Summarization of News Articles Using BART." (preprint).
