Abstract: Abstractive text summarization aims to generate concise, readable summaries by understanding the original input text. However, summaries produced by existing models still suffer from semantic redundancy, factual errors, and exposure bias, and addressing these problems is crucial for improving model performance and summary quality. To this end, an abstractive text summarization model that integrates knowledge enhancement with the SimCLS framework is proposed. First, a knowledge-enhanced encoder is designed to extract structured knowledge from the source text, preserving the structural information of the global context; it is combined with a text encoder to fully capture the semantic information of the entire text. Then, a copy mechanism is employed in the decoder to reproduce information from the source text more accurately. Finally, the candidate summaries generated by the model are scored with the SimCLS two-stage contrastive learning framework to guide the generation of high-quality summaries. Experimental results show that, compared with the strong SeqCo baseline, the proposed model improves ROUGE-1/2/L and BERTScore by 1.84, 0.65, 2.04, and 0.21 percentage points on the CNN/Daily Mail dataset, and by 1.78, 2.16, 2.36, and 0.13 percentage points on the XSum dataset, confirming the model's effectiveness.
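As a rough illustration of the SimCLS-style second stage mentioned above, the sketch below scores candidate summaries by cosine similarity to the source document and reranks them best-first. The `embed` function here is a deliberately crude stand-in for the learned encoder SimCLS actually uses (a simple hashed bag-of-words, not part of the paper), and all names and data are illustrative only.

```python
from math import sqrt

def embed(text, dim=16):
    # Stand-in for a learned sentence encoder: hash each token into a
    # fixed-size bag-of-words vector, then L2-normalize it.
    v = [0.0] * dim
    for tok in text.lower().split():
        v[sum(ord(c) for c in tok) % dim] += 1.0
    norm = sqrt(sum(x * x for x in v))
    return [x / norm for x in v] if norm > 0 else v

def rerank(source, candidates):
    # Second-stage scoring: rank candidates by cosine similarity
    # between each candidate embedding and the source embedding.
    s = embed(source)
    scored = [(sum(a * b for a, b in zip(s, embed(c))), c)
              for c in candidates]
    return sorted(scored, key=lambda sc: -sc[0])

source = "the cat sat on the mat near the door"
candidates = ["a cat sat on the mat", "stocks rallied on tuesday"]
best_score, best = rerank(source, candidates)[0]
print(best)  # -> "a cat sat on the mat"
```

In the actual framework, the candidates come from a generation model (beam search) and the scorer is a contrastively trained encoder, but the ranking logic follows this shape: embed, score against the source, and pick the top candidate.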