Existing natural language generation models often suffer from context loss and incoherent responses in multi-turn dialogue. In this paper, a multi-turn dialogue system based on the Transformer architecture is constructed, with an intention recognition algorithm providing technical support for the system's construction. An attention adapter is introduced into the system's natural language generation module, exploiting contextual features to improve the generation model's performance. Semantic slot extraction experiments are carried out on the ATIS dataset: with the attention mechanism added, the F1 scores of the BERT multi-task natural language generation model on the semantic slot extraction and intention recognition tasks improve by 0.64% and 0.15%, respectively. The multi-turn dialogue system designed in this paper achieves a perplexity of 18.33 and higher BLEU scores at n-gram orders 1-4 than the comparison models. In human evaluation of syntactic and semantic coherence, relevance, and informativeness, the system also performs better. These results show that a natural language generation model incorporating the attention mechanism can effectively improve the practical performance of a multi-turn dialogue system.
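Since the abstract does not give the adapter's exact formulation, the following is a minimal PyTorch sketch of the general idea it describes: a lightweight bottleneck adapter that cross-attends over encoded dialogue-context features before feeding back into the backbone. All module names, dimensions, and the residual placement are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class AttentionAdapter(nn.Module):
    """Bottleneck adapter with cross-attention over dialogue-context
    features. A sketch of the idea in the abstract; dimensions and
    placement are assumptions, not the paper's actual design."""

    def __init__(self, hidden_dim: int = 768, bottleneck_dim: int = 64,
                 num_heads: int = 4):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)          # down-project layer states
        self.context_proj = nn.Linear(hidden_dim, bottleneck_dim)  # project context features
        self.cross_attn = nn.MultiheadAttention(
            bottleneck_dim, num_heads, batch_first=True)           # attend over the context
        self.up = nn.Linear(bottleneck_dim, hidden_dim)            # up-project back
        self.norm = nn.LayerNorm(hidden_dim)

    def forward(self, hidden: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # hidden:  (batch, seq_len, hidden_dim)  current layer states
        # context: (batch, ctx_len, hidden_dim)  encoded dialogue history
        q = self.down(hidden)
        kv = self.context_proj(context)
        attended, _ = self.cross_attn(q, kv, kv)
        # Residual connection preserves the backbone's behavior when the
        # adapter contributes little, which eases fine-tuning.
        return self.norm(hidden + self.up(attended))


if __name__ == "__main__":
    adapter = AttentionAdapter()
    states = torch.randn(2, 16, 768)   # toy decoder states
    history = torch.randn(2, 32, 768)  # toy dialogue-context encodings
    print(adapter(states, history).shape)  # torch.Size([2, 16, 768])
```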
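The joint slot-extraction and intention-recognition setup evaluated on ATIS can likewise be sketched generically: a shared BERT encoder with a per-token head for slot tags and an utterance-level head for intent. The label counts and head design below are illustrative assumptions, not the paper's reported configuration.

```python
import torch.nn as nn
from transformers import BertModel


class BertSlotIntentModel(nn.Module):
    """Joint slot-filling / intent-detection heads on a shared BERT
    encoder. A generic multi-task sketch; label counts are assumed."""

    def __init__(self, num_slots: int = 120, num_intents: int = 21,
                 model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.slot_head = nn.Linear(hidden, num_slots)      # per-token slot tags
        self.intent_head = nn.Linear(hidden, num_intents)  # utterance-level intent

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        slot_logits = self.slot_head(out.last_hidden_state)  # (B, T, num_slots)
        intent_logits = self.intent_head(out.pooler_output)  # (B, num_intents)
        return slot_logits, intent_logits
```

Training such a model typically sums a token-level cross-entropy loss over the slot logits with a sequence-level cross-entropy loss over the intent logits, so both tasks update the shared encoder.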