Publication: Towards Russian Text Generation Problem Using OpenAI’s GPT-2
Date
2021
Authors
Shatalov O., Ryabova N.
Publisher
CEUR Workshop Proceedings
Abstract
This work addresses the Natural Language Generation (NLG) problem. Modern approaches in this area based on deep neural networks are reviewed, in particular the most popular free software solutions for NLG built on the Transformer architecture with the pre-trained models GPT-2 and BERT. The main problem is that most existing solutions target English, while few models are able to generate text in Russian; moreover, the text such models produce typically concerns a general topic rather than a specific subject area. The object of the study is the generation of contextually coherent, narrow-profile text in Russian. Within the framework of the study, a model was trained to generate coherent articles in Russian for a given subject area, and a software application for interacting with it was developed.
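The abstract does not name the authors' tooling or checkpoint. As a hedged illustration only, the sketch below shows how subject-area Russian generation with a pre-trained GPT-2 model might look using the Hugging Face transformers library and the publicly available sberbank-ai/rugpt3small_based_on_gpt2 checkpoint; both the library and the model name are assumptions standing in for the paper's actual setup.

```python
# Hypothetical sketch: the paper does not specify its tooling. This assumes
# the Hugging Face `transformers` library and a public Russian GPT-2
# checkpoint ("sberbank-ai/rugpt3small_based_on_gpt2") as stand-ins.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

MODEL_NAME = "sberbank-ai/rugpt3small_based_on_gpt2"  # assumed checkpoint

tokenizer = GPT2Tokenizer.from_pretrained(MODEL_NAME)
model = GPT2LMHeadModel.from_pretrained(MODEL_NAME)
model.eval()

# A prompt in the target subject area; the model continues it in Russian.
prompt = "Искусственный интеллект в медицине"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

with torch.no_grad():
    output = model.generate(
        input_ids,
        max_length=120,          # cap on total sequence length
        do_sample=True,          # sampling rather than greedy decoding
        top_k=50,                # keep only the 50 most likely next tokens
        top_p=0.95,              # nucleus sampling threshold
        no_repeat_ngram_size=3,  # suppress verbatim 3-gram repetition
        pad_token_id=tokenizer.eos_token_id,
    )

print(tokenizer.decode(output[0], skip_special_tokens=True))
```

For a narrow subject area as described in the abstract, such a base checkpoint would typically first be fine-tuned on a domain corpus before generation; the snippet above only demonstrates the generation step.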
Keywords
Natural Language Generation, Natural Language Processing, Transformers Architecture, Deep Learning, Transfer Learning, GPT-2
Bibliographic citation
Shatalov O., Ryabova N. Towards Russian Text Generation Problem Using OpenAI’s GPT-2. Proc. 5th Int. Conf. on Computational Linguistics and Intelligent Systems (COLINS), Volume I: Main Conference. CEUR Workshop Proceedings, 2021, Vol. 2870, pp. 141–153.