Authors: Shatalov, O.; Ryabova, N. V.
Date accessioned: 2022-01-11
Date available: 2022-01-11
Date issued: 2021
Citation: Shatalov O., Ryabova N. Towards Russian Text Generation Problem Using OpenAI's GPT-2. Proc. 5th Int. Conf. on Computational Linguistics and Intelligent Systems (COLINS), Volume I: Main Conference. CEUR Workshop Proceedings, 2021, vol. 2870, pp. 141-153.
URN: urn:nbn:de:0074-2870-3
Archive: http://sunsite.informatik.rwth-aachen.de/ftp/pub/publications/CEUR-WS/Vol-2870.zip
URI: https://openarchive.nure.ua/handle/document/19040

Abstract: This work addresses the Natural Language Generation (NLG) problem. Modern approaches based on deep neural networks are reviewed, in particular the most prominent architectures for this task: popular free software solutions built on the Transformer architecture with the pre-trained models GPT-2 and BERT. Most existing solutions, however, target English; few models can generate text in Russian, and the text they produce often concerns a general topic rather than a specific subject area. The object of the study is the generation of contextually coherent, narrow-profile text in Russian. Within the framework of the study, a model was trained to generate coherent articles in a given subject area in Russian, and a software application for interacting with it was developed.

Language: en
Keywords: Natural Language Generation; Natural Language Processing; Transformers Architecture; Deep Learning; Transfer Learning; GPT-2
Title: Towards Russian Text Generation Problem Using OpenAI's GPT-2
Published in: Proc. 5th Int. Conf. on Computational Linguistics and Intelligent Systems (COLINS), Volume I: Main Conference.
Type: Conference proceedings
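
The abstract describes text generation with a pre-trained GPT-2 model. As a minimal, self-contained illustration of the autoregressive decoding loop such models use, the sketch below samples one token at a time with top-k sampling; `toy_logits` is a hypothetical stand-in for the Transformer's next-token scores (the paper's actual model and hyperparameters are not reproduced here).

```python
import math
import random

def toy_logits(context, vocab_size=8):
    """Hypothetical stand-in for a language model's next-token scores.

    A real GPT-2 would compute these with a Transformer over the context.
    """
    last = context[-1] if context else 0
    return [math.sin(last + t) for t in range(vocab_size)]

def sample_top_k(logits, k, rng):
    """Sample a token id from the k highest-scoring entries, softmax-weighted."""
    top = sorted(range(len(logits)), key=lambda t: logits[t], reverse=True)[:k]
    weights = [math.exp(logits[t]) for t in top]
    r = rng.random() * sum(weights)
    acc = 0.0
    for t, w in zip(top, weights):
        acc += w
        if r <= acc:
            return t
    return top[-1]

def generate(prompt, steps, k=3, seed=0):
    """Autoregressive decoding: append one sampled token per step."""
    rng = random.Random(seed)
    out = list(prompt)
    for _ in range(steps):
        out.append(sample_top_k(toy_logits(out), k, rng))
    return out

tokens = generate([1, 2], steps=5)
print(tokens)
```

With a fixed seed the loop is reproducible; in practice, libraries expose the same knobs (top-k, temperature) on a real pre-trained model.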