TSU

GPT-2 (Generative Pre-trained Transformer 2): a Georgian text generation model based on the transformer architecture

Author: Aleksandre Sabanadze
Keywords: Artificial Intelligence, Natural Language Processing, Text Generation, GPT-2
Annotation:

An artificial intelligence model for generating Georgian texts, based on GPT-2.
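
As a rough illustration of how such a fine-tuned GPT-2 model might be used for Georgian text generation, the sketch below uses the Hugging Face transformers library; the checkpoint path and sampling settings are placeholders, not the author's published model or configuration.

# Minimal sketch: generating Georgian text with a GPT-2 checkpoint via Hugging Face transformers.
# The checkpoint path is hypothetical; substitute the actual fine-tuned Georgian model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "path/to/georgian-gpt2-checkpoint"  # hypothetical placeholder
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Encode a Georgian prompt and sample a continuation.
prompt = "საქართველო"  # "Georgia"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
output_ids = model.generate(
    input_ids,
    max_length=60,     # total length (prompt + continuation) in tokens
    do_sample=True,    # sample rather than decode greedily, for more varied text
    top_p=0.95,        # nucleus sampling
    temperature=0.8,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))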


