Browsing by Subject "Text generation"
Now showing items 1-3 of 3
- The Development of a Sepedi Text Generation Model Using Transformers
  (Southern Africa Telecommunication Networks and Applications Conference (SATNAC), 2022) Text generation is one of the important sub-tasks of natural language generation (NLG) and aims to produce human-readable text from some input text. Deep learning approaches based on neural networks have been proposed ...
- Pre-training a Transformer-Based Generative Model Using a Small Sepedi Dataset
  (2025-01-25) Due to the scarcity of data in low-resourced languages, the development of language models for these languages has been very slow. Currently, pre-trained language models have gained popularity in natural language processing, ...
- Transformer-based Text Generation for Code-Switched Sepedi-English News
  (2023) Code-switched data is rarely available in written form, which makes it difficult to build the large datasets required to train code-switched language models. Currently available Sepedi-English code-switched corpora ...