Browsing by Subject "Transformers"
Now showing items 1-3 of 3
The Analysis of the Sepedi-English Code-switched Radio News Corpus
(UP Journals, 2022) Code-switching is a phenomenon that occurs mostly in multilingual countries, where multilingual speakers often switch between languages in their conversations. The unavailability of large-scale code-switched corpora ...
The Development of a Sepedi Text Generation Model Using Transformers
(Southern Africa Telecommunication Networks and Applications Conference (SATNAC), 2022) Text generation is one of the important sub-tasks of natural language generation (NLG) and aims to produce human-readable text given some input text. Deep learning approaches based on neural networks have been proposed ...
Pre-training a Transformer-Based Generative Model Using a Small Sepedi Dataset
(2025-01-25) Due to the scarcity of data in low-resourced languages, the development of language models for these languages has been very slow. Currently, pre-trained language models have gained popularity in natural language processing, ...