Please use this identifier to cite or link to this item: https://rfos.fon.bg.ac.rs/handle/123456789/3161

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.creator | Damjanović, Boris | en_US |
| dc.creator | Korać, Dragan | en_US |
| dc.creator | Simić, Dejan | en_US |
| dc.creator | Stamenković, Negovan | en_US |
| dc.date.accessioned | 2025-12-22T07:05:39Z | - |
| dc.date.available | 2025-12-22T07:05:39Z | - |
| dc.date.issued | 2025-12 | - |
| dc.identifier.uri | https://rfos.fon.bg.ac.rs/handle/123456789/3161 | - |
| dc.description.abstract | The early evolution of large language models (LLMs), including the shift from statistical approaches to the Transformer architecture, illustrates their historical impact on natural language processing; however, the latest research in neural networks has enabled the faster and more powerful rise of language models grounded in solid theoretical foundations. These advances, driven by progress in computing systems (e.g., ultra-powerful processing and memory capabilities), enable the development of numerous new models based on emerging technologies such as artificial intelligence (AI). Thus, we provide an evolutionary overview of LLMs, covering the shift from the statistical to the deep learning approach and highlighting the key stages of their development, with a particular focus on concepts such as self-attention, the Transformer architecture, BERT, GPT, DeepSeek, and Claude. Finally, our conclusions present a reference point for future research associated with the emergence of new AI-supported models that are irreversibly transforming the way an increasing number of human activities are performed. | en_US |
| dc.language.iso | en | en_US |
| dc.publisher | PANEVROPSKI UNIVERZITET | en_US |
| dc.rights | openAccess | en_US |
| dc.source | The Journal of Informational Technology and Applications (JITA) | en_US |
| dc.subject | artificial intelligence | en_US |
| dc.subject | large language models | en_US |
| dc.subject | transformer architecture | en_US |
| dc.subject | self-attention | en_US |
| dc.title | An Evolutionary Overview of Large Language Models: From Statistical Methods to the Transformer Era | en_US |
| dc.type | article | en_US |
| dc.citation.epage | 153 | en_US |
| dc.citation.issue | 2 | en_US |
| dc.citation.spage | 145 | en_US |
| dc.citation.volume | 15 | en_US |
| dc.identifier.doi | 10.7251/JIT2502145D | - |
| dc.type.version | publishedVersion | en_US |
| item.cerifentitytype | Publications | - |
| item.fulltext | With Fulltext | - |
| item.languageiso639-1 | en | - |
| item.openairecristype | http://purl.org/coar/resource_type/c_18cf | - |
| item.grantfulltext | open | - |
| item.openairetype | article | - |
Appears in Collections: Radovi istraživača / Researchers’ publications
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| Pages-from-JITA_Vol-15_Issue-2-WEB-7.pdf | | 321.44 kB | Adobe PDF |