Please use this identifier to cite or link to this item: https://rfos.fon.bg.ac.rs/handle/123456789/3161
Full metadata record
DC Field    Value    Language
dc.creator    Damjanović, Boris    en_US
dc.creator    Korać, Dragan    en_US
dc.creator    Simić, Dejan    en_US
dc.creator    Stamenković, Negovan    en_US
dc.date.accessioned    2025-12-22T07:05:39Z    -
dc.date.available    2025-12-22T07:05:39Z    -
dc.date.issued    2025-12    -
dc.identifier.uri    https://rfos.fon.bg.ac.rs/handle/123456789/3161    -
dc.description.abstract    While the early evolution of large language models (LLMs), including the shift from statistical approaches to the Transformer architecture, illustrates their historical impact on natural language processing, the latest research in neural networks has enabled a faster and more powerful rise of language models grounded in solid theoretical foundations. These advances, driven by progress in computing systems (e.g., powerful processing and memory capabilities), enable the development of numerous new models based on emerging technologies such as artificial intelligence (AI). We therefore provide an evolutionary overview of LLMs across the shift from statistical to deep learning approaches, highlighting their key stages of development, with a particular focus on concepts such as self-attention, the Transformer architecture, BERT, GPT, DeepSeek, and Claude. Finally, our conclusions offer a reference point for future research on the emergence of new AI-supported models that are irreversibly transforming the way an increasing number of human activities are performed.    en_US
dc.language.iso    en    en_US
dc.publisher    PANEVROPSKI UNIVERZITET    en_US
dc.rights    openAccess    en_US
dc.source    The Journal of Information Technology and Applications (JITA)    en_US
dc.subject    artificial intelligence    en_US
dc.subject    large language models    en_US
dc.subject    transformer architecture    en_US
dc.subject    self-attention    en_US
dc.title    An Evolutionary Overview of Large Language Models: From Statistical Methods to the Transformer Era    en_US
dc.type    article    en_US
dc.citation.epage    153    en_US
dc.citation.issue    2    en_US
dc.citation.spage    145    en_US
dc.citation.volume    15    en_US
dc.identifier.doi    10.7251/JIT2502145D    -
dc.type.version    publishedVersion    en_US
item.cerifentitytype    Publications    -
item.fulltext    With Fulltext    -
item.languageiso639-1    en    -
item.openairecristype    http://purl.org/coar/resource_type/c_18cf    -
item.grantfulltext    open    -
item.openairetype    article    -
Appears in Collections: Radovi istraživača / Researchers’ publications
Files in This Item:
File    Description    Size    Format
Pages-from-JITA_Vol-15_Issue-2-WEB-7.pdf        321.44 kB    Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.