Please use this identifier to cite or link to this item: https://rfos.fon.bg.ac.rs/handle/123456789/3161
Title: An Evolutionary Overview of Large Language Models: From Statistical Methods to the Transformer Era
Authors: Damjanović, Boris
Korać, Dragan
Simić, Dejan 
Stamenković, Negovan
Keywords: artificial intelligence;large language models;transformer architecture;self-attention
Issue Date: Dec-2025
Publisher: PANEVROPSKI UNIVERZITET
Abstract: The early evolution of large language models (LLMs), including the shift from statistical approaches to the Transformer architecture, illustrates their historical impact on natural language processing; at the same time, the latest research in neural networks has enabled the faster and more powerful rise of language models grounded in solid theoretical foundations. These advances, driven by progress in computing systems (e.g., ultra-powerful processing and memory capabilities), enable the development of numerous new models based on emerging technologies such as artificial intelligence (AI). We therefore provide an evolutionary overview of LLMs covering the shift from the statistical to the deep learning approach, highlighting the key stages of their development, with a particular focus on concepts such as self-attention, the Transformer architecture, BERT, GPT, DeepSeek, and Claude. Finally, our conclusions offer a reference point for future research associated with the emergence of new AI-supported models that are irreversibly transforming the way an increasing number of human activities are performed.
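Illustrative note (not part of the published record): since the abstract centers on self-attention as the core mechanism of the Transformer architecture, the short NumPy sketch below shows single-head scaled dot-product self-attention. The names X, Wq, Wk, Wv and the toy dimensions are assumptions chosen for this example, not taken from the paper.

import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Project token embeddings into queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Scaled dot-product similarity between every pair of positions.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Row-wise softmax turns similarities into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is an attention-weighted mix of the values.
    return weights @ V

# Toy usage: a sequence of 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 8)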
URI: https://rfos.fon.bg.ac.rs/handle/123456789/3161
Appears in Collections: Radovi istraživača / Researchers’ publications

Files in This Item:
File: Pages-from-JITA_Vol-15_Issue-2-WEB-7.pdf
Size: 321.44 kB
Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.