It’s an AI system trained on massive amounts of text to predict the next token (roughly, a word or piece of a word) in a sequence.
By learning patterns, grammar, facts, and reasoning from text, it can generate human-like language, answer questions, write code, summarize, translate, and more.
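To make "predict the next token" concrete, here is a minimal sketch using the Hugging Face `transformers` library with the small GPT-2 checkpoint; the library, model, and prompt are illustrative choices, not how any particular production LLM is actually served.

```python
# Minimal sketch of next-token prediction, assuming `torch` and
# `transformers` are installed; GPT-2 is used purely as a small example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, sequence, vocab_size)

# The logits at the last position score every vocabulary token as a
# candidate for the next token in the sequence.
next_token_logits = logits[0, -1]
probs = torch.softmax(next_token_logits, dim=-1)
top_probs, top_ids = torch.topk(probs, k=5)

for prob, token_id in zip(top_probs, top_ids):
    print(f"{tokenizer.decode(token_id.item()):>10}  {prob.item():.3f}")
```

Text generation is just this step repeated: pick (or sample) a likely next token, append it to the input, and predict again.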
Most modern LLMs (like GPT-4, GPT-5, Claude, LLaMA) are built on the transformer architecture, introduced by Google researchers in the 2017 paper "Attention Is All You Need."
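The core operation of the transformer is scaled dot-product self-attention, which lets every token weigh every other token when building its representation. Below is a toy single-head sketch in PyTorch with made-up dimensions and random weights; real models use many attention heads, feed-forward blocks, residual connections, and normalization, stacked over dozens of layers.

```python
# Toy sketch of single-head scaled dot-product self-attention.
# All sizes and weights are arbitrary illustrations, not a real model's.
import math
import torch

seq_len, d_model = 4, 8            # 4 tokens, 8-dimensional embeddings (made up)
x = torch.randn(seq_len, d_model)  # token embeddings for one sequence

# Learned projections map each embedding to a query, key, and value vector.
w_q = torch.randn(d_model, d_model)
w_k = torch.randn(d_model, d_model)
w_v = torch.randn(d_model, d_model)

q, k, v = x @ w_q, x @ w_k, x @ w_v

# Each token's query is compared with every token's key; the scaled scores
# become attention weights that mix the value vectors into the output.
scores = q @ k.T / math.sqrt(d_model)
weights = torch.softmax(scores, dim=-1)  # (seq_len, seq_len)
output = weights @ v                     # (seq_len, d_model)

print(weights)
print(output.shape)
```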