GPT
Generative Pre-trained Transformer
ELI5 — The Vibe Check
GPT is the brand of AI model from OpenAI that kicked off the LLM revolution. GPT-3 made everyone's jaw drop, GPT-4 made jaws stay dropped. The 'Generative' part means it creates new text, 'Pre-trained' means it already learned from the internet, and 'Transformer' is the special architecture that made it all possible.
Real Talk
GPT refers to OpenAI's series of autoregressive language models built on the Transformer architecture. They are pre-trained on large text corpora via next-token prediction and then refined with RLHF (Reinforcement Learning from Human Feedback). GPT-4 and its successors power ChatGPT and the OpenAI API.
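"Autoregressive next-token prediction" just means: sample the most likely next token, append it, repeat. Here's a minimal sketch of that loop using a made-up bigram probability table — a real GPT learns these probabilities with a Transformer over billions of documents, not a hand-written lookup:

```python
import random

# Hypothetical toy probabilities: for each token, the possible next
# tokens and their likelihoods. In a real GPT these come from the model.
BIGRAMS = {
    "the": [("cat", 0.5), ("dog", 0.3), ("<end>", 0.2)],
    "cat": [("sat", 0.6), ("ran", 0.4)],
    "dog": [("ran", 0.7), ("sat", 0.3)],
    "sat": [("the", 0.2), ("<end>", 0.8)],
    "ran": [("the", 0.3), ("<end>", 0.7)],
}

def generate(prompt: str, max_tokens: int = 10, seed: int = 0) -> str:
    """Autoregressive loop: sample next token given the last one, repeat."""
    random.seed(seed)
    tokens = prompt.split()
    for _ in range(max_tokens):
        choices = BIGRAMS.get(tokens[-1])
        if choices is None:
            break
        words, weights = zip(*choices)
        nxt = random.choices(words, weights=weights)[0]
        if nxt == "<end>":
            break
        tokens.append(nxt)
    return " ".join(tokens)

print(generate("the"))
```

Swap the toy table for a neural network that scores every token in the vocabulary, and you have the core generation loop behind ChatGPT.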
When You'll Hear This
"We're using GPT-4 for the summarization feature." / "This reads like it was written by GPT."
Related Terms
ChatGPT
ChatGPT is the app that made AI mainstream — it's the iPhone moment for artificial intelligence.
Fine-tuning
Fine-tuning is like taking a smart graduate student who knows everything and then sending them to a specialist bootcamp.
LLM (Large Language Model)
An LLM is a humongous AI that read basically the entire internet and learned to predict what words come next, really really well.
OpenAI
OpenAI is the company behind ChatGPT, GPT-4, DALL-E, and Codex.
Token
In AI-land, a token is a chunk of text — roughly 3/4 of a word.
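That "3/4 of a word" rule of thumb works out to roughly 4 characters of English text per token, which is handy for back-of-the-envelope cost or context-window estimates. A minimal sketch of that heuristic (real tokenizers, like OpenAI's tiktoken, split on learned subwords instead):

```python
def rough_token_count(text: str) -> int:
    # Rule of thumb: ~4 characters of English text per token.
    # This is an approximation, not a real tokenizer.
    return max(1, round(len(text) / 4))

print(rough_token_count("GPT turns text into tokens before processing it."))
# → 12
```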
Transformer
The Transformer is THE architecture behind all modern AI. ChatGPT, Claude, Whisper — all Transformers under the hood. The key innovation? Self-attention: every token gets compared against every other token in the sequence at once, in parallel, instead of being processed one at a time.
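Self-attention boils down to a small formula: softmax(QKᵀ/√d)·V. Each query vector is scored against every key, the scores become weights, and the output is a weighted mix of the values. A minimal pure-Python sketch with toy 2-d vectors (real models use large matrices and GPU math libraries):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention on plain lists of vectors.

    Every query attends to every key at once -- the parallel,
    whole-sequence comparison that lets Transformers drop recurrence.
    """
    d = len(K[0])
    out = []
    for q in Q:
        # Score this query against every key, scaled by sqrt(dimension).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        weights = softmax(scores)
        # Output = attention-weighted mix of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Two toy 2-d token vectors attending over themselves.
Q = K = V = [[1.0, 0.0], [0.0, 1.0]]
print(attention(Q, K, V))
```

Note that each token's scores lean toward the key most similar to it — that "look everywhere, weigh by relevance" step is what the section above calls the key innovation.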