Token

Easy — everyone uses this · Vibecoding

ELI5 — The Vibe Check

In AI-land, a token is a chunk of text — roughly 3/4 of a word. Every time you talk to an AI, your message gets chopped into tokens, processed, and you get tokens back. More tokens = more expensive. It's the currency of the AI world. 'Hello world' is 2 tokens, but a single emoji can cost several, and non-English text often takes more tokens per word.

Real Talk

A token is the fundamental unit of text processing in LLMs. Tokenizers split text into subword units using algorithms like BPE (Byte Pair Encoding). Token count determines both cost (API pricing is per-token) and context window usage. English text averages ~1.3 tokens per word; code typically has a higher token-per-character ratio due to syntax.
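The BPE idea is simple: start from individual characters and repeatedly merge the most frequent adjacent pair into a new symbol. Here is a toy sketch of that merge loop — not a production tokenizer, and the function names are illustrative:

```python
from collections import Counter

def most_frequent_pair(tokens):
    """Count adjacent symbol pairs and return the most common one."""
    pairs = Counter(zip(tokens, tokens[1:]))
    return max(pairs, key=pairs.get) if pairs else None

def bpe_merge(tokens, num_merges):
    """Greedily merge the most frequent adjacent pair, num_merges times."""
    for _ in range(num_merges):
        pair = most_frequent_pair(tokens)
        if pair is None:
            break
        merged, i = [], 0
        while i < len(tokens):
            if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == pair:
                merged.append(tokens[i] + tokens[i + 1])  # fuse the pair
                i += 2
            else:
                merged.append(tokens[i])
                i += 1
        tokens = merged
    return tokens

# Start from individual characters, as BPE training does.
text = "low lower lowest"
print(bpe_merge(list(text), 4))
```

After a few merges, frequent fragments like "low" become single tokens — which is why common English words cost ~1 token while rare words, emoji, and non-English text split into many.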

When You'll Hear This

"That prompt is 4,000 tokens — it's going to cost us." / "We optimized the system prompt from 2,000 to 800 tokens."
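The back-of-the-envelope math behind those complaints is just tokens × price. A minimal sketch, using hypothetical per-token prices (check your provider's actual pricing):

```python
# Hypothetical prices -- real per-token rates vary by provider and model.
PRICE_PER_INPUT_TOKEN = 3.00 / 1_000_000    # e.g. $3 per million input tokens
PRICE_PER_OUTPUT_TOKEN = 15.00 / 1_000_000  # e.g. $15 per million output tokens

def request_cost(input_tokens, output_tokens):
    """Estimate the dollar cost of a single API call."""
    return (input_tokens * PRICE_PER_INPUT_TOKEN
            + output_tokens * PRICE_PER_OUTPUT_TOKEN)

# A 4,000-token prompt with a 500-token reply, 10,000 times a day:
per_call = request_cost(4_000, 500)
print(f"${per_call:.4f} per call, ${per_call * 10_000:.2f} per day")
```

Since the system prompt is resent with every call, trimming it from 2,000 to 800 tokens saves 1,200 input tokens per request — small per call, real money at volume.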

Made with passive-aggressive love by manoga.digital. Powered by Claude.