A token is the basic unit of text an LLM processes. In English, a token averages roughly 4 characters, or about 0.75 words. Common words like "the" are a single token, while longer words often split: depending on the tokenizer, "tokenization" typically becomes two tokens ("token" + "ization"), as does "ChatGPT" ("Chat" + "GPT"). Code usually consumes more tokens per character than prose, because punctuation and whitespace frequently become separate tokens.
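These ratios make quick back-of-envelope sizing possible without a real tokenizer. A minimal sketch, using the ~4 characters/token and ~0.75 words/token heuristics from above (for exact counts you would use the model's actual tokenizer, e.g. OpenAI's tiktoken library):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate via the ~4 characters/token heuristic.

    This is only an approximation for English prose; real tokenizers
    give exact, model-specific counts.
    """
    return max(1, round(len(text) / 4))

def estimate_words(n_tokens: int) -> float:
    """Approximate English word count for a given token count (~0.75 words/token)."""
    return n_tokens * 0.75
```

For example, `estimate_tokens("tokenization")` returns 3 (12 characters / 4), close to the two tokens a real tokenizer typically produces, and `estimate_words(100_000)` returns 75000.0.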
LLM pricing is almost always per-token, with input and output billed at different rates: GPT-4o, for example, launched at roughly $5 per 1M input tokens and $15 per 1M output tokens. Rates change frequently, so always check the provider's current price list. Context windows are also measured in tokens: a 200K-token window holds about 150K English words, or roughly 500 pages. When you see a tool advertise "100K tokens of free use per month", multiply by 0.75 to get the word count: about 75K words.
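The per-token billing above reduces to simple arithmetic. A small sketch, using the example GPT-4o launch rates quoted in the text as defaults (substitute your provider's current rates):

```python
def request_cost(input_tokens: int,
                 output_tokens: int,
                 input_price_per_m: float = 5.00,
                 output_price_per_m: float = 15.00) -> float:
    """Dollar cost of one request, given per-million-token prices.

    Defaults mirror the example rates in the text; real pricing
    varies by model and changes over time.
    """
    return (input_tokens / 1_000_000) * input_price_per_m \
         + (output_tokens / 1_000_000) * output_price_per_m
```

For instance, a request with 10K input tokens and 2K output tokens at these rates costs `request_cost(10_000, 2_000)` = $0.08, and the asymmetry of the rates explains why verbose model outputs dominate the bill.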