What is a Token?
TL;DR
The smallest unit of text that an AI model processes. Used as the basis for pricing and context window limits.
Token: Definition & Explanation
A token is the smallest unit of text that an LLM processes. In English, one token corresponds to roughly four characters, or about three-quarters of a word. AI services are typically priced by token count, with input tokens (the prompt) and output tokens (the generated text) billed separately, usually at different rates. The context window, the maximum amount of text a model can process at once, is also measured in tokens: OpenAI's GPT-4o supports 128K tokens, Anthropic's Claude supports 200K, and Google's Gemini supports up to 1 million. A larger context window lets the model handle longer conversations and process more documents in a single request.
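The "roughly four characters per token" rule of thumb can be turned into a quick back-of-the-envelope estimator. The sketch below is only an approximation: real tokenizers (e.g. BPE-based ones) vary by model and language, and the per-million-token prices are caller-supplied placeholders, not actual provider rates.

```python
# Rough token and cost estimator based on the ~4 characters/token rule of
# thumb for English text. For real billing, use the model's own tokenizer.

def estimate_tokens(text: str) -> int:
    """Approximate token count: ~4 characters per token in English."""
    return round(len(text) / 4)

def estimate_cost(prompt: str, completion: str,
                  price_in_per_1m: float, price_out_per_1m: float) -> float:
    """Estimate USD cost; input and output tokens are billed separately.

    The per-million-token prices are placeholders passed in by the caller,
    since actual rates differ by provider and model.
    """
    cost_in = estimate_tokens(prompt) / 1_000_000 * price_in_per_1m
    cost_out = estimate_tokens(completion) / 1_000_000 * price_out_per_1m
    return cost_in + cost_out

prompt = "Summarize the following article in three bullet points."
print(estimate_tokens(prompt))  # well under a 128K-token context window
```

This also illustrates why input and output tokens matter separately: a short prompt that elicits a long answer can cost more than a long prompt with a one-word reply when output tokens are priced higher.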