What is a Context Window?

TL;DR

The maximum amount of text an AI model can process at once, measured in tokens.

Context Window: Definition & Explanation

The context window is the maximum amount of text a large language model (LLM) can read and reason over in a single request, measured in tokens. A larger context window enables tasks such as summarizing long documents, analyzing large codebases, and maintaining extended conversation histories. For example, GPT-4o has a 128K-token context window, Claude 3.5 Sonnet supports 200K tokens, and Gemini 1.5 Pro offers up to 1 million tokens. Context window size strongly affects a model's practical usefulness, and combining long context with retrieval-augmented generation (RAG) lets a system draw on far more information than fits in the window at once. However, larger context windows also increase processing cost and latency, so using them efficiently is important.
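Because every token in the window counts against the limit, applications typically trim older conversation turns to stay within budget. The sketch below illustrates the idea; it approximates token counts as roughly four characters per token (a rule of thumb, not a real tokenizer), and the message format and function names are illustrative, not from any specific API.

```python
def approx_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English text.

    A real application would use the model's own tokenizer instead.
    """
    return max(1, len(text) // 4)


def trim_history(messages: list[dict], budget: int) -> list[dict]:
    """Drop the oldest messages until the estimated total fits the budget."""
    kept: list[dict] = []
    total = 0
    # Walk from newest to oldest so the most recent turns are kept first.
    for msg in reversed(messages):
        cost = approx_tokens(msg["content"])
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    # Restore chronological order before returning.
    return list(reversed(kept))


history = [
    {"role": "user", "content": "Summarize this 10,000-word report ..."},
    {"role": "assistant", "content": "Here is a summary ..."},
    {"role": "user", "content": "Now list the key risks."},
]
trimmed = trim_history(history, budget=12)
```

With a 12-token budget, only the two most recent turns survive; the oldest message is dropped, which is exactly the trade-off long context windows exist to avoid.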
