What is Long Context?

TL;DR

An LLM's ability to process extremely long inputs, in some models exceeding 1 million tokens, in a single prompt. This enables analysis of entire books at once.

Long Context: Definition & Explanation

Long Context refers to an LLM's ability to handle a very large number of tokens (its context window) in a single input. Google's Gemini 1.5 Pro supports up to 1 million tokens (roughly equivalent to 7 books), and Anthropic's Claude supports up to 200K tokens. This enables comprehensive analysis of lengthy documents, batch review of large codebases, summarization of extended meeting transcripts, and cross-document analysis. The Needle-in-a-Haystack test evaluates whether a model can accurately locate a specific piece of information buried within a long context. Because large amounts of information can be fed to the model directly without RAG (Retrieval-Augmented Generation), long context capabilities can also simplify system architecture.
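The Needle-in-a-Haystack test mentioned above can be illustrated with a minimal sketch. The idea is to bury a "needle" fact at a chosen relative depth inside a long stretch of filler text, ask the model a question whose answer is the needle, and score whether the response contains it. The function names and the scoring rule here are illustrative assumptions, not a standard benchmark implementation, and the actual LLM call is omitted.

```python
def build_haystack(needle: str, filler_sentence: str,
                   total_sentences: int, depth: float) -> str:
    """Insert the needle at a relative depth (0.0 = start, 1.0 = end)
    within repeated filler text. Illustrative sketch only."""
    sentences = [filler_sentence] * total_sentences
    position = int(depth * total_sentences)
    sentences.insert(position, needle)
    return " ".join(sentences)

def found_needle(model_answer: str, expected: str) -> bool:
    """Naive scoring: did the answer contain the expected fact?
    Real evaluations often use fuzzier matching or an LLM judge."""
    return expected.lower() in model_answer.lower()

# Bury the needle halfway through ~1,000 filler sentences,
# then build the prompt that would be sent to a long-context model.
needle = "The secret code is AZURE-42."
haystack = build_haystack(needle, "The sky was clear that day.", 1000, 0.5)
prompt = f"{haystack}\n\nQuestion: What is the secret code?"
```

In a full evaluation, this is repeated across a grid of context lengths and needle depths, producing a heatmap of retrieval accuracy by position.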
