What is Few-shot Learning?
TL;DR
A technique that teaches an AI model new tasks with just a handful of examples, provided directly within the prompt.
Few-shot Learning: Definition & Explanation
Few-shot learning is a technique that enables an AI model to perform new tasks given only a small number of examples (from a few to several dozen). With LLMs, including a few input-output examples in the prompt allows the model to recognize the pattern and apply it to similar inputs. For instance, including just 2-3 examples in the prompt can yield high accuracy on translation, classification, format conversion, or style changes.

Unlike fine-tuning, few-shot learning adapts to new tasks through prompts alone, without modifying the model's weights, which is a significant advantage. Its effectiveness became widely recognized through the GPT-3 paper ("Language Models are Few-Shot Learners", 2020).

Few-shot learning is a fundamental prompt engineering technique used daily with ChatGPT, Claude, Gemini, and other conversational AI tools. It works best when combined strategically with one-shot (one example) and zero-shot (no examples) approaches.
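To make the idea concrete, here is a minimal sketch of how input-output examples are typically laid out in a few-shot prompt, using sentiment classification as an illustrative task. The helper function, the example pairs, and the prompt template are all hypothetical and not tied to any specific model or API.

```python
def build_few_shot_prompt(examples, query,
                          instruction="Classify the sentiment as Positive or Negative."):
    """Format input-output example pairs, then append the new query.

    The model is expected to continue the final 'Sentiment:' line,
    following the pattern established by the examples.
    """
    lines = [instruction, ""]
    for text, label in examples:
        lines.append(f"Text: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Text: {query}")
    lines.append("Sentiment:")  # left open for the model to complete
    return "\n".join(lines)

# Two demonstration pairs -- the "shots" in few-shot learning.
examples = [
    ("The battery lasts all day and the screen is gorgeous.", "Positive"),
    ("It stopped working after a week. Total waste of money.", "Negative"),
]

prompt = build_few_shot_prompt(
    examples, "Setup was quick and the sound quality is excellent."
)
print(prompt)
```

The resulting string would be sent as-is to an LLM; because the prompt ends mid-pattern, the model tends to complete it with a label in the same format as the examples.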