What is Few-shot ICL (In-Context Learning)?
TL;DR
A technique that teaches LLMs a task by providing just a few examples in the prompt. No additional training required.
Few-shot ICL (In-Context Learning): Definition & Explanation
Few-shot ICL (Few-shot In-Context Learning) is a technique where a small number of input-output examples (typically 2-10) are included in the LLM's prompt, enabling the model to infer the task pattern and generate appropriately formatted outputs for new inputs. Unlike fine-tuning, it requires no additional training: task behavior is controlled entirely through prompt design.

For instance, showing a few examples like "Q: What is the population of Tokyo? A: Approximately 14 million" before asking a new question leads the model to answer in the same format. Accuracy generally improves as examples are added, from zero-shot (no examples) to one-shot (one example) to few-shot (several examples). This approach became widely known through the GPT-3 paper, "Language Models are Few-Shot Learners" (Brown et al., 2020).
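The prompt construction described above can be sketched in a few lines. This is a minimal illustration (no actual model call is made); the helper name, the example questions, and the Q/A layout are assumptions chosen to mirror the population example, not a fixed standard.

```python
def build_few_shot_prompt(examples, new_question):
    """Format input-output example pairs followed by a new question,
    so the model can infer the task pattern from the prompt alone."""
    lines = []
    for question, answer in examples:
        lines.append(f"Q: {question}")
        lines.append(f"A: {answer}")
    lines.append(f"Q: {new_question}")
    lines.append("A:")  # the model continues generation from here
    return "\n".join(lines)

# Illustrative few-shot examples (hypothetical values for demonstration).
examples = [
    ("What is the population of Tokyo?", "Approximately 14 million"),
    ("What is the population of Osaka?", "Approximately 2.7 million"),
    ("What is the population of Nagoya?", "Approximately 2.3 million"),
]

prompt = build_few_shot_prompt(examples, "What is the population of Fukuoka?")
print(prompt)
```

The resulting string is sent to the model as-is; because the prompt ends with a dangling "A:", the model completes it in the same question-answer format demonstrated by the examples.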