What is Mixture of Agents (MoA)?
TL;DR
A technique that hierarchically combines multiple LLMs to achieve performance exceeding any single model.
Mixture of Agents (MoA): Definition & Explanation
Mixture of Agents (MoA) is a technique that hierarchically combines multiple LLMs to achieve performance surpassing any individual model. Whereas Mixture of Experts (MoE) is an architecture internal to a single model, MoA combines independent models externally. In the first layer, several LLMs (such as GPT-4o, Claude, and Gemini) each answer the same question; in a subsequent layer, an aggregator model integrates their responses into the final answer. Research from Together AI demonstrated that an MoA configuration exceeded GPT-4o on the AlpacaEval 2.0 benchmark.
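The two-layer flow described above can be sketched in a few lines of Python. This is a minimal illustration, not the Together AI implementation: the proposer and aggregator functions here are hypothetical stand-ins for real LLM API calls.

```python
# Minimal Mixture of Agents sketch. In practice each function below
# would call a real LLM API (e.g., GPT-4o, Claude, Gemini); here they
# are stubs so the control flow is runnable on its own.

def proposer_a(question: str) -> str:
    # Stand-in for the first proposer model.
    return f"Model A's answer to: {question}"

def proposer_b(question: str) -> str:
    # Stand-in for the second proposer model.
    return f"Model B's answer to: {question}"

def proposer_c(question: str) -> str:
    # Stand-in for the third proposer model.
    return f"Model C's answer to: {question}"

def aggregate(question: str, proposals: list[str]) -> str:
    # Stand-in for the aggregator model: it would receive the original
    # question plus all first-layer answers in one prompt and produce
    # a synthesized final response.
    prompt = f"Question: {question}\n" + "\n".join(
        f"Answer {i + 1}: {p}" for i, p in enumerate(proposals)
    )
    return f"Final answer synthesized from {len(proposals)} proposals"

def mixture_of_agents(question, proposers, aggregator):
    # Layer 1: every proposer independently answers the same question.
    proposals = [propose(question) for propose in proposers]
    # Layer 2: the aggregator integrates the proposals into one answer.
    return aggregator(question, proposals)

result = mixture_of_agents(
    "What causes tides?",
    [proposer_a, proposer_b, proposer_c],
    aggregate,
)
```

The MoA paper also stacks additional intermediate layers, where each layer's models see the previous layer's outputs as auxiliary context; the sketch above shows only the simplest proposer-then-aggregator case.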