MytheAi

Glossary entry

Foundation Model

A large, general-purpose model trained on broad data that serves as the base for many downstream applications.

A foundation model is a large, general-purpose model trained on broad data and adapted to many downstream tasks via prompting, fine-tuning, or retrieval-augmented generation (RAG). The term was coined by Stanford researchers in 2021. GPT-4o, Claude, Gemini, Llama, and Mistral are foundation models for text; Stable Diffusion, Flux, and Imagen are foundation models for images.

The economics of foundation models concentrate at the largest labs because frontier-scale training runs cost hundreds of millions of dollars. Most application builders therefore access foundation models through an API rather than training their own.
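In practice, "using a foundation model via API" means sending a structured request to a hosted endpoint rather than training anything. A minimal sketch, assuming an OpenAI-style chat-completions payload; the model identifier and endpoint URL below are hypothetical placeholders, not a specific vendor's contract:

```python
# Sketch: adapting a hosted foundation model via prompting.
# Payload shape follows the common chat-completions convention;
# model name and endpoint are illustrative assumptions.

def build_chat_request(model: str, system: str, user: str,
                       temperature: float = 0.2) -> dict:
    """Assemble a chat-completion request body for a hosted foundation model."""
    return {
        "model": model,                    # hosted foundation model ID
        "temperature": temperature,        # low value for more consistent output
        "messages": [
            {"role": "system", "content": system},  # task framing ("prompting")
            {"role": "user", "content": user},
        ],
    }

request = build_chat_request(
    model="example-foundation-model",      # hypothetical identifier
    system="You are a concise glossary assistant.",
    user="Define 'foundation model' in one sentence.",
)
# The body would then be POSTed with an API key, e.g.:
# requests.post("https://api.example.com/v1/chat/completions",
#               json=request, headers={"Authorization": "Bearer <key>"})
```

The same base model serves wildly different applications simply by swapping the system and user messages, which is why adaptation by prompting is the cheapest rung on the ladder before fine-tuning.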

Written by

John Ethan

Founder & Editor-in-Chief

Founder of MytheAi. Tracking and reviewing AI and SaaS tools since January 2026. Built MytheAi out of frustration with pay-to-rank listicles and SEO-driven AI directories that prioritize ad revenue over honest guidance. Hands-on testing across 500+ tools to date.

Last reviewed 2026