LiteLLM
An open-source Python library and proxy server that exposes a unified, OpenAI-compatible API for calling 100+ LLM providers. Developers write code against the LiteLLM interface once and switch between OpenAI, Anthropic, Azure OpenAI, Google Gemini, Cohere, Mistral, Ollama, and many others by changing a single model string, without rewriting API call logic. The LiteLLM Proxy Server mode adds a production-grade gateway with load balancing across multiple API keys, automatic retries and fallbacks, per-team or per-project cost tracking, rate limiting, and logging to observability tools. Budget controls prevent individual teams from exceeding their allocated API spend. Open source under the MIT license on GitHub; a hosted proxy option is also available. Popular with MLOps engineers, AI platform teams, and developers working across multiple LLM providers who need a single unified interface.