LiteLLM
5.8/10 · $540/yr · Best free OSS proxy: Apache 2 / MIT Python with 100+ providers
Apache 2 / MIT Python proxy translating 100+ provider APIs to OpenAI Chat Completions schema; YC W23.
| Plan | Monthly | Annual | What you get |
|---|---|---|---|
| Open Source | Free | — | Apache 2 / MIT-licensed self-hosted Python proxy that translates 100+ provider APIs to the OpenAI Chat Completions schema; free forever |
| Cloud Free | Free | — | Free hosted tier with limited request volume and standard provider integrations |
| Cloud Pro | $50.00/mo | $600.00/yr | $50 per user per month with cost tracking, budgets, team management, and virtual API keys |
| Enterprise | $2,000.00/mo | $24,000.00/yr | Custom contract with self-hosted enterprise, SSO, SOC 2, and HIPAA available |
LiteLLM is the Apache 2 / MIT-licensed OSS proxy pick and has the largest community among self-hosted LLM gateway proxies. Founded in 2023 by BerriAI Inc. (Y Combinator W23) in San Francisco. The wedge for free readers: a permissive Apache 2 / MIT license, a Python proxy that translates 100-plus provider APIs to the OpenAI Chat Completions schema, and self-hosting at zero licensing cost. Run it as a self-hosted service in front of your model providers, and your application code targets only the OpenAI SDK.
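The "application targets the OpenAI schema only" point can be sketched with the standard library alone. A minimal sketch, assuming a self-hosted LiteLLM proxy at `localhost:4000` and a LiteLLM virtual key (both placeholders, not values from this review):

```python
import json
import urllib.request

# The application only ever speaks the OpenAI Chat Completions schema;
# the proxy translates the request for whichever provider backs the
# requested model.
payload = {
    "model": "claude-3-5-sonnet",  # resolved by the proxy, not called directly
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    "http://localhost:4000/v1/chat/completions",  # hypothetical proxy address
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer sk-litellm-virtual-key",  # virtual key, not a provider key
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would return an OpenAI-shaped response;
# swapping providers means changing only "model", never this client code.
```

In practice you would point the official OpenAI SDK's `base_url` at the proxy instead of building requests by hand; the wire format is the same either way.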
Open Source is Apache 2 / MIT-licensed and free forever for self-hosted deployment, with 100-plus provider integrations and community support. Cloud Free is a hosted tier with limited request volume. Cloud Pro is the upgrade tier at $50 per user per month with cost tracking, budgets, team management, and virtual API keys. Enterprise covers self-hosted enterprise with SSO and SOC 2.
The trade-off versus OpenRouter is operational tax; self-hosting LiteLLM requires Python operational maturity to run reliably. The trade-off versus Cloudflare is delivery model; LiteLLM is a proxy you run in front of providers, whereas Cloudflare's gateway is bundled with Workers infrastructure. For free Python-stack readers who want full code control and zero markup, LiteLLM is the right call.
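Self-hosting means maintaining the proxy's routing table yourself, which LiteLLM expresses as a YAML config. A minimal sketch; the model names and environment variables below are illustrative, not prescribed by this review:

```yaml
model_list:
  - model_name: gpt-4o                 # the name your application requests
    litellm_params:
      model: openai/gpt-4o             # the actual provider/model behind it
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
```

The indirection between `model_name` and `litellm_params.model` is what makes provider swaps a config change rather than a code change.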
Pros
- Apache 2 / MIT permissive license; all commercial use allowed at any revenue level
- 100+ provider integrations free forever for self-hosted deployment
- Drop-in replacement at the application level: OpenAI SDK calls work unchanged
- Self-host on your own infrastructure for full code control and zero markup
- Y Combinator W23; the most-recognized OSS LLM proxy among Python developers
Cons
- Self-hosting the Python proxy requires operational maturity to run reliably
- Cloud Pro per-user pricing scales fast for larger teams; self-host is cheaper above 5 users
Best for: Free Python-stack readers who want an Apache 2 / MIT OSS proxy for full code control and zero markup with self-host operational maturity.
- Routing: 10
- Latency: 8
- DX: 7
- Value: 10
- Support: 7