Best Free LLM Gateways of 2026

Updated · 5 picks · live pricing · affiliate disclosure

Best overall · 5.8/10 · $540/yr more

LiteLLM

Apache 2 / MIT Python proxy translating 100+ provider APIs to OpenAI Chat Completions schema; YC W23.

OSS Apache 2 / MIT free forever; cancel Cloud anytime

How it stacks up

  • OSS Apache 2 / MIT free (vs Cloudflare bundled)
  • Cloud Pro $50/user/mo (vs OpenRouter SaaS)
  • 100+ provider integrations (vs Portkey OSS Apache 2.0)

#2 Portkey · 5.8/10 · from $49/mo
#3 Cloudflare AI Gateway · 5.6/10 · from $5/mo

All picks at a glance

| # | Pick | Best for | Starting | Score |
|---|------|----------|----------|-------|
| 1 | LiteLLM | Best free OSS proxy, Apache 2 / MIT Python with 100+ providers | $50.00/mo | 5.8/10 |
| 2 | Portkey | Best free production OSS gateway, Apache 2.0 with caching and guardrails | $49.00/mo | 5.8/10 |
| 3 | Cloudflare AI Gateway | Best free edge-native, 10K requests a day across 300+ data centers | $5.00/mo | 5.6/10 |
| 4 | Vercel AI Gateway | Best free Vercel-native, bundled with Vercel Hobby | $20.00/mo | 5.5/10 |
| 5 | OpenRouter | Best free API aggregator, 300+ models with $1 starter credit | $20.00/mo | 5.1/10 |

Quick pick by use case

If you only have thirty seconds, find your situation below and skip to that pick.

Compare all 5 picks

| # | Pick | Score | Monthly | Annual | Paid vs free | Top spec |
|---|------|-------|---------|--------|--------------|----------|
| 1 | LiteLLM | 5.8/10 | $50.00/mo | $600.00/yr | $540/yr more | OSS Apache 2 / MIT free |
| 2 | Portkey | 5.8/10 | $49.00/mo | $588.00/yr | $528/yr more | OSS Apache 2.0 free self-host |
| 3 | Cloudflare AI Gateway | 5.6/10 | $200.00/mo | $2,400.00/yr | $2,340/yr more | Free 10K req/day |
| 4 | Vercel AI Gateway | 5.5/10 | $20.00/mo | $240.00/yr | $180/yr more | Free with Vercel Hobby |
| 5 | OpenRouter | 5.1/10 | $20.00/mo | | $180/yr more | Free $1 starter credit |
#1

LiteLLM

5.8/10 · $540/yr more

Best free OSS proxy, Apache 2 / MIT Python with 100+ providers

Apache 2 / MIT Python proxy translating 100+ provider APIs to OpenAI Chat Completions schema; YC W23.

| Plan | Monthly | Annual | What you get |
|------|---------|--------|--------------|
| Open Source | Free | | Apache 2 / MIT-licensed self-hosted Python proxy that translates 100+ provider APIs to the OpenAI Chat Completions schema; free forever |
| Cloud Free | Free | | Free hosted tier with limited request volume and standard provider integrations |
| Cloud Pro | $50.00/mo | $600.00/yr | $50 per user a month with cost tracking, budgets, team management, and virtual API keys |
| Enterprise | $2,000.00/mo | $24,000.00/yr | Custom contract with self-hosted enterprise, SSO, SOC 2, and HIPAA available |

LiteLLM is the Apache 2 / MIT-licensed OSS proxy pick and has the largest community among self-hosted LLM gateway proxies. It was founded in 2023 by BerriAI Inc (Y Combinator W23) in San Francisco. The wedge for free readers: an Apache 2 / MIT permissive license, a Python proxy that translates 100-plus provider APIs to the OpenAI Chat Completions schema, and self-hosting at zero licensing cost. Run it as a self-hosted service in front of your model providers, and your application code targets the OpenAI SDK only.
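The drop-in claim is easy to make concrete. A minimal sketch, assuming a proxy listening at `localhost:4000` (an illustrative address, not a quoted default): behind a translating proxy, the application builds the same OpenAI Chat Completions request body no matter which upstream provider handles it.

```python
import json

# Hypothetical proxy address; any LiteLLM-style deployment works the same way.
PROXY_URL = "http://localhost:4000/v1/chat/completions"

def chat_body(model: str, prompt: str) -> dict:
    """Build an OpenAI Chat Completions request body.

    Behind a translating proxy, the model string is the only thing that
    changes when you swap providers; the schema stays constant.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Same schema whether the proxy forwards to OpenAI or Anthropic:
a = chat_body("gpt-4o", "Summarise this ticket.")
b = chat_body("claude-3-5-sonnet", "Summarise this ticket.")
assert a.keys() == b.keys()
print(json.dumps(a, indent=2))
```

The model names above are illustrative; the point is that swapping providers never touches the request schema your application emits.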

Open Source is Apache 2 / MIT-licensed and free forever for self-hosted deployment with 100-plus provider integrations and community support. Cloud Free is a hosted tier with limited request volume. Cloud Pro is the upgrade tier at fifty dollars per user monthly with cost tracking, budgets, team management, and virtual API keys. Enterprise covers self-hosted enterprise with SSO and SOC 2.

The trade-off versus OpenRouter is operational tax; LiteLLM self-host requires Python operational maturity to run reliably. The trade-off versus Cloudflare is delivery model; LiteLLM is a proxy you run in front of providers where Cloudflare is bundled with Workers infrastructure. For free Python-stack readers who want full code control and zero markup, LiteLLM is the right call.

Pros

  • Apache 2 / MIT permissive license; all commercial use allowed at any revenue level
  • 100+ provider integrations free forever for self-hosted deployment
  • Drop-in replacement at the application level: OpenAI SDK calls work unchanged
  • Self-host on your own infrastructure for full code control and zero markup
  • Y Combinator W23; the most-recognized OSS LLM proxy among Python developers

Cons

  • Self-hosted Python proxy requires Python operational maturity to run reliably
  • Cloud Pro per-user pricing scales fast for larger teams; self-host is cheaper above 5 users
OSS Apache 2 / MIT free · Cloud Pro $50/user/mo · 100+ provider integrations · OSS Apache 2 / MIT free forever; cancel Cloud anytime

Best for: Free Python-stack readers who want an Apache 2 / MIT OSS proxy for full code control and zero markup with self-host operational maturity.

Routing 10 · Latency 8 · DX 7 · Value 10 · Support 7
#2

Portkey

5.8/10 · $528/yr more

Best free production OSS gateway, Apache 2.0 with caching and guardrails

Apache 2.0 gateway since March 2026; caching, fallbacks, prompt management, guardrails on self-host free.

| Plan | Monthly | Annual | What you get |
|------|---------|--------|--------------|
| Open Source | Free | | Apache 2.0 self-hosted gateway with Universal API, retries, routing, guardrails, automatic fallbacks, basic dashboard, and load balancing; free forever (open-sourced March 2026) |
| Developer Free | Free | | Free hosted tier with 10K recorded logs a month, Universal API, key management, 3 prompt templates, and basic observability |
| Production | $49.00/mo | $588.00/yr | $49 a month for 100K logs with unlimited prompt templates, alerts, LLM guardrails, semantic caching, RBAC, and overages at $9 per additional 100K (up to 3M) |
| Enterprise | $2,000.00/mo | $24,000.00/yr | Custom contract with 10M-plus logs, custom retention, advanced guardrails, SSO, VPC hosting, SOC 2 Type II, GDPR, and HIPAA |

Portkey is the production-gateway pick; it fully open-sourced under Apache 2.0 on March 24, 2026 and offers the cleanest free-forever path to production-scale features. The company was founded in 2023 in San Francisco. The wedge: the Apache 2.0 self-host ships caching, fallbacks, prompt management, guardrails, RBAC, and observability across 40-plus providers. Cloud handles around one trillion tokens a day across roughly 24,000 organizations.
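What "automatic fallbacks" buys you can be sketched in a few lines. This is not Portkey's API, just the behaviour its fallback strategy automates, shown with stand-in provider functions:

```python
from typing import Callable, Optional, Sequence

def with_fallback(targets: Sequence[Callable[[str], str]], prompt: str) -> str:
    """Try each target in order and return the first successful response.

    A gateway's fallback strategy gives you this declaratively; here it
    is spelled out imperatively so the behaviour is visible.
    """
    last_exc: Optional[Exception] = None
    for call in targets:
        try:
            return call(prompt)
        except Exception as exc:       # a real gateway would also retry/back off
            last_exc = exc
    raise RuntimeError("all targets failed") from last_exc

# Stand-in providers for illustration only:
def flaky_primary(prompt: str) -> str:
    raise TimeoutError("upstream timeout")

def healthy_secondary(prompt: str) -> str:
    return f"ok: {prompt}"

assert with_fallback([flaky_primary, healthy_secondary], "ping") == "ok: ping"
```

A production gateway layers retries, load balancing, and per-target timeouts on top of this core loop; the ordering of `targets` is the routing policy.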

Open Source Apache 2.0 is free forever for self-host with Universal API, retries, routing, guardrails, and load balancing. Developer Free hosted covers 10,000 logs a month with key management. Production is the upgrade tier at forty-nine dollars monthly with 100,000 logs and semantic caching. Enterprise covers VPC hosting, SSO, SOC 2 Type II, GDPR, and HIPAA.

The trade-off versus LiteLLM is scope; Portkey is a full production gateway where LiteLLM is a focused proxy. The trade-off versus Cloudflare is delivery model; Portkey is self-hosted rather than bundled with edge. For free production teams who want caching and guardrails on a self-hosted Apache 2.0 gateway, Portkey is the right call.

Pros

  • Apache 2.0 OSS gateway since March 2026; full production feature set free for self-host
  • Caching, fallbacks, prompt management, guardrails, and observability on one dashboard
  • Cloud handles around 1 trillion tokens daily across roughly 24,000 organizations
  • Developer Free hosted covers 10,000 logs a month for evaluation
  • Production upgrade at forty-nine dollars monthly with semantic caching

Cons

  • Self-host requires you to run and monitor the gateway versus managed Cloud
  • Indie-dev brand recognition narrower than OpenRouter's
OSS Apache 2.0 free self-host · Developer Free 10K logs/mo · Production $49/mo upgrade · OSS Apache 2.0 free forever; Developer Free cancel-anytime

Best for: Free production teams who want a self-hosted Apache 2.0 gateway with caching and guardrails on dollar-zero licensing cost.

Routing 9 · Latency 9 · DX 8 · Value 8 · Support 8
#3

Cloudflare AI Gateway

5.6/10 · $2,340/yr more

Best free edge-native, 10K requests a day across 300+ data centers

Edge-native gateway bundled with Cloudflare Workers; free 10K requests a day; launched 2023.

| Plan | Monthly | Annual | What you get |
|------|---------|--------|--------------|
| Free | Free | | Free 10K requests a day with caching, analytics, rate limits, and OpenAI/Anthropic/Workers AI provider support |
| Workers Paid | $5.00/mo | $60.00/yr | $5 a month for the Workers Paid plan with higher request limits and edge-native gateway access |
| Workers Standard | $200.00/mo | $2,400.00/yr | $200 a month for Workers Standard with 50M Workers requests included and unlimited AI Gateway calls |
| Enterprise | $2,000.00/mo | $24,000.00/yr | Custom contract with dedicated regions, private connect, SLA, and audit logs |

Cloudflare AI Gateway is the edge-native pick and the cleanest free-forever path for indie developers running serverless apps. It launched in 2023 from Cloudflare (founded 2009 in San Francisco). The wedge for free readers: the gateway is bundled with Cloudflare Workers across 300-plus data centers, calls run from the closest Cloudflare PoP, and cached responses are served at the edge with zero infrastructure setup.
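The edge-cache behaviour is simple to model. A toy sketch, not Cloudflare's implementation: identical request bodies are served from cache and never reach the paid upstream provider.

```python
import hashlib
import json

class ResponseCache:
    """Tiny model of a gateway response cache: identical request bodies
    hit the cache and skip the (billed) upstream call."""

    def __init__(self) -> None:
        self._store: dict[str, str] = {}
        self.upstream_calls = 0

    def _key(self, body: dict) -> str:
        # Deterministic key over the canonicalised request body.
        return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

    def complete(self, body: dict, call_upstream) -> str:
        key = self._key(body)
        if key not in self._store:
            self.upstream_calls += 1
            self._store[key] = call_upstream(body)
        return self._store[key]

cache = ResponseCache()
fake_provider = lambda body: "cached answer"   # stand-in for a model call
body = {"model": "gpt-4o", "messages": [{"role": "user", "content": "hi"}]}
cache.complete(body, fake_provider)
cache.complete(body, fake_provider)            # identical request: cache hit
assert cache.upstream_calls == 1
```

A real edge cache adds TTLs and per-route cache keys, but the cost story is the same: repeated identical prompts cost one provider call, not many.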

Free covers 10,000 requests a day with caching, analytics, rate limits, and OpenAI, Anthropic, and Workers AI provider support. Workers Paid is the cheapest paid upgrade at five dollars monthly. Workers Standard, the next tier up, bundles 50 million Workers requests. Most free indie developers stay on Free indefinitely; Workers Paid is the upgrade once daily volume passes 10,000 requests.

The trade-off versus OpenRouter is provider count; Cloudflare ships fewer model providers than OpenRouter's 60-plus. The trade-off versus LiteLLM is gateway feature breadth; Cloudflare ships caching but lighter prompt management. For free indie readers building edge-native serverless apps, Cloudflare is the right call.

Pros

  • Free 10,000 requests a day with caching, analytics, and rate limits
  • Edge-native: calls run from the closest Cloudflare PoP across 300+ data centers
  • Workers Paid at five dollars monthly is the cheapest paid LLM gateway entry
  • Tight integration with Cloudflare Workers for serverless edge LLM apps
  • OpenAI, Anthropic, and Workers AI providers supported on Free

Cons

  • Fewer model providers than OpenRouter (60+ catalog versus Cloudflare baseline)
  • Gateway feature set lighter than LiteLLM self-host or Portkey OSS
Free 10K req/day · Workers Paid $5/mo · Edge across 300+ PoPs · Free 10K req/day permanent; cancel-anytime

Best for: Free indie developers building edge-native serverless apps on Cloudflare Workers who want zero infrastructure setup and the cheapest paid upgrade path.

Routing 8 · Latency 10 · DX 7 · Value 8 · Support 7
#4

Vercel AI Gateway

5.5/10 · $180/yr more

Best free Vercel-native, bundled with Vercel Hobby

Free with Vercel Hobby plus zero token markup; tight AI SDK v5 integration; launched 2024.

| Plan | Monthly | Annual | What you get |
|------|---------|--------|--------------|
| Hobby | Free | | Free with the Vercel Hobby plan with hundreds of models, no token markup, and built-in observability |
| Pro | $20.00/mo | $240.00/yr | $20 per user a month bundled with the Vercel Pro plan with load balancing, fallbacks, and budgets |
| Enterprise | $2,000.00/mo | $24,000.00/yr | Custom contract with private clusters, SAML SSO, audit logs, dedicated support, and SLA |

Vercel AI Gateway is the frontend-stack-native pick and the cleanest path for Next.js teams already on Vercel. It launched in 2024 from Vercel (founded 2015 in San Francisco). The wedge for free readers: the gateway is bundled with the Vercel Hobby plan at zero cost, ships hundreds of models behind a unified endpoint with zero token markup, and integrates tightly with the AI SDK v5 ecosystem for the cleanest Next.js stack experience.

Hobby is free with the Vercel Hobby plan with hundreds of models, no markup, BYOK support, and built-in observability. Pro is the upgrade tier at twenty dollars per user monthly bundled with Vercel Pro and adds load balancing, fallbacks, budgets, and spend monitoring. Enterprise covers private clusters and SAML SSO. Most free Next.js developers stay on Hobby indefinitely; the Pro upgrade is for production team workflows.

The trade-off versus OpenRouter is provider count; Vercel ships hundreds of models versus OpenRouter's 300-plus. The trade-off versus Cloudflare is edge-native delivery; Vercel runs from Vercel's edge but Cloudflare's edge is broader. For free Next.js readers already on Vercel and using the AI SDK, Vercel AI Gateway is the friction-free path.

Pros

  • Free with Vercel Hobby; zero token markup on inference
  • Tight integration with Vercel AI SDK v5 for the cleanest Next.js stack experience
  • Hundreds of models behind a unified endpoint with built-in observability
  • BYOK supported: bring your own provider keys without surrendering control
  • Pro upgrade at twenty dollars per user bundled with Vercel Pro

Cons

  • Ecosystem lock-in: the AI SDK integration is the load-bearing differentiator, and it loses its value outside the Vercel ecosystem
  • Less feature breadth than Portkey OSS: no guardrails, narrower prompt management
Free with Vercel Hobby · Pro $20/user bundled · AI SDK v5 native · Free with Vercel Hobby; cancel anytime

Best for: Free Next.js developers already on Vercel who want the cleanest AI SDK integration and accept ecosystem lock-in as the trade-off.

Routing 8 · Latency 9 · DX 10 · Value 9 · Support 8
#5

OpenRouter

5.1/10 · $180/yr more

Best free API aggregator, 300+ models with $1 starter credit

300+ models from 60+ providers behind a single OpenAI-compatible endpoint; free $1 starter credit.

| Plan | Monthly | What you get |
|------|---------|--------------|
| Free | Free | Free $1 credit with access to 300+ models from 60+ providers behind a unified OpenAI-compatible API |
| Pay-as-you-go | Free | Pass-through provider pricing with zero markup on inference; a small Stripe fee applies on credit purchases |
| Pro | $20.00/mo | $20 a month credit minimum with higher rate limits, Slack support, and the indie-dev paid entry |
| Enterprise | $1,000.00/mo | Custom contract with dedicated routing, private deploys, SLA, and audit logs |

OpenRouter is the indie-dev brand reference for LLM gateways and the cleanest path for free model evaluation. The company was founded in 2023 in San Francisco. The wedge for free readers: every account starts with a one-dollar credit that grants full access to all 300-plus models from 60-plus providers behind a single OpenAI-compatible endpoint. No credit card is required for evaluation.

Free gives every account a one-dollar starter credit with full access to all models. Pay-as-you-go bills consumed tokens at provider price with no inference markup; a small Stripe processing fee applies on credit purchases. Pro is the upgrade tier at twenty dollars monthly as a credit minimum unlocking higher rate limits and Slack support. Most free indie developers run small evaluations on the starter credit then move to pay-as-you-go.
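How quickly the one-dollar credit depletes is straightforward arithmetic. The per-token prices below are purely illustrative, not any provider's quoted rate:

```python
def requests_per_credit(credit_usd: float, tokens_in: int, tokens_out: int,
                        price_in_per_m: float, price_out_per_m: float) -> int:
    """How many requests a credit funds, with prices quoted per 1M tokens."""
    cost_per_request = (tokens_in * price_in_per_m
                        + tokens_out * price_out_per_m) / 1_000_000
    return int(credit_usd // cost_per_request)

# Illustrative prices only ($2.50/M input, $10.00/M output):
n = requests_per_credit(1.00, 1_000, 500, 2.50, 10.00)
print(n)  # 133 short chat requests before the $1 credit is spent
```

At those assumed rates a 1,000-in/500-out request costs $0.0075, so the credit covers on the order of a hundred small requests; larger prompts or pricier models exhaust it far faster, which is why the credit funds evaluation rather than ongoing use.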

The trade-off versus Cloudflare is request volume on the free path; OpenRouter's starter credit funds testing rather than ongoing free use. The trade-off versus LiteLLM is OSS optionality; OpenRouter is closed-source SaaS where LiteLLM is Apache 2 / MIT self-host. For free indie readers evaluating multiple models without managing provider keys, OpenRouter is friction-free.

Pros

  • One-dollar starter credit with full access to 300+ models for evaluation
  • 300+ models from 60+ providers behind one OpenAI-compatible endpoint
  • Pass-through provider pricing with zero markup on inference tokens
  • No credit card required for evaluation; auto-fallback between providers
  • Founded 2023; the most-recognized LLM gateway name among indie developers

Cons

  • Free starter credit funds testing, not ongoing free use like Cloudflare
  • Closed-source SaaS versus auditable LiteLLM Apache 2 / MIT or Portkey Apache 2.0
Free $1 starter credit · Pay-as-you-go provider rate · 300+ models · Free $1 starter credit; cancel anytime

Best for: Free indie developers evaluating multiple LLM models without managing provider keys who accept the starter-credit ceiling as the trade-off.

Routing 7 · Latency 8 · DX 10 · Value 9 · Support 7

How we picked

Each pick gets a transparent composite score from price, features, free-tier availability, and editor fit. Pricing flows from our live database, so when a vendor changes prices the score updates here too.

We weight price at 40 percent, features at 30, free tier at 15, and fit at 15. Cloudflare AI Gateway has the strongest free-forever profile in this lineup: free 10,000 requests a day on the edge with zero infrastructure setup is the most defensible indie-dev free path, even though LiteLLM and Portkey edge it out on the composite score. See the parent /best/llm-gateways guide for paid-only picks like Helicone and observability-only picks like Langfuse, which are excluded from this lens.

We don't claim "30,000 hours of testing." Our methodology is the formula above plus the editor's published verdict for each pick. Verifiable, auditable, and updated when the underlying data changes.

Why trust Subrupt

We're a subscription tracker first, a buying guide second. Every claim on this page is something you can check.

By use case

Best free edge-native

Cloudflare AI Gateway

Read the full review →

Best free API aggregator

OpenRouter

Read the full review →

Best free Vercel-native

Vercel AI Gateway

Read the full review →

Best free OSS proxy

LiteLLM

Read the full review →

Best free production OSS gateway

Portkey

Read the full review →

How to choose your Free LLM Gateway

OSS self-host vs SaaS free tier is the only question that matters

Most 'best free LLM gateway' lists conflate two genuinely different paths. OSS self-host covers LiteLLM (Apache 2 / MIT proxy) and Portkey (Apache 2.0 gateway since March 2026). Self-host is dollar-zero on licensing but readers pay infrastructure (typically a small VPS at five to fifteen dollars monthly) and absorb the operational tax of running Python or Node.js plus dependencies. SaaS free tiers cover Cloudflare AI Gateway (10,000 requests a day on the edge), OpenRouter ($1 starter credit), and Vercel AI Gateway (free with Vercel Hobby). The reader who searches 'best free LLM gateway' and has DevOps capacity wants OSS self-host; the reader without DevOps wants a SaaS free tier and accepts the cap as the trade-off.

Stack fit decides the right SaaS free tier

The three SaaS free tiers each match a different stack. Cloudflare AI Gateway fits teams already running Cloudflare Workers; the gateway is bundled with Workers infrastructure and ships edge-native delivery across 300-plus data centers. OpenRouter fits indie developers evaluating multiple models without managing provider keys; the one-dollar starter credit funds testing across 300-plus models. Vercel AI Gateway fits Next.js teams already on Vercel; the AI SDK v5 integration is the load-bearing differentiator. Off the matched stack, each gateway loses meaningful value. Pick by where your application already runs, not by which free tier looks most generous on paper.

Apache 2 / MIT vs Apache 2.0 vs no license

Both OSS picks here ship permissive licenses. LiteLLM is Apache 2 / MIT (permissive at any revenue level). Portkey open-sourced its full gateway under Apache 2.0 on March 24, 2026, also permissive at any revenue level. Out-of-catalog OSS projects worth knowing include Bifrost (Go-based, with around 11 microseconds of overhead at 5,000 requests per second; Apache 2.0), Kong AI Gateway (Apache 2.0), and Apache APISIX (Apache 2.0). All five catalog and out-of-catalog OSS options permit commercial use, including SaaS resale, at any revenue level; license posture is uniform. The decision pivots on architectural fit (proxy versus full gateway versus edge) rather than license restrictions.

Cap thresholds on SaaS free tiers

Each SaaS free tier here has a different cap and reader regret lives where the cap bites first. Cloudflare AI Gateway Free covers 10,000 requests a day; the cap bites at roughly seven requests per minute sustained or any production-scale traffic. OpenRouter Free is a one-dollar starter credit; the cap bites quickly at any meaningful evaluation because LLM inference burns through the credit fast. Vercel AI Gateway Free is bundled with Vercel Hobby; the cap is the Vercel Hobby plan limits rather than gateway-specific. The honest sequencing: solo evaluation readers run Cloudflare and Vercel free tiers indefinitely; OpenRouter free is best for one-off model comparisons; production-bound readers hit upgrade triggers within weeks.
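The roughly-seven-requests-per-minute figure for the Cloudflare cap is simple division over the daily allowance:

```python
daily_cap = 10_000                    # Cloudflare AI Gateway free tier, req/day
minutes_per_day = 24 * 60             # 1,440
sustained_rpm = daily_cap / minutes_per_day
print(round(sustained_rpm, 2))        # ~6.94 requests/minute sustained
```

Any traffic pattern averaging above that rate across a full day hits the cap; bursty traffic hits it sooner if the burst alone consumes the daily budget.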

When to upgrade to a paid tier (cross-link to parent)

Free paths cover most starter projects but each pick has a clear upgrade trigger. Cloudflare AI Gateway Free outgrows past 10,000 daily requests; Workers Paid at five dollars monthly. OpenRouter Free starter credit outgrows quickly; pay-as-you-go covers usage. Vercel AI Gateway Free outgrows when team workflows need shared budgets; Pro at twenty dollars per user. LiteLLM OSS self-host outgrows when Python ops capacity becomes the bottleneck; Cloud Pro at fifty dollars per user. Portkey OSS self-host outgrows similarly; Production at forty-nine dollars monthly with overages. At any of those triggers, see [our /best/llm-gateways guide](/best/llm-gateways) for the broader paid lineup including Helicone Pro, Langfuse Core, and the upgrade tiers excluded from this lens.

Frequently asked questions

Are these free tiers genuinely free forever or limited trials?

OSS self-host (LiteLLM Apache 2 / MIT, Portkey Apache 2.0) is genuinely free forever on licensing; you pay infrastructure only. SaaS free tiers are free indefinitely but with caps: Cloudflare AI Gateway Free covers 10,000 requests a day, Vercel AI Gateway Free is bundled with Vercel Hobby. OpenRouter Free is a one-dollar starter credit rather than ongoing free use; once depleted, pay-as-you-go covers usage at provider rates with no inference markup. None of the picks are time-limited trials.

Does Subrupt earn a commission from these free picks?

On a few. We disclose this on every /best page. Self-hosted OSS (LiteLLM Apache 2 / MIT, Portkey Apache 2.0) has no affiliate path because there is no transaction. LiteLLM, Portkey, OpenRouter, Vercel, and Cloudflare all have paid cloud plans where we earn a commission only on conversion. The composite ranking weights price at 40 percent, features at 30, free tier at 15, and fit at 15; none of the weights is tuned by affiliate rate.

Why is Cloudflare ranked above OpenRouter?

Cloudflare wins on free-tier defensibility because 10,000 requests a day on the edge plus zero infrastructure setup is the strongest indie-dev free-forever profile. OpenRouter wins on model variety specifically with 300-plus models from 60-plus providers. The decision pivots on use case. Edge-native serverless apps pick Cloudflare. Multi-model evaluation picks OpenRouter. Both are friction-free for free indie readers.

How much does self-hosting LiteLLM or Portkey cost in infrastructure?

A small LiteLLM deployment runs on a single VPS with Python at around five to fifteen dollars monthly. A small Portkey OSS deployment runs on a similar VPS with Node.js at similar economics. Production typically runs staging and prod, closer to thirty dollars monthly. Above that, you absorb the operational tax of patching, backups, monitoring, and on-call. Total cost is comparable to LiteLLM Cloud Pro at fifty dollars per user; the trade-off is your labor versus a managed service.
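A rough break-even sketch comparing self-host to per-user managed pricing. The infra figure comes from the estimate above; the ops-hours and hourly-rate figures are assumptions to replace with your own, not numbers from any vendor:

```python
def managed_monthly(users: int, per_user: float = 50.0) -> float:
    """Per-user managed pricing (LiteLLM Cloud Pro-style, $50/user/mo)."""
    return users * per_user

def selfhost_monthly(infra: float = 30.0, ops_hours: float = 4.0,
                     hourly_rate: float = 55.0) -> float:
    """VPS cost plus an operational-tax estimate.

    ops_hours and hourly_rate are illustrative assumptions; plug in your
    own before drawing conclusions.
    """
    return infra + ops_hours * hourly_rate

# With these (illustrative) ops numbers the lines cross near 5 users:
assert selfhost_monthly() == 250.0
assert managed_monthly(5) == 250.0
assert managed_monthly(6) > selfhost_monthly()
```

The crossover moves with your ops cost: if running the proxy is nearly free labor for you, self-host wins at any team size; if ops time is expensive, managed pricing stays competitive longer.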

Bifrost vs LiteLLM: which OSS proxy should I pick?

Bifrost is out of our catalog but worth knowing. It is built in Go with around 11 microseconds of gateway overhead at 5,000 requests per second under sustained traffic. LiteLLM is Python-based; Python overhead can add hundreds of microseconds at high concurrency. For latency-critical production workloads at scale, Bifrost performs better. For Python-stack teams that want OpenAI-SDK compatibility and 100-plus providers, LiteLLM is the friction-free path. From our catalog, LiteLLM is the OSS proxy pick.

Can I use OpenRouter Free indefinitely?

No. OpenRouter Free is a one-dollar starter credit that grants full access to all 300-plus models for evaluation; once the credit is depleted, you must add credits via Pay-as-you-go. Pay-as-you-go bills consumed tokens at the underlying provider price with no inference markup; a small Stripe processing fee applies on credit purchases. For ongoing free use, Cloudflare AI Gateway Free or LiteLLM OSS self-host are better fits.

When should I run my own gateway versus pay for managed?

Run your own LiteLLM or Portkey OSS gateway when you have Python or Node.js operational maturity, when you process meaningful token volume making per-user pricing expensive, or when self-host control matters for compliance or cost. Pay for managed (Cloudflare paid, Vercel Pro, OpenRouter Pro, Portkey Production) when DevOps capacity is the bottleneck, when you want zero operational tax, or when team-collaboration features matter more than license posture.

EU data residency: which free picks store gateway data in the EU?

LiteLLM self-host gives you full control of where data lives, and so does Portkey self-host. Cloudflare AI Gateway runs from EU PoPs by default. Vercel AI Gateway routes through Vercel's multi-region infrastructure. OpenRouter routes via the cheapest provider, which defaults to the US. For EU-resident free use, LiteLLM self-host, Portkey self-host, Cloudflare, and Vercel all qualify; OpenRouter requires explicit provider routing for EU residency.

How often is this guide updated?

We re-review pricing and feature changes at least annually, with mid-year refreshes when major vendor announcements happen. Cloudflare AI Gateway's free 10K/day tier has been stable, as has OpenRouter's starter credit. Vercel AI Gateway graduated from beta to GA in 2024. Portkey open-sourced under Apache 2.0 in March 2026. LiteLLM crossed 100 providers in 2024. The lastReviewed date reflects the most recent editorial pass.

Subrupt Editorial

The team behind subrupt.com. We track subscriptions, surface cheaper alternatives, and publish buying guides where the score formula is on the page so you can recompute it yourself. We do not claim 30,000 hours of testing. What we claim is live pricing from our database, a transparent composite score, and honest savings math against a category baseline.

Last reviewed

Citations

Affiliate disclosure: Subrupt earns a commission when you switch to a service through our recommendation links. This never changes the price you pay. We only recommend services where there's a real cost or feature advantage for you, and our picks are based on the data on this page, not on which programs pay the most.

Related buying guides

Track your subscriptions on Subrupt

Add the Free LLM Gateway you pay for and see how much you'd save by switching.

Open dashboard

More buying guides

Independent rankings for the subscriptions worth paying for.

See all guides