Cost Calculator
Enter input and output token counts to compare the total per-request cost across models.
Cheapest: OpenAI / GPT-5 nano — $0.000420
Saves $0.779580 per request vs the most expensive model (o1-pro, $0.780000).
Estimated savings: $7795.80 / month (at 10,000 requests/month)
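The arithmetic behind the summary above can be sketched in a few lines: per-request cost is input and output tokens times their per-million-token rates, and monthly savings is the per-request delta times request volume. The model names and rates below are hypothetical placeholders, not the live prices behind the table.

```python
# Illustrative (input $/1M tokens, output $/1M tokens) rates -- placeholders,
# not the actual prices used by the calculator.
PRICES = {
    "model-a": (0.05, 0.40),
    "model-b": (2.50, 10.00),
}

def request_cost(model, input_tokens, output_tokens):
    """Total cost in dollars for one request."""
    in_rate, out_rate = PRICES[model]
    return input_tokens / 1e6 * in_rate + output_tokens / 1e6 * out_rate

def monthly_savings(cheap, expensive, input_tokens, output_tokens, requests_per_month):
    """Savings from using `cheap` instead of `expensive` at a given volume."""
    delta = (request_cost(expensive, input_tokens, output_tokens)
             - request_cost(cheap, input_tokens, output_tokens))
    return delta * requests_per_month
```

At 1,000 input and 1,000 output tokens, `request_cost("model-a", 1000, 1000)` is $0.00045; the monthly figure in the summary is just that kind of per-request delta multiplied by 10,000 requests.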
| Rank | Provider | Model | Total Cost ($) |
|---|---|---|---|
| 1 | OpenAI | GPT-5 nano | 0.000420 |
| 2 | StepFun | Step 3.5 Flash | 0.000445 |
| 3 | Google | Gemini 2.0 Flash | 0.000520 |
| 4 | Google | Gemini 2.5 Flash-Lite | 0.000520 |
| 5 | OpenAI | GPT-4.1 nano | 0.000520 |
| 6 | Baidu | ERNIE 4.5 Turbo | 0.000600 |
| 7 | OpenAI | GPT-4o mini | 0.000780 |
| 8 | Upstage | Solar Pro 3 | 0.000780 |
| 9 | xAI | Grok 4 Fast | 0.000800 |
| 10 | xAI | Grok 4 Fast Reasoning | 0.000800 |
| 11 | xAI | Grok 4.1 Fast | 0.000800 |
| 12 | xAI | Grok 4.1 Fast Reasoning | 0.000800 |
| 13 | Meta | Llama 4 Scout | 0.000868 |
| 14 | DeepSeek | DeepSeek Chat V3.2 | 0.000896 |
| 15 | DeepSeek | DeepSeek Reasoner V3.2 | 0.000896 |
| 16 | DeepSeek | DeepSeek V3 | 0.000896 |
| 17 | DeepSeek | DeepSeek V3.1 | 0.000896 |
| 18 | DeepSeek | DeepSeek V3.1 Terminus | 0.000896 |
| 19 | DeepSeek | DeepSeek V3.2 Exp | 0.000896 |
| 20 | DeepSeek | R1 | 0.000896 |
| 21 | xAI | Grok 3 mini | 0.001000 |
| 22 | ByteDance | Seed 2.0 Mini | 0.001040 |
| 23 | MiniMax | MiniMax-01 | 0.001218 |
| 24 | Meta | Llama 4 Maverick | 0.001220 |
| 25 | Z.AI | GLM 4.5 Air | 0.001280 |
| 26 | OpenAI | GPT-5.4 nano | 0.001400 |
| 27 | Baidu | ERNIE 4.5 VL 28B A3B | 0.001450 |
| 28 | Anthropic | Claude Haiku 3 | 0.001500 |
| 29 | MiniMax | MiniMax M2 | 0.001582 |
| 30 | MiniMax | MiniMax M2.1 | 0.001582 |
| 31 | MiniMax | MiniMax M2.5 | 0.001582 |
| 32 | xAI | Grok Code Fast 1 | 0.001600 |
| 33 | Alibaba | Qwen 3.5 Flash | 0.001720 |
| 34 | Baidu | ERNIE 4.5 Turbo VL | 0.001914 |
| 35 | ByteDance | Skylark Embedding Vision | 0.002080 |
| 36 | OpenAI | GPT-4.1 mini | 0.002080 |
| 37 | OpenAI | GPT-5 mini | 0.002100 |
| 38 | OpenAI | GPT-3.5 Turbo | 0.002200 |
| 39 | Google | Gemini 2.5 Flash | 0.002600 |
| 40 | Z.AI | GLM 4.5V | 0.002640 |
| 41 | Perplexity | Sonar | 0.002800 |
| 42 | Z.AI | GLM 4.5 | 0.002960 |
| 43 | Z.AI | GLM 4.6 | 0.002960 |
| 44 | OpenAI | GPT Audio Mini | 0.003120 |
| 45 | MiniMax | MiniMax M2.1 HighSpeed | 0.003167 |
| 46 | MiniMax | MiniMax M2.5 HighSpeed | 0.003167 |
| 47 | Moonshot AI | Kimi K2 | 0.003200 |
| 48 | MiniMax | MiniMax M1 | 0.003479 |
| 49 | Alibaba | Qwen 3.5 Plus | 0.003898 |
| 50 | ByteDance | Seed 2.0 Lite | 0.004200 |
| 51 | OpenAI | GPT-3.5 Turbo Instruct | 0.004600 |
| 52 | Anthropic | Claude Haiku 3.5 | 0.004800 |
| 53 | OpenAI | GPT-5.4 mini | 0.005100 |
| 54 | Alibaba | Qwen 3 Max | 0.005219 |
| 55 | OpenAI | o4 mini | 0.005720 |
| 56 | Anthropic | Claude Haiku 4.5 | 0.006000 |
| 57 | OpenAI | GPT-4.1 | 0.010400 |
| 58 | Perplexity | Sonar Deep Research | 0.010400 |
| 59 | Perplexity | Sonar Reasoning Pro | 0.010400 |
| 60 | Google | Gemini 2.5 Pro | 0.010500 |
| 61 | OpenAI | GPT-5 Codex | 0.010500 |
| 62 | OpenAI | GPT-5 Image Mini | 0.011400 |
| 63 | OpenAI | GPT Audio | 0.013000 |
| 64 | OpenAI | GPT-4o | 0.013000 |
| 65 | OpenAI | GPT-5.2 | 0.014700 |
| 66 | OpenAI | GPT-5.4 | 0.017000 |
| 67 | Anthropic | Claude 3.5 Sonnet | 0.018000 |
| 68 | Anthropic | Claude Sonnet 3.7 | 0.018000 |
| 69 | Anthropic | Claude Sonnet 4 | 0.018000 |
| 70 | Anthropic | Claude Sonnet 4.5 | 0.018000 |
| 71 | Anthropic | Claude Sonnet 4.6 | 0.018000 |
| 72 | Perplexity | Sonar Pro | 0.018000 |
| 73 | Perplexity | Sonar Pro Search | 0.018000 |
| 74 | xAI | Grok 3 | 0.018000 |
| 75 | Anthropic | Claude Opus 4.5 | 0.030000 |
| 76 | Anthropic | Claude Opus 4.6 | 0.030000 |
| 77 | OpenAI | GPT-5 Image | 0.041600 |
| 78 | OpenAI | GPT-4 Turbo | 0.044000 |
| 79 | Anthropic | Claude Opus 3 | 0.090000 |
| 80 | Anthropic | Claude Opus 4 | 0.090000 |
| 81 | Anthropic | Claude Opus 4.1 | 0.090000 |
| 82 | OpenAI | GPT-4 | 0.108000 |
| 83 | OpenAI | GPT-5.2 pro | 0.176400 |
| 84 | OpenAI | GPT-5.4 pro | 0.204000 |
| 85 | OpenAI | o1-pro | 0.780000 |