
General

**What is Samurai AI?**

Samurai AI is a unified AI API gateway that gives you access to 400+ large language models from OpenAI, Anthropic, Google, Meta, and other providers through a single API. Think of it as your "super-API" for AI: instead of integrating with each provider separately, you access all their models through one OpenAI-compatible interface at 50% of the original provider price.
**Which models are available?**

We currently support 400+ models, including:
  • OpenAI: gpt-4.1, gpt-4o, gpt-4o-mini, gpt-3.5-turbo, o1, o3, o4-mini
  • Anthropic: claude-opus-4-5, claude-sonnet-4, claude-3-5-sonnet, claude-3-7-sonnet
  • Google: gemini-2.5-pro, gemini-2.0-flash, gemini-1.5-pro
  • Meta: llama-3.3-70b, llama-3.1-8b, llama-3.2-vision
  • Plus hundreds more
New models are added as they become available from providers. Check our Models page for the full current list.
**Can I switch between models?**

Yes! Switching models is as simple as changing the model parameter in your API call; no other code changes are needed.
# Switch models instantly — only the model ID changes
response = client.chat.completions.create(
    model="gpt-4.1",  # Change this to any supported model ID
    messages=[{"role": "user", "content": "Hello!"}]
)
**Do I need separate accounts with each provider?**

No. You only need one Samurai AI account to access all 400+ models. We handle the complexity of managing multiple provider relationships so you don't have to.

Pricing & Billing

**How does pricing work?**

You pay for actual token usage, just as with individual providers, but at 50% of the provider's price. For example:
  • gpt-4o: $1.25/1M tokens (vs $2.50/1M direct from OpenAI)
  • claude-3-5-sonnet: $1.50/1M tokens (vs $3.00/1M from Anthropic)
  • gemini-2.5-pro: Half the Google price
See our Pricing page for the full breakdown.
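As a quick sketch of the arithmetic (using the gpt-4o prices above; the token count is illustrative):

```python
def cost_usd(tokens: int, price_per_million: float) -> float:
    """Cost in USD for a given token count at a per-1M-token rate."""
    return tokens / 1_000_000 * price_per_million

# 2M gpt-4o tokens: Samurai AI at $1.25/1M vs. $2.50/1M direct
samurai = cost_usd(2_000_000, 1.25)
direct = cost_usd(2_000_000, 2.50)
print(f"Samurai: ${samurai:.2f}, direct: ${direct:.2f}, saved: ${direct - samurai:.2f}")
# Samurai: $2.50, direct: $5.00, saved: $2.50
```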
**Is it always cheaper than going direct?**

Always: every model is priced at 50% of what you'd pay the provider directly. On top of that:
  1. No integration complexity — one API instead of managing many
  2. Unified billing — one invoice for all your AI usage
  3. Saved engineering time — no managing multiple keys, SDKs, or rate limits
**Are models grouped into pricing tiers?**

Yes. Models are organized into tiers based on provider pricing:
  • Free: Low-cost models (provider output price < $5/1M tokens)
  • Starter: Mid-range models ($5–20/1M tokens)
  • Pro: Premium models (> $20/1M tokens)
See our Pricing page for the full model tier breakdown.

Technical Integration

**Is Samurai AI compatible with the OpenAI SDK?**

Yes, fully. Samurai AI is 100% OpenAI-compatible, so migration is just two lines:
# Before (OpenAI)
client = OpenAI(api_key="sk-openai-...")

# After (Samurai AI) — only these two lines change
client = OpenAI(
    base_url="https://www.samuraiapi.in/v1",
    api_key="sk-samurai-YOUR_KEY"
)
All your existing code — SDKs, streaming, function calling, vision — continues to work unchanged.
**Is streaming supported?**

Yes, streaming is fully supported and works exactly as it does with OpenAI:
stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")
See our Streaming guide for details.
**Is function calling supported?**

Yes. Function calling is fully supported and works identically to the OpenAI API. See our Function Calling guide for examples.
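A minimal tool-calling sketch in the standard OpenAI tools format (the get_weather tool and its schema are illustrative, not part of Samurai AI; the API call only runs when a key is configured):

```python
import json
import os

# Illustrative tool definition in the standard OpenAI "tools" schema
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

if os.environ.get("SAMURAI_API_KEY"):  # only call the API when a key is set
    from openai import OpenAI
    client = OpenAI(base_url="https://www.samuraiapi.in/v1",
                    api_key=os.environ["SAMURAI_API_KEY"])
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
        tools=tools,
    )
    tool_call = resp.choices[0].message.tool_calls[0]
    print(tool_call.function.name, json.loads(tool_call.function.arguments))
```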
**Is vision supported?**

Yes. Vision is fully supported on vision-capable models (GPT-4o, Claude, Gemini). See our Vision guide for details.
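Vision requests use the standard OpenAI multimodal message format; a sketch (the image URL is a placeholder, and the call only runs when a key is configured):

```python
import os

# Multimodal message: a text part plus an image_url part
messages = [{
    "role": "user",
    "content": [
        {"type": "text", "text": "Describe this image."},
        {"type": "image_url",
         "image_url": {"url": "https://example.com/photo.jpg"}},  # placeholder URL
    ],
}]

if os.environ.get("SAMURAI_API_KEY"):  # only call the API when a key is set
    from openai import OpenAI
    client = OpenAI(base_url="https://www.samuraiapi.in/v1",
                    api_key=os.environ["SAMURAI_API_KEY"])
    resp = client.chat.completions.create(model="gpt-4o", messages=messages)
    print(resp.choices[0].message.content)
```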
**Are embeddings supported?**

Yes. Embeddings are fully supported: use text-embedding-3-small, text-embedding-3-large, and other embedding models. See our Embeddings guide.
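A minimal embeddings sketch, with a small cosine-similarity helper for comparing the returned vectors (the API call only runs when a key is configured):

```python
import math
import os

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

if os.environ.get("SAMURAI_API_KEY"):  # only call the API when a key is set
    from openai import OpenAI
    client = OpenAI(base_url="https://www.samuraiapi.in/v1",
                    api_key=os.environ["SAMURAI_API_KEY"])
    resp = client.embeddings.create(
        model="text-embedding-3-small",
        input=["a unified AI gateway", "one API for many models"],
    )
    v1, v2 = (item.embedding for item in resp.data)
    print(f"similarity: {cosine(v1, v2):.3f}")
```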
**What are the rate limits?**

Rate limits vary by plan. See our Rate Limits page for current limits per tier.
**How does error handling work?**

Samurai AI returns standard OpenAI-compatible error responses. See our Error Reference for the full list of error codes and how to handle them.
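Rate-limit (429) responses are the most common case worth handling; below is a minimal exponential-backoff sketch. `RateLimited` is a stand-in exception so the example is self-contained; with the real SDK you would pass `openai.RateLimitError` as `retry_on`:

```python
import time

class RateLimited(Exception):
    """Stand-in for openai.RateLimitError in this self-contained sketch."""

def with_retries(fn, retry_on=RateLimited, max_attempts=3, base_delay=0.01):
    """Call fn, retrying on retry_on with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except retry_on:
            if attempt == max_attempts - 1:
                raise  # give up after the last attempt
            time.sleep(base_delay * 2 ** attempt)  # 1x, 2x, 4x, ...

# Usage sketch with the real SDK:
# with_retries(lambda: client.chat.completions.create(...),
#              retry_on=RateLimitError)
```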

Compatible Libraries

**Which libraries are compatible?**

Any library that supports a custom base_url works with Samurai AI:

| Library | Language | How to Configure |
| --- | --- | --- |
| openai | Python / Node.js | `base_url` / `baseURL` |
| langchain | Python / JS | `openai_api_base` |
| llama_index | Python | `api_base` |
| @ai-sdk/openai | TypeScript | `baseURL` in `createOpenAI()` |
| litellm | Python | `api_base` param |
No special SDK needed.
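For example, with litellm (parameter names follow litellm's OpenAI-compatible interface; this is a sketch, not an officially tested integration, and the call only runs when a key is configured):

```python
import os

# The base URL is the only Samurai-specific piece of configuration
SAMURAI_BASE_URL = "https://www.samuraiapi.in/v1"

if os.environ.get("SAMURAI_API_KEY"):  # only call the API when a key is set
    import litellm
    resp = litellm.completion(
        model="openai/gpt-4o",  # "openai/" prefix selects the OpenAI-compatible route
        messages=[{"role": "user", "content": "Hello!"}],
        api_base=SAMURAI_BASE_URL,
        api_key=os.environ["SAMURAI_API_KEY"],
    )
    print(resp.choices[0].message.content)
```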
**Can I use Samurai AI with other OpenAI-compatible tools?**

Yes. Any tool that lets you configure a custom OpenAI-compatible base URL works with Samurai AI: set the base URL to https://www.samuraiapi.in/v1 and use your Samurai AI API key.

Use Cases

**Who is Samurai AI for?**

  • Developers: Experiment with 400+ models without rewriting code. Build faster with one API key and one billing relationship.
  • Businesses: Cut AI costs by 50% immediately. Ensure high availability for customer-facing AI features.
  • Researchers: Access every cutting-edge model as it launches. Benchmark across providers from one endpoint without managing multiple accounts.
**Which model should I use?**

Check our Models page for full guidance. General recommendations:
  • Fast responses: gpt-4o-mini, gemini-2.0-flash
  • Complex reasoning: gpt-4.1, claude-opus-4-5
  • Code generation: gpt-4.1, claude-3-7-sonnet
  • Creative writing: claude-opus-4-5
  • Cost-effective: gpt-4o-mini, gemini-2.0-flash
  • Large context: gemini-2.5-pro (1M+ tokens)
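The recommendations above can be captured as a tiny lookup helper (the task keywords and default are our own illustrative choices, not an official mapping):

```python
# Illustrative task-to-model mapping based on the recommendations above
RECOMMENDED = {
    "fast": "gpt-4o-mini",
    "reasoning": "claude-opus-4-5",
    "code": "gpt-4.1",
    "creative": "claude-opus-4-5",
    "large_context": "gemini-2.5-pro",
}

def pick_model(task: str, default: str = "gpt-4o-mini") -> str:
    """Return a recommended model ID for a task keyword."""
    return RECOMMENDED.get(task, default)

print(pick_model("code"))     # gpt-4.1
print(pick_model("unknown"))  # falls back to gpt-4o-mini
```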
**Is Samurai AI production-ready?**

Yes. Samurai AI is built for production use, with high availability, comprehensive logging, and enterprise-grade infrastructure.

Support

**How do I get started?**

  1. Sign up for a Samurai AI account
  2. Get your API key from the dashboard
  3. Follow our Quick Start guide to make your first request in minutes
**How long does migration take?**

Typically under 5 minutes. Change two lines of code (base URL and API key) and you're done; your existing SDK, code, and prompts continue to work unchanged.
Still have questions? Reach out at support@samuraiapi.in or join our Discord.