Open Source · Rust-Native

Every AI provider.
One place to configure,
route, and trust.

Valymux routes your requests across providers, tells you exactly what each model supports, and keeps your credentials isolated from your application code.

Already building with AI? Share your experience →

Rust-Native
Auditable
Self-Hostable
API Call
SDK
cURL
V
OpenAI
Anthropic
Gemini
Mistral
One Gateway · Any Provider
The Problem

Chaos under the hood.

Every provider has its own formats, its own auth, its own quirks. Your team writes glue code instead of building product.

Different model names across providers
Which parameters does this model support?
Secret sprawl across teams
Provider lock-in risk
No unified logging
Provider docs change without warning
Different streaming formats
Cost blind spots
8+ integration headaches per provider

What if every provider just worked the same way?

One interface. Route, translate, and observe — across all of them.

The Solution

One stable layer.

OpenAI
/v1/chat/completions · messages[]
Anthropic
/v1/messages · content[]
Gemini
/v1/generateContent · parts[]
Mistral
/v1/chat/completions · messages[]
4 different APIs
V
Valymux · UNIFIED
POST /v1/chat/completions
model: "primary-model"
messages: [{ role, content }]
Routes to best available provider
1 API Format
All Providers

One integration. One interface. One mental model.

How It Works

Route. Translate. Observe.

Smart Routing

Requests routed to the best available provider based on config, load, and fallback rules.
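Config-driven fallback routing can be sketched as a loop over a primary provider and its fallbacks. This is an illustrative sketch only — `routeRequest`, the `route` shape, and the stubbed `call()` handlers are hypothetical, not Valymux's actual API.

```javascript
// Hypothetical sketch of primary/fallback routing. Provider entries
// and call() stubs are illustrative, not real Valymux internals.
async function routeRequest(request, route) {
  const candidates = [route.primary, ...(route.fallbacks ?? [])];
  const errors = [];
  for (const provider of candidates) {
    try {
      // A real gateway would dispatch an HTTP call here.
      return { provider: provider.id, response: await provider.call(request) };
    } catch (err) {
      errors.push({ provider: provider.id, error: String(err) });
    }
  }
  throw new Error(`All providers failed: ${JSON.stringify(errors)}`);
}

// Demo: the primary fails, the fallback answers.
const route = {
  primary:   { id: "openai/gpt-5.4", call: async () => { throw new Error("503"); } },
  fallbacks: [{ id: "anthropic/claude-sonnet-4-6", call: async () => "ok" }],
};
routeRequest({ messages: [] }, route).then((r) => console.log(r.provider));
```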

Universal Translation

One API format across all providers. No more adapting to each provider's quirks.
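The translation step can be sketched as pure functions that map one unified message list into provider-shaped payloads. Field names below mirror the public provider conventions shown earlier (`messages[]`, `parts[]`), but the functions themselves are a hypothetical illustration, not Valymux code.

```javascript
// Hypothetical sketch: normalize one unified request into
// provider-specific shapes. Field names are illustrative.
function toAnthropic(unified) {
  // Anthropic-style APIs take the system prompt as a separate field.
  const system = unified.messages.find((m) => m.role === "system")?.content;
  const messages = unified.messages
    .filter((m) => m.role !== "system")
    .map((m) => ({ role: m.role, content: m.content }));
  return { model: unified.model, system, messages };
}

function toGemini(unified) {
  // Gemini-style APIs wrap text in parts[] and call the assistant "model".
  return {
    contents: unified.messages
      .filter((m) => m.role !== "system")
      .map((m) => ({
        role: m.role === "assistant" ? "model" : m.role,
        parts: [{ text: m.content }],
      })),
  };
}

const unified = {
  model: "primary-model",
  messages: [
    { role: "system", content: "Be brief." },
    { role: "user", content: "Hi" },
  ],
};
console.log(toAnthropic(unified).system);                 // "Be brief."
console.log(toGemini(unified).contents[0].parts[0].text); // "Hi"
```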

Secure Credentials

Provider keys never leave the gateway. Virtual API keys for your team, automatic rotation.
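The virtual-key idea can be sketched as a lookup that swaps the team-facing token for the real secret only at dispatch time. Everything here — the vault, `resolveKey`, the key names — is a made-up illustration of the pattern, not the gateway's implementation.

```javascript
// Hypothetical sketch of virtual-key resolution: the app only ever
// holds vk_* tokens; real provider secrets stay inside the gateway.
const keyVault = new Map([
  ["vk_live_team_a", { provider: "openai", secret: "sk-REAL-KEY" }],
]);

function resolveKey(virtualKey) {
  const entry = keyVault.get(virtualKey);
  if (!entry) throw new Error("unknown virtual key");
  // Expose only a header-builder; never echo the secret to the client.
  return {
    provider: entry.provider,
    authorize: (headers) => ({
      ...headers,
      Authorization: `Bearer ${entry.secret}`,
    }),
  };
}

const { provider, authorize } = resolveKey("vk_live_team_a");
console.log(provider); // "openai"
```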

Full Observability

Every request traced. Latency, tokens, cost — unified across all providers in real time.
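A per-request trace record might look like the sketch below, using the same fields shown in the flow diagram. The helper and its pricing input are illustrative assumptions, not real Valymux telemetry.

```javascript
// Hypothetical sketch of a per-request trace record; the pricing
// figure is made up for illustration.
function buildTrace({ traceId, provider, startedAt, endedAt, tokens, costPerKTokens }) {
  return {
    trace_id: traceId,
    provider,
    latency_ms: endedAt - startedAt,
    tokens,
    cost_usd: +((tokens / 1000) * costPerKTokens).toFixed(4),
  };
}

const trace = buildTrace({
  traceId: "abc123",
  provider: "openai/gpt-5.4",
  startedAt: 1000,
  endedAt: 1182,
  tokens: 1200,
  costPerKTokens: 0.0025,
});
console.log(trace.latency_ms, trace.cost_usd); // 182 0.003
```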

valymux-request-flow
POST /v1/chat/completions
primary: openai/gpt-5.4 → fallback: anthropic/claude-sonnet-4-6
{ "model": "any", "messages": [...] } → normalized
vk_live_*** → resolved provider key (never exposed)
trace_id: abc123 | 182ms | 1.2k tokens | $0.003
200 OK — 182ms
Your App → Valymux → Provider
Three Pillars

Built without compromise.

Security

Provider credentials encrypted at rest. Virtual keys displayed once on creation, never stored in recoverable form. AGPL codebase — every line auditable. Self-hostable by design.

Credential Encryption · at rest
Virtual Keys · one-time display
Self-Hostable · air-gapped option

Speed

Rust-native engine with no garbage collection overhead. Concurrent streaming across providers. Designed to never be your bottleneck.

TARGET OVERHEAD · 0.4ms
TARGET P99 · 12ms
DESIGN CAPACITY · 10k+

Clarity

Every model cataloged with its exact capabilities: streaming, thinking, tools, temperature range, context window. Configure once. Copy to code. No docs tab.

claude-sonnet-4-6 · thinking · ✓ supported
gpt-5.4 · tools + vision · ✓ supported
gemini-3.1-pro · context · 1M tokens
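A capability catalog like this can be sketched as a plain lookup table. The entries below echo the examples above; the catalog object and `supports()` helper are hypothetical, and the context-window numbers are illustrative.

```javascript
// Hypothetical sketch of a model-capability catalog; entries mirror
// the examples above and are illustrative, not a real registry.
const catalog = {
  "claude-sonnet-4-6": { thinking: true, tools: true },
  "gpt-5.4":           { tools: true, vision: true },
  "gemini-3.1-pro":    { tools: true, context: 1_000_000 },
};

function supports(model, feature) {
  return Boolean(catalog[model]?.[feature]);
}

console.log(supports("claude-sonnet-4-6", "thinking")); // true
console.log(supports("gemini-3.1-pro", "thinking"));    // false
```

Because capabilities live in one place, a request that asks for an unsupported feature can be rejected at the gateway instead of failing deep inside a provider SDK.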
Developer Experience

Less glue code. More product.

Before
Multi-provider chaos
// Different client for each provider
const openai = new OpenAI({ apiKey: KEY_1 })
const anthropic = new Anthropic({ apiKey: KEY_2 })
const gemini = new Gemini({ apiKey: KEY_3 })

// Different formats everywhere
if (provider === "openai") {
  res = await openai.chat.completions.create(...)
} else if (provider === "anthropic") {
  res = await anthropic.messages.create(...)
} else if (provider === "gemini") {
  res = await gemini.generateContent(...)
}

// Different streaming, tools, errors...
// 200+ lines of glue code per provider
After
One Valymux call
// One client. Any provider.
const res = await fetch("http://valymux/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": "Bearer vk_live_***",
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    model: "primary-model",
    messages: [{ role: "user", content: "..." }]
  })
})

// That's it. Routing, failover, auth,
// streaming, tracing — all handled.

Configuration First

Swap LLMs in YAML, not in production code.

# gateway-config.yaml
providers:
  - id: primary-model
    target: openai/gpt-5.4
    fallback: anthropic/claude-sonnet-4-6

security:
  virtual_keys: true
  pii_filter: enabled
  budget_cap: $500/mo
Your App
Valymux
OpenAI
Anthropic
Gemini
Any
Open Source

Built in the open. Shaped by developers.

Valymux is open source from day one. We believe the infrastructure you trust with your API keys should be transparent, auditable, and under your control.

CLoaKY233/Valymux · 🦀 Rust · AGPL License

Transparent

Every line of code is public. Audit the gateway yourself.

Community

Feature requests, bug reports, and PRs welcome from day one.

Honest

We're early. We share what works, what doesn't, and what's next.

Secure

Rust-native, avoiding dynamic code loading where possible. Audit the binary. Host it yourself.

V
Early Access

Stop managing providers. Start building product.

MVP launching Q2 2026. Join early — your feedback shapes what gets built next.

git clone https://github.com/CLoaKY233/Valymux.git

OSS • Rust-Native • Self-Hostable