🚀 Quickstart

Welcome to Emby — the unified AI gateway that gives developers fast, predictable, EU-hosted access to the best open-source and provider models, all through fully OpenAI- and Anthropic-compatible APIs. No lock-in. No complex setup.
Just swap your API key + base URL and keep your existing workflow.
TL;DR:
Point your requests to
https://api.emby.dev/v1/...
and use your EMBY_API_KEY.

1 · Get Your API Key

Create your Emby account → copy your key → set it in your environment.
export EMBY_API_KEY="emby_XXXXXXXXXXXXXXXX"
export OPENAI_API_KEY="$EMBY_API_KEY"         # optional alias
export OPENAI_API_BASE="https://api.emby.dev/v1"
Your tools (Cursor, Claude Code, Kilo, Warp, n8n, Aider, etc.) now work out of the box.

2 · Pick Your Language

Below are simple examples using Emby’s OpenAI-compatible /chat/completions endpoint.

curl

curl -X POST https://api.emby.dev/v1/chat/completions \
  -H "Authorization: Bearer $EMBY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-v3",
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ]
  }'

3 · SDK Integrations

Emby works with any OpenAI-style client.

ai-sdk (Vercel AI SDK)

import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const emby = createOpenAI({
  apiKey: process.env.EMBY_API_KEY!,
  baseURL: "https://api.emby.dev/v1",
});

const { text } = await generateText({
  model: emby("deepseek-v3"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
});

console.log(text);

OpenAI SDK (official)

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.EMBY_API_KEY,
  baseURL: "https://api.emby.dev/v1",
});

const completion = await client.chat.completions.create({
  model: "deepseek-v3",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(completion.choices[0].message);

Vercel AI SDK (low-level)

import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const emby = createOpenAI({
  apiKey: process.env.EMBY_API_KEY!,
  baseURL: "https://api.emby.dev/v1",
});

const { text } = await generateText({
  model: emby.chat("kimi-k2-thinking"),
  messages: [{ role: "user", content: "Explain WebAssembly in simple terms." }],
});

console.log(text);

4 · Going Further

Streaming

Add "stream": true to your request body to receive the response incrementally as a server-sent event stream instead of a single JSON object.

Telemetry & Usage

Every request appears in your dashboard with latency + cost.

Unified Routing

Route requests to Emby-hosted models, Azure, Bedrock, DeepInfra, or BYOK (bring-your-own-key) providers.

Claude-Style API

Use Emby’s /v1/messages endpoint to call any model through the Anthropic message format.

5 · FAQ

Is Emby compatible with the OpenAI API?
Yes — Emby is fully OpenAI API compatible.

Can I use the Anthropic/Claude API format?
Yes — through the native Anthropic-compatible /v1/messages endpoint.

Is my data hosted in the EU?
Yes — Emby runs on ISO 27001 + NEN 7510 certified infrastructure inside the EU (bit.nl & Nebius Amsterdam).

Is pricing predictable?
Yes — a flat monthly fee per developer, with no surprise token bills.

6 · Next Steps


Need Help?

WhatsApp Support

Chat with us instantly:
https://wa.absolum.nl

Book a Call

For enterprise routing, custom GPU servers, or help migrating:
https://cal.com/absolum/30min
Happy building with Emby! ✨