Quickstart

Your first Melious call in five minutes

Get a real response back from Melious in five minutes, using the OpenAI SDK you already know.

Get an API key

Sign in at melious.ai, then go to Account → API keys and click Create API key. Copy it when it appears.

Your key is shown only once. If you lose it, rotate it; there's no recovery option.

Set it as an environment variable:

export MELIOUS_API_KEY=sk-mel-<YOUR_API_KEY>

Install the OpenAI SDK

Melious accepts the OpenAI SDK as-is. You don't need a Melious-specific client.

pip install openai
npm install openai

curl, fetch, or any other HTTP client also works. Skip ahead to Make the call.

Make the call

Python:

import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["MELIOUS_API_KEY"],
    base_url="https://api.melious.ai/v1",
)

response = client.chat.completions.create(
    model="glm-4.7",
    messages=[{"role": "user", "content": "Name three Hanseatic cities."}],
)
print(response.choices[0].message.content)

JavaScript:

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.MELIOUS_API_KEY,
  baseURL: "https://api.melious.ai/v1",
});

const response = await client.chat.completions.create({
  model: "glm-4.7",
  messages: [{ role: "user", content: "Name three Hanseatic cities." }],
});
console.log(response.choices[0].message.content);

curl:

curl https://api.melious.ai/v1/chat/completions \
  -H "Authorization: Bearer $MELIOUS_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-4.7",
    "messages": [{"role": "user", "content": "Name three Hanseatic cities."}]
  }'

glm-4.7 is a good general-purpose default. Browse the full catalog at melious.ai/hub or hit GET /v1/models for a programmatic list.
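The /v1/models list comes back in the standard OpenAI list shape, so picking out model IDs is a one-liner. A minimal sketch, assuming that shape — the sample payload below is illustrative, not real Melious output:

```python
# Sketch: pull model IDs out of a GET /v1/models response.
# Assumes the standard OpenAI list shape:
#   {"object": "list", "data": [{"id": ...}, ...]}
# The sample payload is illustrative, not live Melious data.
sample = {
    "object": "list",
    "data": [
        {"id": "glm-4.7", "object": "model"},
        {"id": "glm-4.7:eco", "object": "model"},
    ],
}

model_ids = [m["id"] for m in sample["data"]]
print(model_ids)
```

With the SDK, `client.models.list()` against the same base_url returns the live version of this list.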

Read the whole response

The response is OpenAI-shaped, plus two fields that only Melious returns:

{
  "id": "chatcmpl-...",
  "model": "glm-4.7",
  "choices": [
    {
      "index": 0,
      "message": { "role": "assistant", "content": "Hamburg, Lübeck, Bremen." },
      "finish_reason": "stop"
    }
  ],
  "usage": { "prompt_tokens": 12, "completion_tokens": 7, "total_tokens": 19 },
  "environment_impact": {
    "energy_kwh": 0.00015,
    "carbon_g_co2": 0.06,
    "water_liters": 0.0002,
    "renewable_percent": 85,
    "pue": 1.18,
    "provider_id": "ovhcloud",
    "location": "FR"
  },
  "billing_cost": { "energy": "0.0008", "credits": "0.0", "paid_with": "energy" }
}
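If you want to log the sustainability numbers, they're plain fields on the response JSON. A minimal sketch using the sample payload above (the dict mirrors that sample; note that typed SDK response objects may not surface non-OpenAI fields directly, so depending on your SDK version you may need to read the raw JSON — that's an assumption, check your client):

```python
# Sketch: read Melious's extra fields from a parsed response.
# This dict mirrors the sample response shown above; values are
# the illustrative ones from this page, not live output.
response = {
    "usage": {"prompt_tokens": 12, "completion_tokens": 7, "total_tokens": 19},
    "environment_impact": {
        "energy_kwh": 0.00015,
        "carbon_g_co2": 0.06,
        "water_liters": 0.0002,
        "renewable_percent": 85,
        "pue": 1.18,
        "provider_id": "ovhcloud",
        "location": "FR",
    },
    "billing_cost": {"energy": "0.0008", "credits": "0.0", "paid_with": "energy"},
}

impact = response["environment_impact"]
print(
    f"{impact['carbon_g_co2']} gCO2, {impact['water_liters']} L water "
    f"({impact['renewable_percent']}% renewable, PUE {impact['pue']}, "
    f"{impact['location']})"
)
```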

That's your first Melious call. If the response reads something like "Hamburg, Lübeck, Bremen", routing is working. The environment_impact block is what makes us different: carbon, water, renewables, PUE, and location on every response.

Where next

Read Routing once: it tells you how to bias toward speed, price, or lower carbon with a single suffix like glm-4.7:eco. That's the knob most people reach for first.
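Because the suffix rides on the model name, switching routing is a one-string change to the request. A minimal sketch — glm-4.7:eco is the suffix named above; the helper function is illustrative, not part of any SDK:

```python
# Sketch: eco routing is just a different model string.
# ":eco" is the suffix named in this guide; see Routing for others.
# with_routing is a hypothetical helper, not an SDK function.
def with_routing(model: str, suffix: str) -> str:
    """Append a Melious routing suffix to a model name."""
    return f"{model}:{suffix}"

payload = {
    "model": with_routing("glm-4.7", "eco"),
    "messages": [{"role": "user", "content": "Name three Hanseatic cities."}],
}
print(payload["model"])  # glm-4.7:eco
```

Everything else about the request — endpoint, headers, message format — stays exactly as in step 3.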

Pick your path:

  • Migrating from OpenAI? From OpenAI covers the model-name swap.
  • Using Claude Code or the Anthropic SDK? From Anthropic gets you running with two environment variables.
  • Want the full endpoint list? Reference has everything.
  • Prefer a shell? CLI is a different front door to the same platform.