For every developer, every spark of genius, rendered real.

A multimodal AI model API aggregation platform built for developers. One API gives you access to top models from around the world.

One API, total mastery of every top-tier model.

Stop juggling keys, SDKs, and provider-specific JSON. Atlas Cloud aggregates 300+ models — LLM, image, video, and audio — behind one OpenAI-compatible endpoint. We pull directly from official sources and verified cloud hubs, so the result is the real model, not a filtered clone. Swap the model string; the rest of your code stays identical.

BUILD WITH ATLAS CLOUD

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.ATLASCLOUD_API_KEY,
  baseURL: "https://api.atlascloud.ai/v1"
});

const model = "moonshotai/kimi-k2.6";
const prompt = "Summarise this PDF in 3 bullets.";

const resp = await client.chat.completions.create({
  model,
  messages: [{ role: "user", content: prompt }]
});
console.log(resp.choices[0].message.content);


How it works

Our platform already hosts 300+ models ready to run in production. You can call any of them with one line of code.

Plug into any MCP client

Drop one JSON block into Cursor, Claude Code, Claude Desktop, VS Code, Windsurf, Zed, JetBrains, Codex CLI, Gemini CLI, Goose or any other MCP-compatible client. No provider glue code.

Add this to your client's mcp.json:
{
  "mcpServers": {
    "atlascloud": {
      "command": "npx",
      "args": ["-y", "atlascloud-mcp"],
      "env": {
        "ATLASCLOUD_API_KEY": "your-api-key-here"
      }
    }
  }
}

Then ask your agent to…

Once the atlascloud MCP server is wired up, your agent can call any of Atlas Cloud's 300+ models from plain English. Mention Atlas Cloud by name so the agent routes through the MCP tool.

Chat with an LLM

Use the Atlas Cloud MCP server to ask DeepSeek V3.2 to summarise this PDF in three bullet points.

Generate an image

Use Atlas Cloud to generate an image with Seedream v5.0 — a cyberpunk street market at rainy dusk, 1024x1024.

Generate a video

Call the Atlas Cloud MCP tool and create a 10s cinematic shot of a rocket launch at dawn with Seedance 2.0 at 1080p.

Edit local media

Via the Atlas Cloud MCP server, edit ~/photos/cat.jpg with Nano Banana 2 — add a wizard hat, keep composition identical.

How to build on Atlas Cloud

Get running in minutes — follow the steps below to go from a fresh account to a production integration.

Create your Atlas Cloud account

Sign up at atlascloud.ai and verify your email to start exploring every model on the platform.

Use with coding tools

Cursor
TRAE
droid
Roo Code
Codex CLI
Gemini CLI
Kilo Code
Cline
Claude
opencode
OpenClaw

Frequently asked questions

Everything you need to know before writing your first line of code.

Do I need a new SDK to use Atlas Cloud?

No. The chat endpoint is OpenAI-compatible — point the OpenAI SDK (or any HTTP client) at api.atlascloud.ai/v1 and swap the model string. Streaming, tool use and function calling all work unchanged.
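Because function calling uses the standard OpenAI tools schema, the request body is assembled exactly as it would be for OpenAI itself. A minimal sketch; the `get_weather` tool below is a hypothetical example for illustration, not an Atlas Cloud API:

```javascript
// Standard OpenAI function-calling schema; nothing Atlas-specific is needed.
// `get_weather` is a made-up tool for illustration.
const tools = [
  {
    type: "function",
    function: {
      name: "get_weather",
      description: "Get the current weather for a city",
      parameters: {
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    },
  },
];

// Attach the tools to an ordinary Chat Completions request body.
function withTools(request, tools) {
  return { ...request, tools, tool_choice: "auto" };
}

const request = withTools(
  {
    model: "moonshotai/kimi-k2.6",
    messages: [{ role: "user", content: "What's the weather in Tokyo?" }],
  },
  tools,
);
// Pass `request` to client.chat.completions.create(request) as usual.
```

The same body works for streaming: add `stream: true` and iterate the response chunks exactly as you would with the OpenAI SDK.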

How do image and video generation work?

Chat is synchronous. Image and video models run as async predictions: you POST to the submit endpoint and get a prediction id back, then GET the prediction endpoint with that id until the status is succeeded. Poll roughly every 2 seconds — no webhooks required.
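The submit-then-poll flow described above can be sketched as a small settle loop. The status values and the ~2 s interval come from the description; the exact prediction route in `checkPrediction` is an assumption, so take the real path from the model's docs:

```javascript
// Keep checking until the prediction settles: return on "succeeded",
// fail fast on "failed", otherwise wait ~2s and poll again.
async function pollUntilDone(check, intervalMs = 2000) {
  for (;;) {
    const pred = await check();
    if (pred.status === "succeeded") return pred;
    if (pred.status === "failed") throw new Error("prediction failed");
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}

// Usage sketch. The /v1/predictions/{id} route is illustrative;
// `id` is the prediction id returned by the submit POST.
function checkPrediction(id) {
  return () =>
    fetch(`https://api.atlascloud.ai/v1/predictions/${id}`, {
      headers: { Authorization: `Bearer ${process.env.ATLASCLOUD_API_KEY}` },
    }).then((res) => res.json());
}

// const prediction = await pollUntilDone(checkPrediction(id));
```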

Which models are available?

300+ models across LLM, image, video and audio — DeepSeek, Qwen, Kimi, GLM, Seedance, Seedream, Nano Banana and more. Browse the full catalogue at /models; the model id you copy there is the exact string to pass in the API call.

How do pricing and rate limits work?

You pay per token or per prediction depending on modality — pricing is shown on each model card. Default rate limits are generous and enough for most production workloads. If you need more, email [email protected] and we'll raise the cap for you.

Does Atlas Cloud work with AI coding agents?

Yes — one MCP config plugs Atlas Cloud into every major MCP-compatible client (Cursor, Windsurf, VS Code, Claude Desktop, Claude Code, Zed, JetBrains, Codex CLI, Gemini CLI, Goose and more). The agent can then call any Atlas Cloud model from plain English. A one-line Skills install works too.

Where can I get help?

Check docs.atlascloud.ai for reference and guides, or open a ticket from the console. For MCP and Skills issues, the AtlasCloudAI/mcp-server and AtlasCloudAI/atlas-cloud-skills repos on GitHub accept issues and PRs.

Join our Discord community

Join the Discord community for the latest model updates, prompts, and support.