Quickstart

Get your first ATXP-powered request working in minutes.

Prerequisites

You need an ATXP connection string, which you can get from your ATXP account. It looks like this:
https://accounts.atxp.ai?connection_token=<token>&account_id=<id>
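
The examples below read the connection string from an ATXP_CONNECTION environment variable (the same name the curl and SDK examples use). A minimal TypeScript sketch of loading and checking it before building a client:

// ATXP_CONNECTION matches the variable used in the examples below
const connection = process.env.ATXP_CONNECTION;
if (!connection) {
  throw new Error('Set ATXP_CONNECTION to your ATXP connection string');
}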

Option 1: LLM inference

Make a request to the LLM Gateway using any OpenAI-compatible client:
curl -X POST "https://llm.atxp.ai/v1/chat/completions" \
  -H "Authorization: Bearer $ATXP_CONNECTION" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4.1",
    "messages": [{"role": "user", "content": "What is 2+2?"}]
  }'
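
If you prefer an SDK over raw HTTP, any OpenAI-compatible client works. A minimal TypeScript sketch using the official openai npm package, assuming the gateway accepts your connection string as the API key (as the Bearer header above suggests):

import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.ATXP_CONNECTION,
  baseURL: 'https://llm.atxp.ai/v1',
});

const completion = await openai.chat.completions.create({
  model: 'gpt-4.1',
  messages: [{ role: 'user', content: 'What is 2+2?' }],
});

console.log(completion.choices[0].message.content);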

Option 2: Claude Code with ATXP tools

In Claude Code, add the ATXP plugin marketplace and run setup with your connection string:
/plugin marketplace add atxp-dev/claude
/setup <your-atxp-connection-string>
Then use the tools naturally, for example:
Generate an image of a sunset over mountains
Search the web for the latest AI news

Option 3: Direct MCP connection

Connect to ATXP MCP servers directly using the @atxp/client SDK:
import { atxpClient, ATXPAccount } from '@atxp/client';

// Connect to the ATXP search MCP server using your connection string
const client = await atxpClient({
  mcpServer: 'https://search.mcp.atxp.ai/',
  account: new ATXPAccount(process.env.ATXP_CONNECTION),
});

// Call the search tool and print the first text result
const result = await client.callTool({
  name: 'atxp_search',
  arguments: { query: 'latest AI news' },
});

console.log(result.content[0].text);
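
The same pattern applies to other ATXP MCP servers: construct a client for the server URL, then call the tool by name. A small sketch of a reusable wrapper built only from the calls shown above (callAtxpTool is a hypothetical helper name):

import { atxpClient, ATXPAccount } from '@atxp/client';

// Connect to an ATXP MCP server and call a single tool by name
async function callAtxpTool(
  serverUrl: string,
  toolName: string,
  args: Record<string, unknown>,
) {
  const client = await atxpClient({
    mcpServer: serverUrl,
    account: new ATXPAccount(process.env.ATXP_CONNECTION),
  });
  return client.callTool({ name: toolName, arguments: args });
}

// Equivalent to the search example above
const searchResult = await callAtxpTool('https://search.mcp.atxp.ai/', 'atxp_search', {
  query: 'latest AI news',
});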

Next steps