
Overview

The Vercel AI SDK provides powerful tools for building AI applications with streaming responses, while ATXP offers pay-per-use access to various MCP (Model Context Protocol) tools. By combining them, you can create AI applications that can search the web, generate images, browse websites, and more - all with usage-based pricing. This guide will show you how to integrate ATXP’s MCP tools with the Vercel AI SDK for streaming responses and real-time interactions. You can find a full example of integrating ATXP’s SDK with the Vercel AI SDK in the ATXP Vercel AI SDK demo.

Prerequisites

Create an ATXP account

If you don’t have an ATXP account yet, create one and copy your ATXP connection string. It should look something like this:

https://accounts.atxp.ai?connection_token=<random_string>

If you’ve already created an ATXP account, you can visit the ATXP account dashboard to get your connection string.
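The connection string is a URL whose query string carries your credentials. As a quick sanity check (illustrative only; the ATXP SDK performs its own parsing), you can confirm that a connection string actually contains a connection_token before using it:

```typescript
// Illustrative check that a connection string carries a connection_token.
// The ATXP SDK parses the string itself; this is only a sanity check.
function hasConnectionToken(connection: string): boolean {
  const url = new URL(connection);
  return url.searchParams.get('connection_token') !== null;
}

console.log(hasConnectionToken('https://accounts.atxp.ai?connection_token=abc123'));
```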

Usage

ATXP provides an LLM Gateway that lets you use any model from any provider and pay per use with only your ATXP account’s connection string.
1. Install dependencies

Install the required packages in your project:
npm install @atxp/client ai @ai-sdk/openai @ai-sdk/openai-compatible
2. Configure environment

Create a .env file with your connection string:
.env
# ATXP connection string from your ATXP account dashboard (https://accounts.atxp.ai)
ATXP_CONNECTION=https://accounts.atxp.ai?connection_token=<your_token>&account_id=<your_account_id>
Never commit your .env file to version control; add .env to your .gitignore to prevent accidental commits:
echo .env >> .gitignore
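To fail fast when the variable is missing (assuming you load .env with a tool like dotenv or Node’s --env-file flag), a small startup helper can be useful. Note that requireEnv below is a hypothetical helper, not part of the ATXP SDK:

```typescript
// Hypothetical startup helper: throw early if a required variable is unset,
// instead of failing later with a confusing authentication error.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// At startup: const connection = requireEnv('ATXP_CONNECTION');
```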
3. Import the needed libraries

To use ATXP with the Vercel AI SDK, you need to import a few things from the Vercel AI SDK and the ATXP client SDK. The Vercel AI SDK supports OpenAI-compatible models through the createOpenAICompatible function, which we need in order to use the LLM Gateway.
Import libraries
import { buildStreamableTransport, ATXPAccount } from '@atxp/client';
import { generateText, experimental_createMCPClient } from 'ai';
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
4. Initialize the ATXP account

Initialize the ATXP account by creating a new ATXPAccount object with your ATXP connection string.
Initialize ATXP account
const account = new ATXPAccount(process.env.ATXP_CONNECTION!);
5. Create a streamable transport

Create a streamable transport for a specific ATXP MCP server by using the buildStreamableTransport function.
Create a streamable transport
const transport = buildStreamableTransport({
  mcpServer: 'https://search.mcp.atxp.ai',
  account,
});
6. Create an MCP client

Create an MCP client using the experimental_createMCPClient function.
Create an MCP client
const mcpClient = await experimental_createMCPClient({ transport });
7. Get available tools

Get the available tools from the MCP client by calling its tools() method.
Get available tools
const tools = await mcpClient.tools();
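The returned tools value is a map keyed by tool name; the exact tools depend on the MCP server you connected to. Logging the keys is a quick way to confirm that the transport and authentication are working. A sketch with a hypothetical stand-in for the real map:

```typescript
// Hypothetical stand-in for the map returned by mcpClient.tools();
// the real keys come from the MCP server you connected to.
const exampleTools: Record<string, { description: string }> = {
  web_search: { description: 'Search the web for a query' },
};

// Each key is a tool name the model will be able to call.
console.log(Object.keys(exampleTools));
```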
8. Create an OpenAI-compatible client

Create an OpenAI-compatible client using the createOpenAICompatible function.
Create an OpenAI-compatible client
const atxp = createOpenAICompatible({
  name: 'atxp-llm',
  apiKey: process.env.ATXP_CONNECTION,
  baseURL: 'https://llm.atxp.ai/v1',
});
9. Use the LLM Gateway

Use the LLM Gateway to call a specific model with the tools available from the MCP server. Your ATXP account will be used to pay for the tokens used by the model you specify. See the docs for more information on available models and pricing.
Use the tools with the LLM Gateway
const response = await generateText({
  model: atxp("gpt-4.1"),
  tools,
  messages: [
    ...systemPrompt,
    {
      role: "user",
      content: prompt,
    },
  ],
});
console.log(JSON.stringify(response, null, 2));
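The snippet above spreads a systemPrompt array into messages and passes a prompt string, neither of which is defined in this guide. A minimal sketch of what they might look like (the contents are hypothetical):

```typescript
// Hypothetical system prompt and user prompt for the generateText call above.
const systemPrompt = [
  {
    role: 'system' as const,
    content: 'You are a helpful assistant that can use ATXP MCP tools.',
  },
];
const prompt = 'Search the web for the latest MCP news.';

const messages = [...systemPrompt, { role: 'user' as const, content: prompt }];
console.log(messages.length);
```

The result of generateText also exposes the model’s final output directly as response.text, which is usually more convenient than dumping the entire response object.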

Next steps

Now that you have the basics, you’re ready to start building your own applications. You can explore the following topics to learn more: