Overview
The Vercel AI SDK provides powerful tools for building AI applications with streaming responses, while ATXP offers pay-per-use access to various MCP (Model Context Protocol) tools. By combining them, you can create AI applications that can search the web, generate images, browse websites, and more - all with usage-based pricing. This guide will show you how to integrate ATXP’s MCP tools with the Vercel AI SDK for streaming responses and real-time interactions.
You can find a full example of integrating ATXP’s SDK with the Vercel AI SDK in the ATXP Vercel AI SDK demo.
Prerequisites
Create an ATXP account
If you don’t have an ATXP account yet, create one and copy your ATXP connection string. It should look something like this:
https://accounts.atxp.ai?connection_token=<random_string>
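The connection string is just a URL whose query parameters carry your credentials, so a quick sanity check with the standard URL API can catch a malformed string early. The helper below is illustrative and not part of the ATXP SDK:

```typescript
// Sketch: validate the shape of an ATXP connection string.
// This helper is illustrative, not part of the ATXP SDK.
function hasConnectionToken(connectionString: string): boolean {
  try {
    const url = new URL(connectionString);
    return (
      url.origin === 'https://accounts.atxp.ai' &&
      url.searchParams.has('connection_token')
    );
  } catch {
    return false; // not a valid URL at all
  }
}

console.log(hasConnectionToken('https://accounts.atxp.ai?connection_token=abc123')); // true
console.log(hasConnectionToken('not-a-url')); // false
```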
If you’ve already created an ATXP account, you can visit the ATXP account dashboard to get your connection string.
Usage
ATXP provides an LLM Gateway that allows you to use any model from any provider and pay per use with only your ATXP account’s connection string.

Install dependencies
Install the required packages in your project:

```shell
npm install @atxp/client ai @ai-sdk/openai @ai-sdk/openai-compatible
```
Configure environment
Create a .env file with your connection string:

```shell
# ATXP connection string from your ATXP account dashboard (https://accounts.atxp.ai)
ATXP_CONNECTION=https://accounts.atxp.ai?connection_token=<your_token>&account_id=<your_account_id>
```
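Since every later step depends on ATXP_CONNECTION being set, it can help to fail fast at startup rather than discover a missing variable mid-request. A minimal guard (an illustrative helper, not part of either SDK):

```typescript
// Sketch: read a required environment variable or fail with a clear error.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage: const connection = requireEnv('ATXP_CONNECTION');
```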
Never commit your .env file to version control. Add .env to your .gitignore file to prevent it from being committed.

Import the needed libraries
To use ATXP with the Vercel AI SDK, you need to import a few things from the Vercel AI SDK and the ATXP client SDK. The Vercel AI SDK supports OpenAI-compatible models through the createOpenAICompatible function, which is needed to use the LLM Gateway.

```typescript
import { buildStreamableTransport, ATXPAccount } from '@atxp/client';
import { generateText, experimental_createMCPClient } from 'ai';
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
```
Initialize the ATXP account
Initialize the ATXP account by creating a new ATXPAccount object with your ATXP connection string.

```typescript
const account = new ATXPAccount(process.env.ATXP_CONNECTION!);
```
Create a streamable transport
Create a streamable transport for a specific ATXP MCP server using the buildStreamableTransport function.

```typescript
const transport = buildStreamableTransport({
  mcpServer: 'https://search.mcp.atxp.ai',
  account,
});
```
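The overview mentions several capabilities (search, image generation, browsing), and each ATXP MCP server needs its own transport. The sketch below builds one options object per server from a list; the account value is a stand-in so the example is self-contained, and any server URLs beyond the search server are assumptions you should check against the ATXP docs:

```typescript
// Sketch: prepare buildStreamableTransport options for several MCP servers.
// Only the search server URL comes from this guide; others are hypothetical.
function transportOptionsFor(account: unknown, mcpServers: string[]) {
  // Each entry would be passed to buildStreamableTransport({ mcpServer, account }).
  return mcpServers.map((mcpServer) => ({ mcpServer, account }));
}

const servers = ['https://search.mcp.atxp.ai'];
const options = transportOptionsFor({ id: 'demo-account' }, servers);
console.log(options.length); // 1
```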
Create an MCP client
Create an MCP client using the experimental_createMCPClient function.

```typescript
const mcpClient = await experimental_createMCPClient({ transport });
```
Get available tools
Get the available tools from the MCP client using the tools method.

```typescript
const tools = await mcpClient.tools();
```
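The tools call resolves to an object keyed by tool name. If you want to expose only a subset of a server's tools to the model, you can filter the keys before passing them to generateText. The sketch below uses a plain object as a stand-in for the real result, since the actual tool names depend on the MCP server you connected to:

```typescript
// Sketch: pick a subset of tools by name. `allTools` stands in for the
// object returned by `await mcpClient.tools()`; real tool names vary by server.
function pickTools<T extends Record<string, unknown>>(
  allTools: T,
  names: string[],
): Partial<T> {
  return Object.fromEntries(
    Object.entries(allTools).filter(([name]) => names.includes(name)),
  ) as Partial<T>;
}

const allTools = { search: { description: 'Search the web' }, other: {} };
const subset = pickTools(allTools, ['search']);
console.log(Object.keys(subset)); // ['search']
```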
Create an OpenAI-compatible client
Create an OpenAI-compatible client using the createOpenAICompatible function.

```typescript
const atxp = createOpenAICompatible({
  name: 'atxp-llm',
  apiKey: process.env.ATXP_CONNECTION,
  baseURL: 'https://llm.atxp.ai/v1',
});
```
Use the LLM Gateway
Use the LLM Gateway to call the specified model with the tools available from the MCP server. Your ATXP account will be used to pay for the tokens used by the model; see the docs for more information on available models and pricing.

```typescript
// systemPrompt and prompt are defined elsewhere in your application
const response = await generateText({
  model: atxp("gpt-4.1"),
  tools,
  messages: [
    ...systemPrompt,
    {
      role: "user",
      content: prompt,
    },
  ],
});

console.log(JSON.stringify(response, null, 2));
```
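generateText waits for the full completion before returning. For the streaming responses mentioned in the overview, the Vercel AI SDK also offers streamText, whose result exposes an async-iterable textStream of text deltas. The sketch below shows how such a stream is consumed; a stand-in async generator replaces the real streamText call so the example is self-contained:

```typescript
// Sketch: consume a text-delta stream the way you would consume
// `(await streamText({...})).textStream`. `fakeTextStream` is a stand-in
// so this runs without network access.
async function* fakeTextStream(): AsyncGenerator<string> {
  yield 'Hello, ';
  yield 'world!';
}

async function printStream(textStream: AsyncIterable<string>): Promise<string> {
  let full = '';
  for await (const delta of textStream) {
    process.stdout.write(delta); // render each chunk as it arrives
    full += delta;
  }
  return full;
}

printStream(fakeTextStream()).then((text) => console.log('\nDone:', text));
```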
If you have your own OpenAI key, you can use it directly with the Vercel AI SDK and ATXP. If you don’t have an OpenAI key, you can use the ATXP LLM Gateway to pay per use for any OpenAI-compatible model with your ATXP account.

Install dependencies
Install the required packages in your project:

```shell
npm install @atxp/client ai @ai-sdk/openai
```
The @atxp/client package provides the MCP transport that allows you to use ATXP’s MCP tools in your Vercel AI SDK application, while ai and @ai-sdk/openai are from Vercel’s AI SDK.
Configure environment
Create a .env file with your ATXP connection string and OpenAI API key:

```shell
# ATXP connection string from https://accounts.atxp.ai
ATXP_CONNECTION=https://accounts.atxp.ai?connection_token=<your_token>&account_id=<your_account_id>
# Required for the OpenAI client
OPENAI_API_KEY=your_openai_api_key_here
```
Never commit your .env file to version control. Add .env to your .gitignore file to prevent it from being committed.

Import the needed libraries
To use ATXP with the Vercel AI SDK, you need to import a few things from the Vercel AI SDK and the ATXP client SDK. The Vercel AI SDK supports OpenAI models through the openai function, which is needed to use your own OpenAI key.

```typescript
import { buildStreamableTransport, ATXPAccount } from '@atxp/client';
import { generateText, experimental_createMCPClient } from 'ai';
import { openai } from '@ai-sdk/openai';
```
Initialize the ATXP account
Initialize the ATXP account by creating a new ATXPAccount object with your ATXP connection string.

```typescript
const account = new ATXPAccount(process.env.ATXP_CONNECTION!);
```
Create a streamable transport
Create a streamable transport for a specific ATXP MCP server using the buildStreamableTransport function.

```typescript
const transport = buildStreamableTransport({
  mcpServer: 'https://search.mcp.atxp.ai',
  account,
});
```
Create an MCP client
Create an MCP client using the experimental_createMCPClient function.

```typescript
const mcpClient = await experimental_createMCPClient({ transport });
```
Get available tools
Get the available tools from the MCP client using the tools method.

```typescript
const tools = await mcpClient.tools();
```
Call the LLM
Use the OpenAI client to call the LLM with the tools available from the MCP server.

```typescript
// systemPrompt and prompt are defined elsewhere in your application
const response = await generateText({
  model: openai('gpt-4o-mini'),
  tools,
  messages: [
    ...systemPrompt,
    {
      role: "user",
      content: prompt,
    },
  ],
});

console.log(JSON.stringify(response, null, 2));
```
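The MCP client holds an open connection to the server, so it is good practice to close it once you are done. The AI SDK's experimental MCP client exposes a close() method; the sketch below shows the try/finally pattern with a stand-in client so it runs on its own:

```typescript
// Sketch: always release the MCP connection, even if the LLM call throws.
// `fakeClient` stands in for the client returned by experimental_createMCPClient,
// which exposes the same close() method.
type Closeable = { close: () => Promise<void> };

async function withClient<T>(client: Closeable, work: () => Promise<T>): Promise<T> {
  try {
    return await work();
  } finally {
    await client.close(); // runs on success and on error
  }
}

const fakeClient = { closed: false, async close() { this.closed = true; } };
withClient(fakeClient, async () => 'ok').then(() => console.log(fakeClient.closed)); // true
```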
Next steps
Now that you have the basics, you’re ready to start building your own applications. You can explore the following topics to learn more: