Overview
The Vercel AI SDK provides powerful tools for building AI applications with streaming responses, while ATXP offers pay-per-use access to various MCP (Model Context Protocol) tools. By combining them, you can create AI applications that can search the web, generate images, browse websites, and more, all with usage-based pricing. This guide will show you how to integrate ATXP’s MCP tools with the Vercel AI SDK for streaming responses and real-time interactions. You can find a full example of integrating ATXP’s SDK with the Vercel AI SDK in the ATXP Vercel AI SDK demo.

Prerequisites
Create an ATXP account
If you don’t have an ATXP account yet, create one and copy your ATXP connection string. It should look something like this:
If you’ve already created an ATXP account, you can visit the ATXP account dashboard to get your connection string.
Usage
ATXP provides an LLM Gateway that lets you use any model from any provider and pay per use with only your ATXP account’s connection string.
1. Install dependencies

Install the required packages in your project:
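The package names below are an assumption based on the imports used later in this guide (`ai`, `@ai-sdk/openai-compatible`, and the ATXP client SDK); check the ATXP docs for the current package names:

```shell
npm install ai @ai-sdk/openai-compatible @atxp/client
```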
2. Configure environment

Create a `.env` file with your connection string:
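For example (the variable name `ATXP_CONNECTION_STRING` is an assumption used throughout this guide; the value itself comes from your ATXP dashboard):

```env
ATXP_CONNECTION_STRING=<your-atxp-connection-string>
```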
Never commit your `.env` file to version control. It is a good idea to add `.env` to your `.gitignore` file to prevent it from being committed.

3. Import the needed libraries
To use ATXP with the Vercel AI SDK, you need to import a few things from the Vercel AI SDK and the ATXP client SDK. The Vercel AI SDK supports OpenAI-compatible models through the `createOpenAICompatible` function, which we need in order to use the LLM Gateway.
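A sketch of the imports, assuming the ATXP exports live in an `@atxp/client` package (verify the package name against the ATXP docs):

```typescript
import { experimental_createMCPClient, streamText } from 'ai';
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { ATXPAccount, buildStreamableTransport } from '@atxp/client';
```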
4. Initialize the ATXP account

Initialize the ATXP account by creating a new `ATXPAccount` object with your ATXP connection string.
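A minimal sketch, assuming `ATXPAccount` takes the connection string as its constructor argument and that the string was stored in `ATXP_CONNECTION_STRING`:

```typescript
import { ATXPAccount } from '@atxp/client';

// Read the connection string from the .env file created earlier
const account = new ATXPAccount(process.env.ATXP_CONNECTION_STRING!);
```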
5. Create a streamable transport

Create a streamable transport for a specific ATXP MCP server by using the `buildStreamableTransport` function.
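A sketch of creating the transport. The exact argument shape of `buildStreamableTransport` is an assumption here (an MCP server URL plus the account used to pay for calls), so check the ATXP client docs:

```typescript
import { ATXPAccount, buildStreamableTransport } from '@atxp/client';

const account = new ATXPAccount(process.env.ATXP_CONNECTION_STRING!);

// Placeholder URL: substitute the ATXP MCP server you want to use
const transport = buildStreamableTransport({
  mcpServer: 'https://<atxp-mcp-server-url>',
  account,
});
```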
6. Create an MCP client

Create an MCP client using the `experimental_createMCPClient` function.
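`experimental_createMCPClient` is part of the Vercel AI SDK and accepts a transport; a sketch, reusing the transport from the previous step:

```typescript
import { experimental_createMCPClient } from 'ai';

// `transport` is the streamable transport built in the previous step
const mcpClient = await experimental_createMCPClient({ transport });
```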
7. Get available tools

Get the available tools from the MCP client by using the `tools` function.
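In the Vercel AI SDK, the MCP client exposes its tool set through an async `tools()` method; a sketch:

```typescript
// `mcpClient` is the MCP client created in the previous step
const tools = await mcpClient.tools();
```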
8. Create an OpenAI-compatible client

Create an OpenAI-compatible client using the `createOpenAICompatible` function.
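A sketch, with the gateway URL left as a placeholder and the connection string assumed to double as the API key (both assumptions; see the ATXP docs for the real values):

```typescript
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';

// Placeholder base URL for the ATXP LLM Gateway
const atxp = createOpenAICompatible({
  name: 'atxp',
  baseURL: 'https://<atxp-llm-gateway-url>/v1',
  apiKey: process.env.ATXP_CONNECTION_STRING,
});
```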
9. Use the LLM Gateway

Use the LLM Gateway to call the specified model with the tools available from the MCP server. Your ATXP account will be used to pay for the tokens used by that model. See the docs for more information on available models and pricing.
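Putting the pieces together with the Vercel AI SDK’s `streamText`; the model id and prompt here are illustrative, not a statement of what the gateway supports:

```typescript
import { streamText } from 'ai';

const result = streamText({
  model: atxp('gpt-4o'), // illustrative model id
  tools,                 // tools fetched from the MCP client
  prompt: 'Search the web for the latest Model Context Protocol news.',
  onFinish: async () => {
    // Close the MCP client once the stream is done
    await mcpClient.close();
  },
});

// Stream the response text as it arrives
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```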
You can find a full example of integrating ATXP’s SDK with the Vercel AI SDK in the ATXP Vercel AI SDK demo.
Next steps
Now that you have the basics, you’re ready to start building your own applications. You can explore the following topics to learn more:

MCP server documentation
Learn about specific tools and capabilities available in each ATXP MCP server.
Build an agent using paid MCP servers
Follow a complete tutorial to build your first ATXP-powered agent.
Join the community
Join the ATXP community on Discord to ask questions, share your projects, and get help from the team.