## Why use the ATXP LLM Gateway?
- Unified access: Use multiple LLM providers through a single API endpoint
- No vendor lock-in: Switch between models without changing your code
- Pay-per-use pricing: Only pay for what you use with transparent per-token pricing
- OpenAI compatibility: Works with any client that supports OpenAI’s chat-completions interface
- Streaming support: Both standard and streaming chat completions are fully supported
- Easy integration: Get started in minutes with your existing OpenAI-compatible code, as shown in the sketch below
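Because the gateway speaks the OpenAI chat-completions protocol, pointing an existing client at it is typically a one-line change. Here is a minimal sketch using the official `openai` npm package; the base URL, the `ATXP_API_KEY` environment variable, and the model name are illustrative assumptions, not documented values, so substitute the ones from your ATXP account.

```typescript
import OpenAI from "openai";

// Point a standard OpenAI client at the gateway instead of api.openai.com.
const client = new OpenAI({
  baseURL: "https://llm.atxp.ai/v1", // hypothetical gateway endpoint
  apiKey: process.env.ATXP_API_KEY,  // hypothetical env var holding your ATXP key
});

async function main() {
  // An ordinary chat-completions call, routed through the gateway.
  const completion = await client.chat.completions.create({
    model: "gpt-4o", // any model the gateway exposes
    messages: [{ role: "user", content: "Hello from the ATXP LLM Gateway!" }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```

Nothing else in the calling code changes, which is what makes switching providers a configuration edit rather than a rewrite.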
## Get started with the LLM Gateway
### Code integration
Integrate the LLM Gateway into your applications with OpenAI-compatible SDKs for JavaScript or Python, or with direct HTTP calls.
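Streaming works the same way as with any OpenAI-compatible endpoint. The sketch below rests on the same assumptions as the example above (hypothetical base URL and env var); it asks for a streamed response and prints tokens as they arrive.

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://llm.atxp.ai/v1", // hypothetical gateway endpoint
  apiKey: process.env.ATXP_API_KEY,  // hypothetical env var
});

async function main() {
  // stream: true returns an async iterable of chunks instead of one response.
  const stream = await client.chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: "Stream a haiku about gateways." }],
    stream: true,
  });
  for await (const chunk of stream) {
    // Each chunk carries a delta with the next slice of generated text.
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}

main();
```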
### Desktop applications
Configure desktop applications like Goose to use the LLM Gateway with your ATXP account for seamless model access.
### Join the community
Join the ATXP community on Discord to ask questions, share your projects, and get help from the team.
### Build an agent
Learn how to build agents that can use paid MCP servers with ATXP for more advanced AI capabilities.