Use the ATXP LLM Gateway with any desktop application that supports OpenAI-compatible providers. Simply configure your application to use https://llm.atxp.ai/v1 as the base URL and your ATXP connection string as the API key.
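If you want to verify the gateway outside a GUI, or your application lets you script its provider settings, the same two values work with any OpenAI-compatible SDK. Here is a minimal sketch using the official openai package for Node; the ATXP_CONNECTION_STRING environment variable name is our own choice, not something ATXP requires:

```typescript
import OpenAI from "openai";

// Point the standard OpenAI client at the ATXP LLM Gateway.
// ATXP_CONNECTION_STRING is an assumed variable name holding your connection string.
const client = new OpenAI({
  baseURL: "https://llm.atxp.ai/v1",
  apiKey: process.env.ATXP_CONNECTION_STRING,
});
```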
1. Set up your account

Visit your ATXP account dashboard and copy your account connection string. It should look something like this:
https://accounts.atxp.ai?connection_token=<random_string>&account_id=<random_string>
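The entire URL, including its query parameters, is the value you'll paste in as the API key. If you'd like a quick sanity check before configuring an app, the snippet below (purely illustrative, not part of any ATXP SDK) just confirms the string parses as a URL with both parameters present:

```typescript
// Illustrative check of the connection string format shown above.
const connectionString = process.env.ATXP_CONNECTION_STRING ?? "";
const url = new URL(connectionString);
const ok =
  url.searchParams.has("connection_token") && url.searchParams.has("account_id");
console.log(ok ? "Connection string looks well-formed" : "Missing expected parameters");
```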
2. Configure your desktop application

We’re going to use Goose as an example, but you can use any desktop application that supports OpenAI-compatible providers. In Goose, navigate to Settings > Models > Configure Providers.
3. Select OpenAI provider

Select OpenAI from the list of providers and click Configure.
Goose currently handles OpenAI-compatible endpoints through its OpenAI provider, so you’ll have access to all models available through the ATXP LLM Gateway under the OpenAI provider in Goose.
4. Enter configuration

Configure the OpenAI provider with these settings:
  • API Key: Your ATXP connection string
  • API Host: https://llm.atxp.ai/v1
  • OpenAI Base Path: Leave as the default value v1/chat/completions
Click Submit to save your configuration.
5. Switch models

Navigate to Settings > Switch models to select from the available models provided by the ATXP LLM Gateway.
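Goose builds this list from the gateway. To see the same list outside Goose, you can query the gateway directly; the sketch below assumes it exposes the standard /v1/models endpoint used by OpenAI-compatible servers:

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://llm.atxp.ai/v1",
  apiKey: process.env.ATXP_CONNECTION_STRING, // your ATXP connection string
});

// Print the model IDs the gateway advertises (assumes a standard /v1/models endpoint).
const models = await client.models.list();
for (const model of models.data) {
  console.log(model.id);
}
```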
6. Start chatting

You can now use any of the available models provided by the ATXP LLM Gateway in your desktop application. Open a new chat to get started!
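If you'd rather confirm the setup end to end from a script, the same client can send a chat request through the gateway. The model name below is only a placeholder; substitute any model you saw in the previous step:

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://llm.atxp.ai/v1",
  apiKey: process.env.ATXP_CONNECTION_STRING, // your ATXP connection string
});

const completion = await client.chat.completions.create({
  model: "gpt-4o", // placeholder; use any model listed by the gateway
  messages: [{ role: "user", content: "Hello from the ATXP LLM Gateway!" }],
});

console.log(completion.choices[0].message.content);
```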