OpenCode Integration

OpenCode is a terminal-based AI coding assistant. Because the AI Model Hub exposes an OpenAI-compatible API, you can connect OpenCode to it using the @ai-sdk/openai-compatible SDK package—no custom adapter required.

By the end of this guide, you will have OpenCode configured to use the AI Model Hub as its model provider.

Prerequisites

  • An IONOS Cloud account with access to the AI Model Hub

  • OpenCode installed on your system

  • An IONOS Cloud API authentication token

Step 1: Get an Authentication Token

You need an authentication token to access the AI Model Hub. For instructions on how to generate a token in the Data Center Designer (DCD), see Generate authentication token.

Note

IONOS Cloud tokens are JSON Web Tokens (JWTs) with an expiration date. If your token stops working, check the exp claim and generate a new token if needed.
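
If you want to check the expiry yourself, you can decode the token's payload locally. The following one-liner is a minimal sketch that assumes python3 is available and that the token is exported as IONOS_API_TOKEN (see Step 2):

# Sketch: print the exp claim (a Unix timestamp) of the token stored in IONOS_API_TOKEN
python3 -c 'import base64, json, os; p = os.environ["IONOS_API_TOKEN"].split(".")[1]; p += "=" * (-len(p) % 4); print(json.loads(base64.urlsafe_b64decode(p))["exp"])'

If the printed timestamp is in the past, the token has expired and you need to generate a new one.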

Step 2: Set the Token as an Environment Variable

Add the token to your shell profile so it persists across terminal sessions.

Add the following line to your ~/.zshrc file (or ~/.bashrc if you use bash):

export IONOS_API_TOKEN="your-token-here"

Then reload the profile:

source ~/.zshrc

Replace your-token-here with the token you generated in Step 1.

To verify that the variable is set, run:
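
echo $IONOS_API_TOKEN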

You should see your token printed in the terminal.

Step 3: Select a Language Model

The AI Model Hub offers a variety of large language models. Choose the model that best fits your use case from the models available in the AI Model Hub.

Note

IONOS Cloud periodically adds and retires models. For the latest list, see the LLMs page or query the API directly:
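
For example, assuming the hub follows the standard OpenAI-compatible model-listing route at /v1/models:

# Sketch: list the current model IDs (assumes the standard OpenAI-compatible /v1/models route)
curl -s -H "Authorization: Bearer $IONOS_API_TOKEN" https://openai.inference.de-txl.ionos.com/v1/models

The response should contain a data array whose entries each carry an id field; use that exact value as the model identifier in Step 4.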

Step 4: Configure OpenCode

Open your OpenCode configuration file at ~/.config/opencode/opencode.json and add the provider block. If the file already exists with other configuration, merge the provider section into it.

Single model

To configure a single model, use the following example:
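
The following is a minimal sketch of ~/.config/opencode/opencode.json, based on the configuration fields described below. The provider key ionos, the model ID meta-llama/Llama-3.3-70B-Instruct, its display name, and the limit values are placeholders; replace them with the exact model ID and limits you selected in Step 3:

{
  "provider": {
    "ionos": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "IONOS Cloud AI Model Hub",
      "options": {
        "baseURL": "https://openai.inference.de-txl.ionos.com/v1",
        "apiKey": "{env:IONOS_API_TOKEN}"
      },
      "models": {
        "meta-llama/Llama-3.3-70B-Instruct": {
          "name": "Llama 3.3 70B Instruct",
          "limit": {
            "context": 128000,
            "output": 4096
          }
        }
      }
    }
  }
}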

All available models

To make all AI Model Hub models available in OpenCode, use the following configuration:
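
The sketch below extends the single-model example by listing several entries under models. Every model ID, display name, and limit shown here is a placeholder; enumerate the IDs returned by the /v1/models query from Step 3 and fill in their real values:

{
  "provider": {
    "ionos": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "IONOS Cloud AI Model Hub",
      "options": {
        "baseURL": "https://openai.inference.de-txl.ionos.com/v1",
        "apiKey": "{env:IONOS_API_TOKEN}"
      },
      "models": {
        "meta-llama/Llama-3.3-70B-Instruct": {
          "name": "Llama 3.3 70B Instruct",
          "limit": { "context": 128000, "output": 4096 }
        },
        "mistralai/Mistral-Small-24B-Instruct": {
          "name": "Mistral Small 24B Instruct",
          "limit": { "context": 32000, "output": 4096 }
        },
        "meta-llama/Llama-3.1-8B-Instruct": {
          "name": "Llama 3.1 8B Instruct",
          "limit": { "context": 128000, "output": 4096 }
        }
      }
    }
  }
}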

Configuration fields

  • npm: The AI SDK package. Use @ai-sdk/openai-compatible for any OpenAI-compatible API.

  • name: Display name shown in the OpenCode model picker.

  • options.baseURL: The IONOS Cloud OpenAI-compatible inference endpoint. Must end with /v1.

  • options.apiKey: References the IONOS_API_TOKEN environment variable using {env:...} syntax.

  • models.<id>: The model identifier. Must match the exact ID from the AI Model Hub API.

  • models.<id>.name: A human-readable display name for the model.

  • models.<id>.limit.context: Maximum input context window in tokens.

  • models.<id>.limit.output: Maximum number of output tokens the model can generate per response.

Step 5: Select a Model in OpenCode

  1. Launch OpenCode.

  2. Use the /models command to open the model picker.

  3. Your models appear under the IONOS Cloud AI Model Hub provider.

  4. Select the model you want to use.

Troubleshooting

Unauthorized or 401 error

  • Verify your token is set: echo $IONOS_API_TOKEN

  • Ensure the token has not expired — IONOS Cloud JWTs have an exp claim.

  • Generate a new token from the DCD if needed.

Model not found

  • Model identifiers are case-sensitive and must match exactly.

  • Run the model list query in Step 3 to confirm the identifier.

  • Review the Models page for retirement notices.

Connection timeout

  • Confirm the base URL is https://openai.inference.de-txl.ionos.com/v1.

  • The URL must end with /v1 — not /v1/chat/completions.

  • Verify that your network allows outbound HTTPS to openai.inference.de-txl.ionos.com (a quick check is shown below).
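
A quick reachability check (a sketch, assuming curl is installed) is to request the models endpoint and print only the HTTP status code:

# Sketch: prints an HTTP status code if the endpoint is reachable; hangs or times out if it is not
curl -s -o /dev/null -w "%{http_code}\n" -H "Authorization: Bearer $IONOS_API_TOKEN" https://openai.inference.de-txl.ionos.com/v1/models

Any printed status code means the host is reachable; a 200 typically also confirms that the token was accepted.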

Token not picked up by OpenCode

  • Ensure your shell profile has been sourced: source ~/.zshrc or source ~/.bashrc.

  • Restart OpenCode after changing environment variables.

  • Verify the configuration uses {env:IONOS_API_TOKEN} and not a hardcoded token value.
