Using OpenCode with Dedicated Inference endpoints

Additional model providers can be added to OpenCode via configuration. The example below assumes you have deployed MiniMaxAI/MiniMax-M2.5 as a Dedicated Inference endpoint.

~/.config/opencode/opencode.json:

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "exoscale": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Exoscale",
      "options": {
        "baseURL": "https://<deployment-uuid>.inference.hr-zag-1.exoscale-cloud.com/v1"
      },
      "models": {
        "minimax": {
          "name": "MiniMaxAI/MiniMax-M2.5"
        }
      }
    }
  }
}

Once configured:

  • Run opencode.
  • Type /connect in the prompt.
  • Select “Exoscale” from the providers list and enter the API key of your deployment.
  • Type /models and pick your deployment from the list.
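If you want to verify the deployment independently of OpenCode, the endpoint can be smoke-tested with a plain OpenAI-compatible chat completion request. This is a sketch: `<deployment-uuid>` is the placeholder from the config above, and `EXOSCALE_API_KEY` is an assumed environment variable holding your deployment's API key.

```shell
# Smoke-test the OpenAI-compatible endpoint directly.
# Replace <deployment-uuid> with your deployment's UUID and
# set EXOSCALE_API_KEY to the API key of your deployment.
curl "https://<deployment-uuid>.inference.hr-zag-1.exoscale-cloud.com/v1/chat/completions" \
  -H "Authorization: Bearer $EXOSCALE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "MiniMaxAI/MiniMax-M2.5",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

A successful response confirms the base URL and API key before any OpenCode-side debugging.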