OpenCode – AI Agent for Your Terminal

OpenCode is a free tool that uses artificial intelligence to assist you with development. It integrates naturally into your workflow, whether in the command line (terminal), on the desktop, or directly in your code editor (IDE).

Official website: https://opencode.ai/


Installation

For an optimal experience, we strongly recommend using OpenCode via agent-vm.

For Windows users: OpenCode works better in WSL (Windows Subsystem for Linux) than in a native Windows terminal. WSL ensures better compatibility and a smoother experience.


System Requirements

Installing Lima (Official Documentation)

VERSION=$(curl -fsSL https://api.github.com/repos/lima-vm/lima/releases/latest | jq -r .tag_name)
curl -fsSL "https://github.com/lima-vm/lima/releases/download/${VERSION}/lima-${VERSION:1}-$(uname -s)-$(uname -m).tar.gz" | sudo tar Cxzvm /usr/local
curl -fsSL "https://github.com/lima-vm/lima/releases/download/${VERSION}/lima-additional-guestagents-${VERSION:1}-$(uname -s)-$(uname -m).tar.gz" | sudo tar Cxzvm /usr/local
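Note the `${VERSION:1}` in the download URLs: it is bash substring expansion, which strips the leading "v" from the tag returned by the GitHub API so that it matches the tarball file names. A minimal illustration (the tag value is hypothetical, and this assumes bash):

```shell
# ${VERSION:1} drops the first character (the leading "v") from the tag,
# e.g. v1.0.4 -> 1.0.4, matching the release tarball naming scheme.
VERSION="v1.0.4"   # hypothetical tag; the real one comes from the GitHub API
echo "${VERSION:1}"
# prints: 1.0.4
```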

Installing QEMU (Linux only) (Download)

# Debian / Ubuntu
sudo apt-get install qemu-user-static qemu-system-x86

# Arch Linux
sudo pacman -S qemu-base

Installing Shasum (for file verification)

# Debian / Ubuntu
sudo apt-get install libdigest-sha-perl

# Arch Linux
sudo pacman -S perl-digest-sha

For Arch Linux: Add this line to your profile file (~/.bashrc, ~/.zshrc, etc.):

export PATH="/usr/bin/core_perl:$PATH"


Configuration and Execution

On Linux, macOS, or WSL

mkdir -p ~/.local/bin
git clone https://github.com/sylvinus/agent-vm.git ~/.local/bin/agent-vm

Then add this line to your profile file (~/.bashrc, ~/.zshrc, etc.):

source ~/.local/bin/agent-vm/agent-vm.sh
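If the same profile file is shared across machines, an optional variant (not required by agent-vm) is to guard the source so the profile does not error where agent-vm is absent:

```shell
# Source agent-vm only if it is actually installed, so shared
# profile files do not fail on machines without it.
if [ -f "$HOME/.local/bin/agent-vm/agent-vm.sh" ]; then
  . "$HOME/.local/bin/agent-vm/agent-vm.sh"
fi
```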

Restart your terminal, then run one of the following commands based on your needs:

# Light configuration (10 GB disk / 2 GB RAM / 1 CPU)
agent-vm setup

# Intensive configuration (50 GB disk / 16 GB RAM / 8 CPU)
agent-vm setup --disk 50 --memory 16 --cpus 8

If an error occurs, the VM logs are available at ~/.lima/agent-vm-base/ha.stderr.log.

Finally, access your working directory and launch OpenCode:

cd working-directory
agent-vm opencode


Initial Configuration

  1. Create the configuration directory and open the configuration file:

    mkdir -p ~/.config/opencode
    nano ~/.config/opencode/opencode.json
    

  2. Add your LiteLLM API key (available here) in the "options" section of the "litellm" provider:

    {
      "$schema": "https://opencode.ai/config.json",
      "provider": {
        "litellm": {
          "npm": "@ai-sdk/openai-compatible",
          "models": {
            "general": { "name": "general" },
            "general_nothink": { "name": "general_nothink" },
            "dev-model": { "name": "dev-model" }
          },
          "name": "Litellm",
          "options": {
            "baseURL": "https://api.ia.limos.fr/v1",
            "apiKey": "API_KEY"
          }
        }
      },
      "model": "litellm/dev-model"
    }
    

  3. Copy this file into each project where you want to use OpenCode:

    cp ~/.config/opencode/opencode.json .
    

You must add this file to your project's .gitignore so that your API key is never committed or shared.
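One convenient way to do this is to append the entry only if it is not already present (the file name matches the copy made in the previous step):

```shell
# Add opencode.json to .gitignore, but only once (idempotent):
# grep -qxF checks for an exact, literal matching line first.
grep -qxF 'opencode.json' .gitignore 2>/dev/null || echo 'opencode.json' >> .gitignore
```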

  4. Launch OpenCode in the relevant folder:
    agent-vm opencode
    

Current Limitation: agent-vm only mounts the current working directory into the VM; it cannot mount other folders. You must therefore copy the configuration file into each project.
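Because the configuration is duplicated across projects, a stray edit can easily break one copy. A quick sanity check is to parse the file as JSON before using it; this sketch validates a hypothetical sample with python3 (assumed to be available), and you would point it at your real opencode.json instead:

```shell
# Validate that a config file parses as JSON before copying it around.
# A temporary sample stands in here for a real opencode.json.
cfg=$(mktemp)
printf '{"model": "litellm/dev-model"}' > "$cfg"
python3 -m json.tool "$cfg" >/dev/null && echo "valid JSON"
rm -f "$cfg"
```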