Ollama and App Configuration

Ollama Host Configuration

Ensure that Ollama is installed on your local machine or available on your LAN before configuring the service.
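
A quick sanity check before editing anything: on a default Linux install, you can confirm the binary is present and the server is up (Ollama listens on port 11434 by default, and its root endpoint replies with a short status message):

ollama --version
systemctl status ollama
curl http://localhost:11434   # should print "Ollama is running"

Once that looks good, configure the systemd service: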

  1. Edit the systemd service file by running:

    sudo nano /etc/systemd/system/ollama.service
  2. Add the following environment variables to the [Service] section (a sketch of the resulting section appears after these steps):

    Environment="OLLAMA_HOST=0.0.0.0"
    Environment="OLLAMA_ORIGINS=http://tauri.localhost"

    Note: The OLLAMA_HOST=0.0.0.0 setting is optional; you only need it if the Ollama server must be reachable from other machines on your LAN. If everything runs on localhost, you can omit it.

    Note: The OLLAMA_ORIGINS=http://tauri.localhost setting is required only if you use the Windows app version of Agentic Signal; it tells Ollama to accept requests from that origin.

  3. Save the file, then reload and restart the service:

    sudo systemctl daemon-reload
    sudo systemctl restart ollama.service
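
For reference, here is a sketch of how the [Service] section might look after the edit. The ExecStart line and any other settings come from your existing unit file and may differ (the path below matches the official Linux install script, but treat it as an assumption):

[Service]
ExecStart=/usr/local/bin/ollama serve
# Added lines:
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=http://tauri.localhost"

After the restart, you can verify the server answers from another machine on the LAN (192.168.1.10 is a placeholder; substitute your server's address):

curl http://192.168.1.10:11434/api/tags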

Configure Agentic Signal Client Settings

  1. Open the Settings panel from the dock in the Agentic Signal client.

  2. Browser Executable Path: Provide the path to your Chrome browser executable; it is required for the web-browsing capabilities.

    Note: The Browser Executable Path setting is required only if you use the Windows app version of Agentic Signal.

  3. Ollama Host: Enter the URL of your Ollama server in the "Ollama Host" field (e.g., http://localhost:11434). You can verify the URL is reachable with the check shown after this list.

  4. In the Ollama Models section:

    • Add a model: Enter the model name (e.g., llama3.1:8b) and click the plus (+) button. Download progress will be shown.
    • Remove a model: Click the trash icon next to any installed model to delete it.
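
If the client cannot connect, a quick way to confirm the URL from step 3 is reachable, and to see which models the server already has, is to query the Ollama API directly (using the example localhost URL; adjust to match your setting):

curl http://localhost:11434/api/tags

This returns a JSON object with a models array; an empty array simply means no models have been pulled yet.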

Advanced: Manual Ollama CLI

If you prefer, you can still use the Ollama CLI:

# Pull a lightweight model (good for quick testing, though smaller models are more error-prone)
ollama pull llama3.2:1b

# Or pull a more capable model
ollama pull llama3.1:8b
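
After pulling, you can confirm the model is installed and give it a quick smoke test from the same CLI:

# List locally installed models
ollama list

# Start an interactive session with the model (type /bye to exit)
ollama run llama3.1:8b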