How to Set Up Ollama for Agent Zero

Overview

This guide explains how to set up Ollama as the language model backend for Agent Zero (Agent 0). It assumes Agent Zero is already running (in Docker or locally) and covers only the Ollama integration.

Prerequisites

- A running Agent Zero instance (Docker or local).
- A machine to host Ollama, with enough RAM/VRAM for your chosen model.
- Network access from Agent Zero to the Ollama host (this matters especially when Agent Zero runs in Docker).

Step 1: Install Ollama

macOS
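On macOS, one common route is Homebrew (assuming you have it installed); the official installer from ollama.com works as well:

```shell
# Install the Ollama CLI and server via Homebrew
# (alternatively, download the app from ollama.com)
brew install ollama

# Start the Ollama server
ollama serve
```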

Linux
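On Linux, the official install script is the usual path; this assumes curl is available:

```shell
# Official Ollama install script (sets up a systemd service on most distros)
curl -fsSL https://ollama.com/install.sh | sh
```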

Step 2: Verify Ollama Is Running
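Ollama listens on port 11434 by default, so a quick HTTP check confirms the server is up:

```shell
# The root endpoint responds with "Ollama is running" when the server is up
curl http://localhost:11434/

# List installed models (an empty list is fine at this point)
ollama list
```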

Step 3: Pull a Model

Pull at least one model that Agent Zero will use.

After pulling, verify the model exists and can respond using the Ollama CLI.
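As a sketch, using the gemma3:12b model referenced in this guide's notes (substitute any model that fits your hardware):

```shell
# Download the model (several GB; needs sufficient disk and RAM/VRAM)
ollama pull gemma3:12b

# Confirm the model is installed
ollama list

# Smoke-test a response directly from the CLI
ollama run gemma3:12b "Say hello in one sentence."
```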

Step 4: Open Agent Zero Settings

In the Agent Zero web UI, open the Settings panel (typically a gear icon).

Step 5: Configure Chat Model

Main Chat Model

Set the chat model provider to Ollama and enter the exact model name you pulled, including the tag.
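Exact field labels vary slightly between Agent Zero versions; as an illustration, using the model and host from this guide's reference values:

```
Chat model provider:   Ollama
Chat model name:       gemma3:12b
Ollama base URL:       http://192.168.1.190:11434
```

Note that if Agent Zero runs in Docker, localhost inside the container is not the host machine; point the base URL at the host's LAN IP (as above) or at host.docker.internal.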

Step 6: Configure Utility Model

Agent Zero requires a utility model for memory, summarization, and internal tasks.
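The utility model can simply reuse the same Ollama model, though a smaller model keeps these internal tasks fast; the field names here are illustrative:

```
Utility model provider:  Ollama
Utility model name:      gemma3:12b
```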

Step 7: Save and Restart

Save your settings, then restart Agent Zero so they take effect.
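If Agent Zero runs in Docker, restarting the container (named modest_jones in this guide's reference setup) picks up the new settings:

```shell
# Restart the Agent Zero container so the new model settings take effect
docker restart modest_jones
```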

Step 8: Test the Setup

Open a new chat in Agent Zero and send a simple prompt:

What is 2 + 2?

If the configuration is correct, the model should answer through Ollama; the first response may be slow while the model loads into memory.
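You can also query the Ollama endpoint directly, bypassing Agent Zero, to isolate where a failure sits (host and model as configured in the earlier steps):

```shell
# Ask the model a question via Ollama's HTTP API
curl http://192.168.1.190:11434/api/generate \
  -d '{"model": "gemma3:12b", "prompt": "What is 2 + 2?", "stream": false}'
```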

Troubleshooting

- Connection refused: confirm Ollama is running and reachable at the configured base URL.
- Agent Zero in Docker cannot reach localhost: localhost inside the container is not the host machine; use the host's LAN IP or host.docker.internal.
- Model not found: the model name in Agent Zero must match the output of ollama list exactly, including the tag.
- Very slow or failing responses: the model may exceed available RAM/VRAM; try a smaller model.
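If Agent Zero cannot reach Ollama from another machine or container, a frequent cause is that Ollama is bound only to 127.0.0.1. Binding it to all interfaces via the OLLAMA_HOST environment variable is a common fix:

```shell
# Make Ollama listen on all interfaces, then start the server
export OLLAMA_HOST=0.0.0.0
ollama serve
```

On Linux installs managed by systemd, the same variable can be set for the service (for example via systemctl edit ollama) rather than in a shell session.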

Summary

With Ollama installed, a model pulled, and Agent Zero's chat and utility models pointed at the Ollama base URL, Agent Zero runs entirely against your locally hosted models.


Appendix: Reference Values

#-----------------------------------------#
# Model:           gemma3:12b
# Ollama base URL: http://192.168.1.190:11434
#-----------------------------------------#

Shell into the Agent Zero container:

$ docker exec -it modest_jones bash

Restart the Agent Zero container:

$ docker restart modest_jones