ollama-client

🧠 How to Set Up Ollama for the Chrome Extension (Ollama Client)

This guide walks you through setting up the Ollama backend so it works seamlessly with the Ollama Client Chrome Extension. We’ll cover installing Ollama, pulling a model, fixing CORS (OLLAMA_ORIGINS), verifying the server is running, and configuring the extension.


✅ 1. Install Ollama

Visit the official website to download Ollama:

👉 Download Ollama: https://ollama.com/download

Once installed, start the Ollama server:

ollama serve

This launches the Ollama API at:

http://localhost:11434

Keep this running while using the extension.
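
If you want a quick sanity check that the server actually came up, hit the root endpoint from a second terminal; a healthy server answers with a short plain-text message.

```bash
# Quick check that the Ollama server is listening
curl http://localhost:11434
# typically responds with: Ollama is running
```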


🤖 2. Pull a Model (e.g. Gemma 3 1B)

After installation, run a model of your choice:

ollama run gemma3:1b

Once downloaded, you’re ready to chat!

💡 Replace gemma3:1b with other models like llama3, mistral, etc.
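
If you’d rather download a model without jumping straight into a chat, here is a small sketch using the standard Ollama CLI commands (the model name is just an example):

```bash
# Download a model without opening an interactive session
ollama pull gemma3:1b

# List the models installed locally
ollama list
```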


πŸ” Do You Need to Set Up CORS?

Yes, if you're using Firefox; probably not for Chrome or other Chromium browsers. Whether you need to configure CORS depends on your browser:

✅ Chrome / Chromium (e.g., Brave, Edge)

If you're using a Chrome-based browser and extension version 0.1.3 or later, you likely do not need to set any CORS headers. Ollama Client uses Chrome’s Declarative Net Request (DNR) API to rewrite Origin headers in requests to localhost, which lets it bypass CORS errors without backend changes.

🦊 Firefox

Firefox does not support Chrome’s DNR API, so manual configuration is required. If you're using Firefox, set this in your environment:

```bash
OLLAMA_ORIGINS=moz-extension://*
```

When should you manually set OLLAMA_ORIGINS?

  • You're using Firefox
  • You're on extension version < 0.1.3
  • You're calling from localhost:3000 or another frontend
  • You're still getting ❌ 403 Forbidden errors

To cover both Chrome and Firefox, you can combine origins:

```bash
OLLAMA_ORIGINS=chrome-extension://*,moz-extension://*
```
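
Not sure whether your setup is affected? One quick way to check is to send a request with an extension-style Origin header and look at the status code; the extension ID below is just a placeholder.

```bash
# Simulate a request coming from a browser extension (placeholder ID)
curl -i -H "Origin: moz-extension://00000000-0000-0000-0000-000000000000" \
  http://localhost:11434/api/tags
# 403 Forbidden -> OLLAMA_ORIGINS needs to be configured (see below)
# 200 OK        -> this origin is already allowed
```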

🚫 3. Fix ❌ 403 Forbidden: CORS Error

If you’re seeing CORS errors or using Firefox, follow these platform-specific instructions to set OLLAMA_ORIGINS:


🖥️ macOS (Launch Agent)

  1. Edit the plist file:

    nano ~/Library/LaunchAgents/com.ollama.server.plist
    
  2. Add inside <key>EnvironmentVariables</key>:

    <key>OLLAMA_ORIGINS</key>
    <string>chrome-extension://*,moz-extension://*</string>
    
  3. Reload the agent:

    launchctl unload ~/Library/LaunchAgents/com.ollama.server.plist
    launchctl load -w ~/Library/LaunchAgents/com.ollama.server.plist
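
As a rough spot-check (assuming your agent is labeled com.ollama.server to match the plist above, and a recent macOS launchctl), you can inspect the loaded agent and look for the variable:

```bash
# Inspect the loaded launch agent; OLLAMA_ORIGINS should appear in its environment section
launchctl print gui/$(id -u)/com.ollama.server | grep OLLAMA_ORIGINS
```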
    

🐧 Linux (systemd)

  1. Edit the service file:

    sudo systemctl edit --full ollama.service
    
  2. Add under [Service]:

    Environment="OLLAMA_ORIGINS=chrome-extension://*,moz-extension://*"
    
  3. Reload and restart:

    sudo systemctl daemon-reload
    sudo systemctl restart ollama
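
To confirm systemd picked up the change (the unit name ollama matches the service edited above):

```bash
# Show the environment configured for the service
systemctl show ollama --property=Environment

# Or inspect the full unit file, including any overrides
systemctl cat ollama
```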
    

🪟 Windows

  1. Press Win + R, type sysdm.cpl, press Enter.

  2. Go to Advanced → Environment Variables.

  3. Add a new User variable:

    • Name: OLLAMA_ORIGINS
    • Value: chrome-extension://*,moz-extension://*
  4. Restart Ollama.
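
If you prefer the command line over the dialog, setx writes the same user-level variable; run it in Command Prompt and restart Ollama afterwards so the change is picked up.

```bat
:: Persist OLLAMA_ORIGINS for the current user
setx OLLAMA_ORIGINS "chrome-extension://*,moz-extension://*"
```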


💡 Multiple Origins Support

You can also allow local web apps like this:

OLLAMA_ORIGINS=chrome-extension://*,moz-extension://*,http://localhost:3000

πŸ” 4. Verify Ollama is Running

Open this in your browser:

http://localhost:11434

Or use curl:

curl http://localhost:11434/api/tags

You should see a JSON response with available models.
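
For an end-to-end check that a model actually responds (this assumes you pulled gemma3:1b in step 2), you can call the generate endpoint directly; the reply streams back as a series of JSON objects.

```bash
# Ask the model for a short completion via the REST API
curl http://localhost:11434/api/generate -d '{
  "model": "gemma3:1b",
  "prompt": "Say hello in one short sentence."
}'
```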


βš™οΈ 5. Configure the Extension

  1. Click the ⚙️ Settings icon in the extension popup.
  2. You can configure:

    • Base URL (http://localhost:11434)
    • Default model (gemma3:1b, llama3, etc.)
    • Theme and other preferences

🧯 Troubleshooting