
🧠 Ollama Client - Chat with Local LLMs in Your Browser

Ollama Client is a powerful, privacy-first Chrome extension that lets you chat with locally hosted LLMs using Ollama: no cloud, no tracking. It's lightweight, open source, and designed for fast, offline-friendly AI conversations.



✅ Works with any Chromium-based browser: Chrome, Brave, Edge, Opera, Chromium, and Arc.
🦊 Firefox support is available via temporary add-on installation (manual permissions setup required).


🚀 Get Started - Install Now


โค๏ธ Upvote Us on Product Hunt!


🌐 Explore More

Landing Page · Documentation

✨ Features


🧩 Tech Stack


🛠️ Quick Setup

✅ 1. Install the Extension

👉 Chrome Web Store

✅ 2. Install Ollama on Your Machine

brew install ollama  # macOS
ollama serve         # starts at http://localhost:11434

More info: https://ollama.com
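To confirm the server is reachable before connecting the extension, query Ollama's version endpoint (this assumes the default port 11434):

curl http://localhost:11434/api/version   # expect a JSON reply like {"version":"..."}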

✅ 3. Pull a Model

ollama pull gemma3:1b

Other options: mistral, llama3:8b, codellama, etc.
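To verify the download and smoke-test the model from the terminal:

ollama list                     # shows downloaded models and their sizes
ollama run gemma3:1b "Hello!"   # one-off prompt straight from the shell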

⚙️ 4. Configure the Extension

Advanced parameters like system prompts and stop sequences are available per model.
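The same parameters exist in Ollama's own API (system and options.stop), so you can preview their effect outside the extension. A minimal sketch against the local API, reusing the model from step 3 (the prompt and stop sequence are just illustrations):

curl http://localhost:11434/api/generate -d '{
  "model": "gemma3:1b",
  "prompt": "List three uses for a paperclip.",
  "system": "You are terse. Answer in one line.",
  "options": { "stop": ["4."] },
  "stream": false
}'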


🛠️ Local Development Setup

Want to contribute or customize? You can run and modify the Ollama Client extension locally using Plasmo.

⚙️ Prerequisites

Node.js with pnpm (recommended) or npm, plus a Chromium-based browser for loading the unpacked build.

📦 1. Clone the Repo

git clone https://github.com/Shishir435/ollama-client.git
cd ollama-client

📥 2. Install Dependencies

Using pnpm (recommended):

pnpm install

Or with npm:

npm install

🧪 3. Run the Extension (Dev Mode)

Start development mode with hot reload:

pnpm dev

Or with npm:

npm run dev

This launches the Plasmo dev server and gives instructions for loading the unpacked extension in Chrome:
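If you have not loaded an unpacked extension before, the usual flow is:

  1. Open chrome://extensions
  2. Enable "Developer mode" (top-right toggle)
  3. Click "Load unpacked" and select the dev build folder (typically build/chrome-mv3-dev with Plasmo's defaults)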


🧪 4. Run in Firefox (Experimental)

pnpm dev --target=firefox

Load it in Firefox as a temporary add-on (see the Firefox Support section below).


🛠️ 5. Build for Production

pnpm build

Output will be in the build/ or dist/ folder depending on your Plasmo version.
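If you also want a Firefox artifact, the --target flag used in dev mode should work for production builds too (an untested sketch; check Plasmo's docs for your version):

pnpm build --target=firefox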


📁 Code Structure


✅ Tips

System Specs                      Suggested Models
💻 8GB RAM (no GPU)               gemma:2b, mistral:7b-q4
💻 16GB RAM (no GPU)              gemma:3b-q4, mistral
🎮 16GB+ with GPU (6GB VRAM)      llama3:8b-q4, gemma:3b
🔥 RTX 3090+ or Apple M3 Max      llama3:70b, mixtral

📦 Prefer quantized models (q4_0, q5_1, etc.) for better performance.
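
Quantization is chosen via the model tag at pull time. Exact tag names vary per model, so treat this as a pattern and check the model's page in the library:

ollama pull mistral:7b-instruct-q4_0   # a 4-bit build; much smaller than fp16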

Explore: Ollama Model Library (https://ollama.com/library)


🧪 Firefox Support (Experimental)

Ollama Client is a Chrome Manifest V3 extension. To use in Firefox:

  1. Go to about:debugging
  2. Click "Load Temporary Add-on"
  3. Select the manifest.json from the extension folder
  4. Manually allow CORS access (see the setup guide; a minimal sketch follows below)
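
The setup guide is the authoritative reference for step 4. As a rough sketch: Ollama uses the OLLAMA_ORIGINS environment variable to decide which origins may call its API, so one way to let the add-on through is to start the server with that variable set (the pattern below is an assumption; adjust to your setup):

OLLAMA_ORIGINS="moz-extension://*" ollama serve   # or OLLAMA_ORIGINS="*" to allow any origin (looser)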

๐Ÿ› Known Issues


🔮 Roadmap / Upcoming



📢 Spread the Word!

If you find Ollama Client helpful, please consider starring the repo and sharing it with others.

Built with ❤️ by @Shishir435