Chat with Local AI Models
Right in Your Browser

Privacy-first Chrome extension that brings any Ollama AI model directly to your browser tabs. Choose from hundreds of models: LLaMA, Mistral, Gemma, Qwen, and more. No cloud APIs, no data sharing.

Ollama Client on Product Hunt
100% Privacy · Hundreds of AI Models · 0 Cloud APIs
🌐 Chrome · 🦁 Brave · 📘 Edge · 🎭 Opera · 🌈 Arc · 🦊 Firefox

Powerful Features

Everything you need for productive AI conversations, all running locally on your machine

🤖

Any Ollama Model

Use any AI model available in the Ollama library - from lightweight models to the most powerful ones. Your choice, your hardware.

  • Complete Ollama model library
  • Model selection menu
  • Streaming responses
  • Stop generation control
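
Streaming responses and the stop control map naturally onto Ollama's local HTTP API. A minimal sketch, assuming Ollama's default endpoint at `http://localhost:11434/api/generate`; the function and type names are illustrative, not the extension's actual internals:

```typescript
// Ollama streams one JSON object per line (NDJSON); "done: true" marks the end.
interface GenerateChunk {
  response: string; // the next token(s)
  done: boolean;    // true on the final chunk
}

// Pure helper: fold complete NDJSON lines into the text they carry.
function collectTokens(ndjson: string): string {
  return ndjson
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => (JSON.parse(line) as GenerateChunk).response)
    .join("");
}

// Streaming request; aborting the signal is what "stop generation" does.
async function streamGenerate(
  model: string,
  prompt: string,
  onToken: (text: string) => void,
  signal?: AbortSignal,
): Promise<void> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    body: JSON.stringify({ model, prompt, stream: true }),
    signal,
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buf = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buf += decoder.decode(value, { stream: true });
    const lines = buf.split("\n");
    buf = lines.pop() ?? ""; // keep any partial line for the next chunk
    onToken(collectTokens(lines.join("\n")));
  }
}
```

Calling `controller.abort()` on the `AbortController` whose signal was passed in cancels the in-flight request mid-stream.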
💬

Multi-Chat Sessions

Organize your conversations with multiple chat sessions, all saved locally using IndexedDB.

  • Multiple concurrent chats
  • Session persistence
  • Tab refresh awareness
  • Conversation history
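
As a rough sketch of how locally persisted sessions can be modeled — the message shape and helper below are illustrative, with Dexie.js mapping a shape like this onto an IndexedDB table:

```typescript
// Hypothetical message record; field names are assumptions, not the extension's schema.
interface ChatMessage {
  sessionId: string;
  role: "user" | "assistant";
  content: string;
  createdAt: number; // epoch millis
}

// With Dexie.js, the same shape would back an IndexedDB table, e.g.:
//   db.version(1).stores({ messages: "++id, sessionId, createdAt" });

// Pure helper: rebuild one session's history in order,
// as a side panel would after a tab refresh.
function sessionHistory(all: ChatMessage[], sessionId: string): ChatMessage[] {
  return all
    .filter((m) => m.sessionId === sessionId)
    .sort((a, b) => a.createdAt - b.createdAt);
}
```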
🔧

Smart Content Extraction

Automatically extract readable content from web pages, including specialized support for popular platforms.

  • Mozilla Readability integration
  • YouTube transcript extraction
  • Udemy & Coursera support
  • Custom domain handlers
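
One way to sketch the dispatch between specialized extractors and the generic fallback — handler names are illustrative, not the extension's actual code; the fallback path would hand the page to Mozilla Readability via `new Readability(document).parse()`:

```typescript
// Illustrative extractor tags for the platforms named above.
type Extractor = "youtube-transcript" | "udemy" | "coursera" | "readability";

// Pick a specialized handler by hostname, falling back to Readability.
function pickExtractor(url: string): Extractor {
  const host = new URL(url).hostname;
  if (host.endsWith("youtube.com")) return "youtube-transcript";
  if (host.endsWith("udemy.com")) return "udemy";
  if (host.endsWith("coursera.org")) return "coursera";
  return "readability";
}
```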
📝

Prompt Templates

Pre-built templates for common tasks to boost your productivity.

  • Summarize content
  • Translate text
  • Explain code
  • Professional emails
  • Custom templates
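
A template system like this can be sketched in a few lines. The `{{name}}` placeholder syntax and the template strings below are assumptions for illustration, not the extension's actual format:

```typescript
// Illustrative built-in templates keyed by name.
const templates: Record<string, string> = {
  summarize: "Summarize the following content:\n\n{{content}}",
  translate: "Translate the following text to {{language}}:\n\n{{content}}",
};

// Substitute {{key}} placeholders with user-supplied values; unknown keys become "".
function fillTemplate(name: string, vars: Record<string, string>): string {
  return templates[name].replace(/\{\{(\w+)\}\}/g, (_m: string, key: string) => vars[key] ?? "");
}
```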
⚙️

Advanced Configuration

Full control over AI behavior with comprehensive settings and options.

  • Ollama API parameter exposure
  • Excluded URLs (regex support)
  • Model-specific settings
  • Debug logging
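
For instance, URL exclusion with regex support could look like this sketch — the function name and pattern handling are illustrative assumptions:

```typescript
// Return true if any user-configured pattern (treated as a regex) matches the URL,
// meaning the extension should stay inactive on that page.
function isExcluded(url: string, patterns: string[]): boolean {
  return patterns.some((p) => new RegExp(p).test(url));
}
```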
🔒

Privacy First

Your conversations never leave your machine. Complete privacy and data ownership.

  • No cloud API calls
  • Local data storage
  • No telemetry
  • Open source code

Built with Modern Technologies

Leveraging the latest tools and frameworks for optimal performance and developer experience

⚛️
React
🔧
Plasmo
📘
TypeScript
🎨
shadcn/ui
🗃️
Dexie.js
📖
Readability

Get Started in 3 Easy Steps

Simple setup process to get you chatting with AI models locally

1

Install Ollama

Download and install Ollama on your system, then pull any AI model from their extensive library - from efficient 7B models to powerful 70B+ models.

2

Add Extension

Install the Ollama Client extension from the Chrome Web Store with a single click.

3

Start Chatting

Open the side panel, select a model, and start chatting on any website.
⚠️ CORS issue? Follow the setup guide.
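
To sanity-check steps 1 and 3, you can ask the local Ollama server which models it has pulled. This sketch uses Ollama's standard `/api/tags` endpoint at the default port; the helper names are illustrative:

```typescript
// Shape of the relevant part of Ollama's /api/tags response.
interface TagsResponse {
  models: { name: string }[];
}

// Pure helper: pull out just the model names, as a model picker would list them.
function modelNames(tags: TagsResponse): string[] {
  return tags.models.map((m) => m.name);
}

// Query the local server; fails if Ollama is not running on the default port.
async function listLocalModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  return modelNames((await res.json()) as TagsResponse);
}
```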

🚀 Install Now - It's Free