Chat with Local LLMs in Your Browser: Introducing Ollama Client


100% local. No cloud. No API keys. Just AI running entirely on your machine.
If you're a developer or tech enthusiast using Ollama to run large language models like LLaMA 2, Mistral, or DeepSeek, this tool is for you.
I'm excited to introduce Ollama Client, a free, open-source Chrome extension that brings local LLM chat right into your browser.
What is Ollama Client?
Ollama Client is a Chrome extension that connects to your local Ollama server and lets you chat with LLMs directly in the browser.
There's no cloud, no sign-in, and no external API involved. Everything runs locally on your machine.
Supports models like:
LLaMA 2
Mistral
CodeLLaMA
DeepSeek
And any model available via Ollama
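
Under the hood, chatting with any of these models comes down to Ollama's local HTTP API. Here is a minimal sketch of a streaming chat request in TypeScript; the endpoint and payload shape follow Ollama's documented /api/chat, while the helper name and token handling are just illustration:

```ts
// Minimal sketch: stream a chat reply from a local Ollama server.
// Assumes Ollama is running on its default port, 11434.
async function streamChat(
  model: string,
  prompt: string,
  onToken: (token: string) => void
): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // Ollama streams newline-delimited JSON objects.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) onToken(chunk.message.content);
    }
  }
}

void streamChat("llama2", "Why is the sky blue?", (t) => console.log(t));
```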
Quick Links
Install on Chrome: Chrome Web Store
Source Code (MIT): GitHub (shishir435/ollama-client)
Landing Page: Ollama Client
Key Features
Multi-Session Chat
Create and manage multiple chat sessions with persistent storage using IndexedDB.
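
Dexie.js (listed under Built With below) makes this kind of schema straightforward. A hypothetical sketch follows; the table and field names are my illustration, not the extension's actual schema:

```ts
import Dexie, { type Table } from "dexie";

// Hypothetical schema for persisting chat sessions in IndexedDB.
interface ChatSession { id?: number; title: string; createdAt: number; }
interface ChatMessage {
  id?: number;
  sessionId: number;
  role: "user" | "assistant";
  content: string;
  createdAt: number;
}

class ChatDB extends Dexie {
  sessions!: Table<ChatSession, number>;
  messages!: Table<ChatMessage, number>;
  constructor() {
    super("ollama-client-demo");
    this.version(1).stores({
      sessions: "++id, createdAt",
      messages: "++id, sessionId, createdAt",
    });
  }
}

const db = new ChatDB();

// Create a session and append the first message to it.
async function newChat(firstMessage: string): Promise<number> {
  const sessionId = await db.sessions.add({
    title: "New chat",
    createdAt: Date.now(),
  });
  await db.messages.add({
    sessionId,
    role: "user",
    content: firstMessage,
    createdAt: Date.now(),
  });
  return sessionId;
}
```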
Full Model Configuration
Tweak:
temperature
top_k
top_p
repeat_penalty
stop sequences
system prompt
All settings are stored per model.
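
These all map onto the options object that Ollama's /api/chat and /api/generate endpoints accept. A sketch, with example values rather than the extension's defaults:

```ts
// Example per-model options, in the shape Ollama's API accepts.
const options = {
  temperature: 0.7,    // randomness of sampling
  top_k: 40,           // sample only from the k most likely tokens
  top_p: 0.9,          // nucleus-sampling cutoff
  repeat_penalty: 1.1, // penalize repeated tokens
  stop: ["</s>"],      // stop sequences
};

async function ask(question: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "mistral",
      messages: [
        { role: "system", content: "You are a concise assistant." }, // system prompt
        { role: "user", content: question },
      ],
      stream: false, // single JSON response instead of a stream
      options,
    }),
  });
  const data = await res.json();
  return data.message.content as string;
}
```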
Prompt Templates
Built-in templates for:
Summarization
Code explanation
Translation
Custom system prompts
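
A template is essentially a function from input text to a prompt. Purely for illustration (these are not the extension's actual template strings):

```ts
// Illustrative prompt templates; the real built-ins may differ.
const templates: Record<string, (input: string) => string> = {
  summarize: (text) =>
    `Summarize the following text in three bullet points:\n\n${text}`,
  explainCode: (code) =>
    `Explain what this code does, step by step:\n\n${code}`,
  translate: (text) =>
    `Translate the following text to English:\n\n${text}`,
};

const prompt = templates.summarize(
  "IndexedDB is a low-level browser API for storing structured data."
);
```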
Model Search & Pull
Search and install models from the Ollama library, and track pull progress right inside the extension.
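
Progress tracking is possible because Ollama's /api/pull endpoint streams status objects that include total and completed byte counts. A sketch of reading that stream (the helper and callback names are illustrative):

```ts
// Sketch: pull a model from the Ollama library and report progress.
async function pullModel(
  name: string,
  onProgress: (percent: number) => void
): Promise<void> {
  const res = await fetch("http://localhost:11434/api/pull", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: name, stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      // Each line is a JSON status object, e.g. { status, total?, completed? }.
      const status = JSON.parse(line);
      if (status.total && status.completed) {
        onProgress((status.completed / status.total) * 100);
      }
    }
  }
}
```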
Streaming Metadata
Every message includes:
Generation duration
Prompt + response token counts
Model config snapshot
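
These figures come from the final chunk of an Ollama stream: when done is true, the chunk carries fields such as total_duration, prompt_eval_count, and eval_count. A sketch of extracting them:

```ts
// Shape of the final streamed chunk from Ollama (fields we care about).
interface OllamaFinalChunk {
  done: boolean;
  total_duration?: number;    // nanoseconds spent on the whole request
  prompt_eval_count?: number; // prompt tokens evaluated
  eval_count?: number;        // response tokens generated
}

function readMetadata(chunk: OllamaFinalChunk) {
  if (!chunk.done) return null;
  return {
    durationMs: (chunk.total_duration ?? 0) / 1e6, // ns -> ms
    promptTokens: chunk.prompt_eval_count ?? 0,
    responseTokens: chunk.eval_count ?? 0,
  };
}
```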
Text-to-Speech (TTS)
Uses the browser's Web Speech API to read AI messages aloud (a minimal sketch follows the list below).
Advanced TTS options (coming soon):
Voice selection
Rate, pitch, and volume control
Interrupt vs queue playback
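
Basic playback is only a few lines with the Web Speech API; the commented knobs below correspond to the planned options. This is a sketch, not the extension's code:

```ts
// Read a message aloud using the browser's built-in speech synthesis.
function speak(text: string): void {
  const utterance = new SpeechSynthesisUtterance(text);
  const voices = speechSynthesis.getVoices();
  if (voices.length > 0) utterance.voice = voices[0]; // voice selection
  utterance.rate = 1.0;     // speaking speed
  utterance.pitch = 1.0;    // tone
  utterance.volume = 1.0;   // loudness
  speechSynthesis.cancel(); // interrupt any current playback (vs queueing)
  speechSynthesis.speak(utterance);
}
```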
Seamless CORS Handling
No need to manually set OLLAMA_ORIGINS=* anymore. The extension uses the Chrome declarativeNetRequest (DNR) API to safely proxy requests to your local Ollama server.
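
By way of illustration, a DNR rule can rewrite the Origin header on requests to the local server so Ollama accepts them without a wildcard OLLAMA_ORIGINS. The rule below is my sketch of that general approach, not the extension's actual rule set:

```ts
// Sketch: a dynamic DNR rule that rewrites the Origin header for
// requests to the local Ollama server (default port 11434).
chrome.declarativeNetRequest.updateDynamicRules({
  removeRuleIds: [1], // replace any previous version of this rule
  addRules: [
    {
      id: 1,
      priority: 1,
      action: {
        type: chrome.declarativeNetRequest.RuleActionType.MODIFY_HEADERS,
        requestHeaders: [
          {
            header: "Origin",
            operation: chrome.declarativeNetRequest.HeaderOperation.SET,
            value: "http://localhost:11434",
          },
        ],
      },
      condition: {
        urlFilter: "http://localhost:11434/*",
        resourceTypes: [
          chrome.declarativeNetRequest.ResourceType.XMLHTTPREQUEST,
        ],
      },
    },
  ],
});
```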
Built With
TypeScript + React
shadcn/ui + Tailwind CSS
Dexie.js for chat storage
Plasmo for Chrome extension development
Mozilla Readability for content parsing
UI Preview
How to Get Started
Install Ollama: https://ollama.com
Then run a model from your terminal: ollama run llama2
Written by

Shishir Chaurasiya
A full-stack web developer from India.