Chat with Local LLMs in Your Browser: Introducing Ollama Client


🔒 100% local. No cloud. No API keys. Just AI running entirely on your machine.

If you're a developer or tech enthusiast using Ollama to run large language models like LLaMA 2, Mistral, or DeepSeek, this tool is for you.

I'm excited to introduce Ollama Client, a free, open-source Chrome extension that brings local LLM chat right into your browser.


🚀 What is Ollama Client?

Ollama Client is a Chrome extension that connects to your local Ollama server and lets you chat with LLMs directly in the browser.

There's no cloud, no sign-in, and no external API involved. Everything runs locally on your machine.
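Under the hood, talking to Ollama is plain HTTP against localhost. As a rough illustration, here's a minimal sketch of a single request to Ollama's /api/chat endpoint (not the extension's actual code):

```ts
// Minimal sketch: one non-streaming chat request to a local Ollama server,
// which listens on port 11434 by default.
async function chatOnce(model: string, prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model, // e.g. "llama2" or "mistral"
      messages: [{ role: "user", content: prompt }],
      stream: false, // ask for a single JSON response instead of a stream
    }),
  });
  const data = await res.json();
  return data.message.content; // the assistant's reply text
}

chatOnce("llama2", "Explain CORS in one sentence.").then(console.log);
```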

🧠 Supports models like:

  • LLaMA 2

  • Mistral

  • CodeLLaMA

  • DeepSeek

  • And any model available via Ollama



🔧 Key Features

💬 Multi-Session Chat

Create and manage multiple chat sessions with persistent storage using IndexedDB.
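For context, here's roughly what persistent multi-session storage can look like with Dexie.js, which the extension uses. The schema below is my own illustrative sketch, not the extension's actual one:

```ts
import Dexie, { type Table } from "dexie";

// Illustrative schema: a table of sessions and a table of messages keyed by session.
interface Session { id?: number; title: string; createdAt: number; }
interface Message { id?: number; sessionId: number; role: "user" | "assistant"; content: string; }

class ChatDB extends Dexie {
  sessions!: Table<Session, number>;
  messages!: Table<Message, number>;

  constructor() {
    super("chat-db"); // IndexedDB database name (hypothetical)
    this.version(1).stores({
      sessions: "++id, createdAt", // auto-increment key, index on createdAt
      messages: "++id, sessionId", // index on sessionId for fast per-session loads
    });
  }
}

const db = new ChatDB();

async function demo() {
  const sessionId = await db.sessions.add({ title: "New chat", createdAt: Date.now() });
  await db.messages.add({ sessionId, role: "user", content: "Hello!" });
  console.log(await db.messages.where({ sessionId }).toArray());
}
```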

πŸŽ›οΈ Full Model Configuration

Tweak:

  • temperature, top_k, top_p

  • repeat_penalty

  • stop sequences, system prompt

All settings are stored per model.
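These knobs map one-to-one onto the `options` object Ollama accepts with each request (plus a top-level `system` field). A sketch of a per-model settings record; the storage shape is my own assumption, but the field names are Ollama's:

```ts
// Sampling settings as Ollama's API expects them in the request's `options` field.
interface ModelSettings {
  temperature: number;    // randomness: 0 = deterministic, higher = more varied
  top_k: number;          // sample only from the k most likely tokens
  top_p: number;          // nucleus-sampling probability cutoff
  repeat_penalty: number; // values > 1 discourage repeating recent tokens
  stop: string[];         // sequences that terminate generation
  system?: string;        // system prompt, sent at the top level, not in options
}

// Hypothetical per-model store, keyed by model name.
const settings: Record<string, ModelSettings> = {
  mistral: {
    temperature: 0.7,
    top_k: 40,
    top_p: 0.9,
    repeat_penalty: 1.1,
    stop: ["</s>"],
    system: "You are a concise assistant.",
  },
};

// Build a /api/generate request body from the stored settings.
function buildRequestBody(model: string, prompt: string) {
  const { system, ...options } = settings[model];
  return { model, prompt, system, options };
}
```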

🧠 Prompt Templates

Built-in templates for:

  • Summarization

  • Code explanation

  • Translation

  • Custom system prompts
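A prompt template is just a parameterized prompt. A toy sketch of the idea (the template strings here are mine, not the extension's built-ins):

```ts
// Toy template mechanism: named prompts with a {{text}} placeholder.
const templates: Record<string, string> = {
  summarize: "Summarize the following text in three bullet points:\n\n{{text}}",
  explainCode: "Explain what this code does, step by step:\n\n{{text}}",
  translate: "Translate the following text into English:\n\n{{text}}",
};

function applyTemplate(name: string, text: string): string {
  return templates[name].replace("{{text}}", text);
}

console.log(applyTemplate("summarize", "Ollama Client is a Chrome extension that..."));
```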

πŸ” Model Search & Pull

Search and install models from the Ollama library, and track pull progress right inside the extension.
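Progress tracking is possible because Ollama's /api/pull endpoint streams its progress as newline-delimited JSON events. A minimal sketch of consuming that stream (error handling omitted):

```ts
// Pull a model and log download progress. /api/pull streams one JSON object
// per line; download events carry `completed`/`total` byte counts.
async function pullModel(name: string): Promise<void> {
  const res = await fetch("http://localhost:11434/api/pull", {
    method: "POST",
    body: JSON.stringify({ name }), // the model to pull, e.g. "mistral"
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop()!; // keep any partial trailing line for the next chunk
    for (const line of lines.filter(Boolean)) {
      const event = JSON.parse(line);
      if (event.total && event.completed != null) {
        const pct = Math.round((event.completed / event.total) * 100);
        console.log(`${event.status}: ${pct}%`);
      } else {
        console.log(event.status);
      }
    }
  }
}
```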

📊 Streaming Metadata

Every message includes:

  • Generation duration

  • Prompt + response token counts

  • Model config snapshot
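Ollama reports these itself in the final chunk of a streamed response, so nothing has to be estimated client-side. The relevant fields (names are from Ollama's API; durations are in nanoseconds):

```ts
// The last streamed chunk from /api/chat or /api/generate has done: true
// and carries timing and token-count metadata.
interface FinalChunk {
  done: boolean;
  total_duration: number;    // nanoseconds spent on the whole request
  prompt_eval_count: number; // tokens in the prompt
  eval_count: number;        // tokens in the generated response
}

function formatMetadata(chunk: FinalChunk): string {
  const seconds = (chunk.total_duration / 1e9).toFixed(1);
  return `${seconds}s, ${chunk.prompt_eval_count} prompt + ${chunk.eval_count} response tokens`;
}
```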

🔈 Text-to-Speech (TTS)

Uses the browser's Web Speech API to read AI messages aloud.

Advanced TTS options (coming soon):

  • Voice selection

  • Rate, pitch, and volume control

  • Interrupt vs queue playback
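All of this sits on the browser's built-in speechSynthesis API. A minimal sketch of the current behavior, with comments marking where the options listed above would plug in:

```ts
// Read an AI message aloud using the browser's built-in speech synthesis.
function speak(text: string): void {
  const utterance = new SpeechSynthesisUtterance(text);

  // The advanced options map to standard properties on the utterance:
  // utterance.voice = speechSynthesis.getVoices()[0]; // voice selection
  // utterance.rate = 1.0;    // speed
  // utterance.pitch = 1.0;   // pitch
  // utterance.volume = 1.0;  // volume

  speechSynthesis.cancel();         // interrupt any current playback...
  speechSynthesis.speak(utterance); // ...or drop cancel() to queue instead
}
```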

🧩 Seamless CORS Handling

No need to manually set OLLAMA_ORIGINS=* anymore. The extension uses Chrome's declarativeNetRequest (DNR) API to adjust requests to your local Ollama server so they pass its origin checks.
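For the curious, the usual DNR approach is to rewrite the Origin header on requests headed for the local server. A hypothetical sketch, assuming the declarativeNetRequest permission plus host access to localhost (this is not the extension's exact rule set):

```ts
// Hypothetical dynamic DNR rule: rewrite the Origin header on requests to the
// local Ollama server so they pass Ollama's origin check without OLLAMA_ORIGINS.
chrome.declarativeNetRequest.updateDynamicRules({
  removeRuleIds: [1], // replace any previous version of this rule
  addRules: [
    {
      id: 1,
      priority: 1,
      action: {
        type: chrome.declarativeNetRequest.RuleActionType.MODIFY_HEADERS,
        requestHeaders: [
          {
            header: "Origin",
            operation: chrome.declarativeNetRequest.HeaderOperation.SET,
            value: "http://localhost:11434",
          },
        ],
      },
      condition: {
        urlFilter: "|http://localhost:11434/",
        resourceTypes: [chrome.declarativeNetRequest.ResourceType.XMLHTTPREQUEST],
      },
    },
  ],
});
```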


πŸ› οΈ Built With

  • TypeScript + React

  • shadcn/ui + Tailwind CSS

  • Dexie.js for chat storage

  • Plasmo for Chrome extension development

  • Mozilla Readability for content parsing


📷 UI Preview


🧪 How to Get Started

  1. Install Ollama:
     👉 https://ollama.com

     Then pull and run a model:

     ollama run llama2

  2. Install the Ollama Client extension in Chrome, make sure the Ollama server is running (it listens on http://localhost:11434 by default), and start chatting.
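Once the server is up, a quick way to confirm it's reachable is Ollama's /api/tags endpoint, which lists your installed models (this runs in any browser dev-tools console):

```ts
// Connectivity check: list the models installed on the local Ollama server.
const res = await fetch("http://localhost:11434/api/tags");
const { models } = await res.json();
console.log(models.map((m: { name: string }) => m.name)); // e.g. ["llama2:latest"]
```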
    