DeepSeek: Your Local AI Powerhouse with Ollama


The world of Large Language Models (LLMs) is rapidly changing, and running them locally is becoming increasingly appealing. DeepSeek, a powerful open-source LLM, combined with Ollama, makes this process surprisingly simple. This guide will walk you through setting up DeepSeek on your own machine.
DeepSeek: A Powerful LLM
DeepSeek stands out for its strong coding abilities and general language understanding. It's open-source, allowing for transparency and community contributions, and, importantly, it can be run locally. It's also, jokingly, the first AI to take an AI's job xD (ChatGPT).
Ollama: Simplifying Local LLM Use
Ollama acts as a runtime and package manager for LLMs, making local deployment straightforward. It handles dependencies and configurations, letting you focus on using the model. Docker can also be used :)
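If you prefer the Docker route, the official ollama/ollama image works too; a minimal CPU-only sketch (port 11434 is Ollama's default API port):
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama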
Getting Started:
Install Ollama:
macOS: Open your terminal and run:
brew install ollama
Linux: Open your terminal and run:
curl -fsSL https://ollama.ai/install.sh | sh
Windows: Follow the installation instructions from the official Ollama website.
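Whichever route you take, you can confirm the install worked by checking the version:
ollama --version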
Pull DeepSeek:
- Open your terminal and run:
ollama pull deepseek-r1:1.5b
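Once the download finishes, you can confirm the model is available locally:
ollama list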
Run DeepSeek:
- In your terminal, run:
ollama run deepseek-r1:1.5b
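You can also skip the interactive session entirely; ollama run accepts a one-shot prompt as an argument:
ollama run deepseek-r1:1.5b "Write a python function that reverses a string"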
Interact in the Terminal:
- Ask questions or provide instructions. Example: "Write a python function that reverses a string."
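For reference, the kind of answer you'd hope to get back looks something like this (our own illustrative snippet, not captured model output):

```python
def reverse_string(s: str) -> str:
    # Slicing with a step of -1 walks the string backwards.
    return s[::-1]

print(reverse_string("hello"))  # olleh
```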
Stop DeepSeek:
- Press Ctrl+D (or type /bye) to exit the session.
Using DeepSeek in a Script:
You can integrate DeepSeek into your Python scripts with the ollama package (pip install ollama):

```python
import ollama

response = ollama.chat(
    model="deepseek-r1:1.5b",
    messages=[{"role": "user", "content": "Hello, how can you assist me?"}],
)
print(response["message"]["content"])
```
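If you want tokens to print as they're generated, just like the terminal experience, the same package supports streaming; a minimal sketch:

```python
import ollama

# stream=True returns an iterator of partial responses instead of one final message.
for chunk in ollama.chat(
    model="deepseek-r1:1.5b",
    messages=[{"role": "user", "content": "Explain recursion in one sentence."}],
    stream=True,
):
    print(chunk["message"]["content"], end="", flush=True)
```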
Using a Web UI:
For a better user experience, consider using Open WebUI: https://github.com/open-webui/open-webui
After installation, you can access the UI in your browser (http://localhost:3000 with the typical Docker setup). Note that http://localhost:11434/api/chat is Ollama's own API endpoint, which the UI talks to behind the scenes, not the UI itself.
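One common way to run it, adapted from the project's README (verify there, as flags can change over time); the --add-host flag lets the container reach the Ollama server running on your host:
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main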
Important Considerations:
Running LLMs locally requires sufficient hardware resources; the 1.5B model used here runs on most modern machines, but larger DeepSeek variants need considerably more RAM (and ideally a GPU).
Experimenting with different prompts will help you achieve better results.
DeepSeek and Ollama together provide a powerful and accessible way to use LLMs locally.
Models like DeepSeek will keep coming to market; it's us (coders) who have to keep improving :D. Do share your opinion on this statement in the comments!!
Till then, explore, take care, learn and enjoy 🔥
Extra links:
Discord: https://discord.gg/hackunited
Thanks a lot for checking this out, we hope that you have fun! :D