JAN AI - A VS Code extension that runs LLMs 100% locally


It is an alternative to GitHub Copilot that runs entirely locally with the help of Ollama. It currently supports the DeepSeek R1 7B and DeepSeek R1 1.5B models, which ensures security and privacy. Because it is built into VS Code, it is easily accessible to developers without interrupting their development workflow.
It’s an open-source project under active development, and contributions are always welcome via GitHub; contribution details are available on the Kanban board in the GitHub Projects section.
The project’s future vision is to support more model families such as Gemma, Llama, Qwen, and more, along with deeper integration into the Visual Studio Code IDE.
Security and privacy
Security and privacy are core features of this project.
Unlike cloud-based AI assistants, there are no API calls to external services.
The LLM runs locally on the user’s system, ensuring there are no data leaks.
No data leaves your device, so there is zero risk of exposing sensitive code or data to a third-party service.
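Since inference goes through a local Ollama server, every request stays on the machine. Here is a minimal sketch of what such a local call looks like, assuming a standard Ollama installation listening on its default port (11434) and a pulled DeepSeek R1 model; the endpoint and payload follow Ollama’s `/api/generate` API, while the function name is just illustrative.

```python
import json
import urllib.request

# Ollama's default local endpoint -- note the localhost address:
# the request never leaves the machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "deepseek-r1:1.5b") -> urllib.request.Request:
    """Build a completion request for the local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )

if __name__ == "__main__":
    # Requires a running Ollama server with the model already pulled
    # (e.g. `ollama pull deepseek-r1:1.5b`).
    req = build_request("Explain list comprehensions in one sentence.")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["response"])
```

Because the target is `localhost`, sensitive code in the prompt is never transmitted to any third-party service.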
Open-source project
It’s an open-source project, and contributions are always welcome.
Development is tracked on a GitHub Kanban board in the Projects section.
New bugs and issues are monitored regularly.
Be a part of our JAN AI community.
Written by JAYASURYA R