Chatbot: Run Flowise and Ollama locally

Prerequisites
Server: pull and start the models with Ollama:
ollama run llama3.2
ollama pull nomic-embed-text
(The chat model is run interactively; the embedding model only needs to be pulled.)
Install Flowise locally using npm:
npm install -g flowise
Start Flowise:
npx flowise start
If Flowise starts successfully, open http://localhost:3000 in your browser.
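Before wiring anything up in Flowise, it can help to sanity-check that Ollama itself is answering. A minimal Python sketch, assuming Ollama's default local API on port 11434 with its standard `/api/generate` and `/api/embeddings` endpoints; the helper names here are our own, not part of any library:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local API address


def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body shared by Ollama's generate/embeddings endpoints."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask_ollama(endpoint: str, payload: dict) -> dict:
    """POST a payload to a local Ollama endpoint and return the parsed JSON reply."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}{endpoint}",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Example usage (requires the Ollama server running locally):
#   reply = ask_ollama("/api/generate", build_payload("llama3.2", "Say hello"))
#   print(reply["response"])
#   emb = ask_ollama("/api/embeddings", build_payload("nomic-embed-text", "hello"))
#   print(len(emb["embedding"]))
```

If both calls return JSON, the models Flowise will rely on are ready.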
Steps:
1. Create a document store.
2. Open the document store and select a document loader (e.g. a PDF file).
3. Preview and process the documents.
4. Configure embeddings using nomic-embed-text, which we pulled from the Ollama library.
5. Select the FAISS vector store.
6. Run the upsert and confirm it completes successfully.
7. The last step is to create a chat flow and add the nodes. We have now finished building a chatbot that runs Llama 3.2 locally through Ollama.
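Once the chat flow is saved, Flowise also exposes it over its prediction REST API, so the local chatbot can be queried from a script as well as the web UI. A minimal sketch, assuming Flowise on its default port 3000; the chatflow ID below is a placeholder you copy from the Flowise UI:

```python
import json
import urllib.request

FLOWISE_URL = "http://localhost:3000"  # default Flowise port
CHATFLOW_ID = "your-chatflow-id"       # placeholder: copy the real ID from the Flowise UI


def build_question(question: str) -> dict:
    """Request body expected by Flowise's prediction endpoint."""
    return {"question": question}


def query_chatflow(question: str) -> dict:
    """POST a question to the saved chat flow and return the parsed JSON reply."""
    req = urllib.request.Request(
        f"{FLOWISE_URL}/api/v1/prediction/{CHATFLOW_ID}",
        data=json.dumps(build_question(question)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Example usage (requires Flowise running with a saved chat flow):
#   print(query_chatflow("What does the uploaded PDF say?"))
```

Everything stays on localhost: Flowise handles retrieval from the FAISS store, and Ollama serves both the embedding and chat models.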
Written by Fadzali Bakar