AI-Driven Dockerfile Generator: Leverage Ollama and Gemini's LLMs

Aditya Khadanga

Introduction

In this tutorial, we'll explore how to leverage Large Language Models (LLMs), both locally and via the cloud, to generate Dockerfiles intelligently using Python. This hybrid project integrates:

  • Meta's LLaMA 3.2 via Ollama

  • Google Gemini 2.0 Flash API via Google AI Studio


๐Ÿ“ Project Structure

```
llm-gen-ai-project/
├── local-llm-ollama/
│   ├── dockerfile_generator.py
│   └── requirements.txt
├── hosted-llm-ollama/
│   ├── dockerfile_generator_gemini.py
│   └── requirements.txt
└── README.md
```

🧩 Project 1: Local LLM with Ollama

✅ What is Ollama?

Ollama is like Docker, but for local LLMs. It gives you a CLI and a model registry (similar to Docker Hub) from which you can pull and run LLMs on your own machine.

🔧 Setup Instructions

```
# Install Ollama CLI
curl -fsSL https://ollama.com/install.sh | sh

# Start the Ollama service
ollama serve

# Pull and run the LLaMA model
ollama pull llama3.2:1b
ollama run llama3.2:1b
```

```
# Set up the Python environment and run the script
sudo apt install python3
sudo apt install python3.12-venv
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
python3 dockerfile_generator.py
deactivate
```
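The article doesn't show the script itself, so here is a minimal sketch of what `dockerfile_generator.py` might look like, assuming the official `ollama` Python client is listed in `requirements.txt`. The prompt wording and function names are illustrative, not taken from the repo:

```python
# dockerfile_generator.py -- illustrative sketch, not the repo's actual code.
# Assumes: `pip install ollama` and a running `ollama serve` with llama3.2:1b pulled.

def build_prompt(language: str) -> str:
    """Compose the natural-language instruction sent to the model."""
    return (
        f"Generate an ideal Dockerfile for a {language} application. "
        "Use a sensible base image, add comments, and follow best practices."
    )

def generate_dockerfile(language: str, model: str = "llama3.2:1b") -> str:
    """Send the prompt to the local Ollama server and return the reply text."""
    import ollama  # lazy import: only needed when actually calling the model
    response = ollama.chat(
        model=model,
        messages=[{"role": "user", "content": build_prompt(language)}],
    )
    return response["message"]["content"]

# Example (requires a running `ollama serve`):
#   print(generate_dockerfile("Python"))
```

Keeping the prompt construction in its own function makes the natural-language part easy to tweak and test without touching the model call.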

🧩 Project 2: Hosted LLM with Google Gemini 2.0 Flash

๐ŸŒ What is Gemini?

Google Gemini 2.0 Flash is a powerful, cloud-hosted LLM with token-based pricing. You can use Google AI Studio to generate a free API key and integrate it into your Python app.

๐Ÿ› ๏ธ Setup Steps

```
python3 -m venv gemini_venv
source gemini_venv/bin/activate
pip install -r requirements.txt
export GOOGLE_API_KEY="your-key-here"
pip install --upgrade google-generativeai
python3 dockerfile_generator_gemini.py
deactivate  # when your task is finished
```
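As with the local version, the Gemini script isn't reproduced in the article. A minimal sketch, assuming the `google-generativeai` package installed above and `GOOGLE_API_KEY` exported in the environment (the prompt text and helper names are again illustrative):

```python
# dockerfile_generator_gemini.py -- illustrative sketch, not the repo's actual code.
# Assumes: `pip install google-generativeai` and GOOGLE_API_KEY set in the environment.
import os

def build_prompt(language: str) -> str:
    """Compose the natural-language instruction sent to Gemini."""
    return (
        f"Generate an ideal Dockerfile for a {language} application. "
        "Use a sensible base image, add comments, and follow best practices."
    )

def generate_dockerfile(language: str) -> str:
    """Call the hosted Gemini 2.0 Flash model and return the generated text."""
    import google.generativeai as genai  # lazy import: only needed for the API call
    genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
    model = genai.GenerativeModel("gemini-2.0-flash")
    return model.generate_content(build_prompt(language)).text

# Example (requires a valid API key and network access):
#   print(generate_dockerfile("Node.js"))
```

Note that only the five lines inside `generate_dockerfile` differ from the Ollama version; swapping providers is essentially a one-function change.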

📦 Features

  • 💡 Dynamic Dockerfile generation

  • ✅ Offline and online LLM integration

  • 🧠 Natural-language prompts inside Python

  • 🔐 Secure API usage

  • 📂 Clean code structure and virtual environments

๐Ÿ” SEO Keywords

AI Dockerfile generator, Ollama LLM tutorial, Google Gemini Python integration, Dockerfile automation with LLM, Meta LLaMA 3.2 Ollama, Google AI Studio API Key, offline AI tools, local LLM CLI, DevOps with AI, Python AI automation


📎 Conclusion

Whether you're working on cloud-native projects or developing offline tools, this project shows how LLMs can automate boilerplate tasks like writing Dockerfiles, intelligently and efficiently.

👉 Source code: https://github.com/aditya-khadanga/llm-genAI-Project
