Getting Started with Koog, Docker Model Runner, Agentic Compose and the Docker MCP Gateway


Today I wanted to check if I could use JetBrains' Koog framework with Docker Model Runner and it ended up going a bit further than I had planned. So in this new blog post, we'll see how to create a small Koog agent specializing in ratatouille recipes (disclaimer: I'm French 😂). We'll be using:
- Koog: a framework for building AI agents in Kotlin
- Docker Model Runner: a Docker feature that allows deploying AI models locally, based on Llama.cpp
- Agentic Compose: a Docker Compose feature to easily integrate AI models into your applications
- Docker MCP Gateway: a gateway to access MCP (Model Context Protocol) servers from the Docker MCP Hub (you can also use it with your own MCP servers, but that will be the subject of another article)
Prerequisites: Kotlin project initialization
I use IntelliJ IDEA Community Edition to initialize the Kotlin project.
I use OpenJDK 23 and Gradle Kotlin DSL for project configuration.
Gradle Configuration
Here's my build configuration, `build.gradle.kts`:

```kotlin
plugins {
    kotlin("jvm") version "2.1.21"
    application
}

group = "kitchen.ratatouille"
version = "1.0-SNAPSHOT"

repositories {
    mavenCentral()
}

dependencies {
    testImplementation(kotlin("test"))
    implementation("ai.koog:koog-agents:0.3.0")
    implementation("org.slf4j:slf4j-simple:2.0.9")
}

application {
    mainClass.set("kitchen.ratatouille.MainKt")
}

tasks.test {
    useJUnitPlatform()
}

// Build a fat jar embedding the runtime classpath, so the image can run `java -jar app.jar`
tasks.jar {
    duplicatesStrategy = DuplicatesStrategy.EXCLUDE
    manifest {
        attributes("Main-Class" to "kitchen.ratatouille.MainKt")
    }
    from(configurations.runtimeClasspath.get().map { if (it.isDirectory) it else zipTree(it) })
}

kotlin {
    jvmToolchain(23)
}
```
Docker Compose Project Configuration
The new "agentic" feature of Docker Compose allows defining the models to be used by Docker Compose services. So, with the content below, I define that I will use the `hf.co/menlo/lucy-128k-gguf:q4_k_m` model from Hugging Face for my "Koog agent":

```yaml
models:
  app_model:
    model: hf.co/menlo/lucy-128k-gguf:q4_k_m
```
And I make the "link" between the `koog-app` service and the `app_model` model as follows, at the service level:

```yaml
    models:
      app_model:
        endpoint_var: MODEL_RUNNER_BASE_URL
        model_var: MODEL_RUNNER_CHAT_MODEL
```
Docker Compose will automatically inject the `MODEL_RUNNER_BASE_URL` and `MODEL_RUNNER_CHAT_MODEL` environment variables into the `koog-app` service, which allows the Koog agent to connect to the model.

If you entered interactive mode in the `koog-app` container, you could verify that the environment variables are properly defined with the command:

```shell
env | grep '^MODEL_RUNNER'
```

And you would get something like:

```
MODEL_RUNNER_BASE_URL=http://model-runner.docker.internal/engines/v1/
MODEL_RUNNER_CHAT_MODEL=hf.co/menlo/lucy-128k-gguf:q4_k_m
```
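Since the API exposed by Docker Model Runner is OpenAI-compatible, you could also sanity-check the endpoint directly from inside the container. This is a little sketch of mine (not part of the project); it assumes `curl` is available in the container, and it just prints a fallback message when no endpoint is reachable:

```shell
# Sketch: call the OpenAI-compatible chat completions endpoint exposed by
# Docker Model Runner, reusing the variables injected by Docker Compose.
BASE="${MODEL_RUNNER_BASE_URL:-http://model-runner.docker.internal/engines/v1/}"  # injected by Compose; default shown for illustration
BASE="${BASE%/}"                    # trim the trailing slash, like removeSuffix("/") in the Kotlin code later
URL="${BASE}/chat/completions"
echo "POST ${URL}"
curl -s --max-time 5 "${URL}" \
  -H "Content-Type: application/json" \
  -d "{\"model\":\"${MODEL_RUNNER_CHAT_MODEL:-}\",\"messages\":[{\"role\":\"user\",\"content\":\"Hello\"}]}" \
  || echo "(no Model Runner endpoint reachable from here)"
```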
It's entirely possible to define multiple models. The complete `compose.yaml` file looks like this:

```yaml
services:
  koog-app:
    build:
      context: .
      dockerfile: Dockerfile
    environment:
      SYSTEM_PROMPT: You are a helpful cooking assistant.
      AGENT_INPUT: How to cook a ratatouille?
    models:
      app_model:
        endpoint_var: MODEL_RUNNER_BASE_URL
        model_var: MODEL_RUNNER_CHAT_MODEL

models:
  app_model:
    model: hf.co/menlo/lucy-128k-gguf:q4_k_m
```
Dockerfile
Next, we'll need a `Dockerfile` to build the Docker image of our Koog application. It uses a multi-stage build to optimize the final image size, so it's divided into two parts/stages: one for building the application (`build`) and one for execution (`runtime`). Here's the content of the `Dockerfile`:

```dockerfile
# Stage 1: Build
FROM eclipse-temurin:23-jdk-noble AS build
WORKDIR /app
COPY gradlew .
COPY gradle/ gradle/
COPY build.gradle.kts .
COPY settings.gradle.kts .
RUN chmod +x ./gradlew
COPY src/ src/
# Build
RUN ./gradlew clean build

# Stage 2: Runtime
FROM eclipse-temurin:23-jre-noble AS runtime
WORKDIR /app
COPY --from=build /app/build/libs/ratatouille-1.0-SNAPSHOT.jar app.jar
CMD ["java", "-jar", "app.jar"]
```
Kotlin side: Connecting to Docker Model Runner

Now, here's the source code of our application, in the `src/main/kotlin/Main.kt` file, to be able to use Docker Model Runner. The API exposed by Docker Model Runner is compatible with the OpenAI API, so we'll use Koog's OpenAI client to interact with our model:

```kotlin
package kitchen.ratatouille

import ai.koog.prompt.executor.clients.openai.OpenAIClientSettings
import ai.koog.prompt.executor.clients.openai.OpenAILLMClient

suspend fun main() {
    val apiKey = "nothing" // Docker Model Runner doesn't require a real API key
    val customEndpoint = System.getenv("MODEL_RUNNER_BASE_URL").removeSuffix("/")
    val model = System.getenv("MODEL_RUNNER_CHAT_MODEL")

    val client = OpenAILLMClient(
        apiKey = apiKey,
        settings = OpenAIClientSettings(customEndpoint)
    )
}
```
First Koog Agent
Creating an agent with Koog is relatively simple, as you can see in the code below. We'll need:

- a `SingleLLMPromptExecutor` that will use the OpenAI client we created previously to execute requests to the model
- an `LLModel` that defines the model we're going to use
- an `AIAgent` that encapsulates the model and the prompt executor to execute requests

Regarding the prompt, I use the `SYSTEM_PROMPT` environment variable to define the agent's system prompt, and `AGENT_INPUT` to define the agent's input (the "user message"). These variables were defined in the `compose.yaml` file previously:

```yaml
    environment:
      SYSTEM_PROMPT: You are a helpful cooking assistant.
      AGENT_INPUT: How to cook a ratatouille?
```
And here's the complete code of the Koog agent in the `src/main/kotlin/Main.kt` file:

```kotlin
package kitchen.ratatouille

import ai.koog.agents.core.agent.AIAgent
import ai.koog.prompt.executor.clients.openai.OpenAIClientSettings
import ai.koog.prompt.executor.clients.openai.OpenAILLMClient
import ai.koog.prompt.executor.llms.SingleLLMPromptExecutor
import ai.koog.prompt.llm.LLMCapability
import ai.koog.prompt.llm.LLMProvider
import ai.koog.prompt.llm.LLModel

suspend fun main() {
    val apiKey = "nothing" // Docker Model Runner doesn't require a real API key
    val customEndpoint = System.getenv("MODEL_RUNNER_BASE_URL").removeSuffix("/")
    val model = System.getenv("MODEL_RUNNER_CHAT_MODEL")

    val client = OpenAILLMClient(
        apiKey = apiKey,
        settings = OpenAIClientSettings(customEndpoint)
    )

    val promptExecutor = SingleLLMPromptExecutor(client)

    val llmModel = LLModel(
        provider = LLMProvider.OpenAI,
        id = model,
        capabilities = listOf(LLMCapability.Completion)
    )

    val agent = AIAgent(
        executor = promptExecutor,
        systemPrompt = System.getenv("SYSTEM_PROMPT"),
        llmModel = llmModel,
        temperature = 0.0
    )

    val recipe = agent.run(System.getenv("AGENT_INPUT"))
    println("Recipe:\n $recipe")
}
```
Running the project
All that's left is to launch the project with the following command:

```shell
docker compose up --build --no-log-prefix
```
⏳⏳⏳ Then wait a moment: depending on your machine, the build and completion times will be more or less long. I nevertheless chose Lucy 128k because it can run on small configurations, even without a GPU. This model also has the advantage of being quite good at "function calling" detection despite its small size (however, it doesn't support parallel tool calls). You should finally get something like this in the console:
```
Recipe:
Sure! Here's a step-by-step guide to cooking a classic ratatouille:
---
### **Ingredients**
- 2 boneless chicken thighs or 1-2 lbs rabbit (chicken is common, but rabbit is traditional)
- 1 small onion (diced)
- 2 garlic cloves (minced)
- 1 cup tomatoes (diced)
- 1 zucchini (sliced)
- 1 yellow squash or eggplant (sliced)
- 1 bell pepper (sliced)
- 2 medium potatoes (chopped)
- 1 red onion (minced)
- 2 tbsp olive oil
- 1 tbsp thyme (or rosemary)
- Salt and pepper (to taste)
- Optional: 1/4 cup wine (white or red) to deglaze the pan
---
### **Steps**
1. **Prep the Ingredients**
   - Dice the onion, garlic, tomatoes, zucchini, squash, bell pepper, potatoes.
   - Sauté the chicken in olive oil until browned (about 10–15 minutes).
   - Add the onion and garlic, sauté for 2–3 minutes.
2. **Add Vegetables & Flavor**
   - Pour in the tomatoes, zucchini, squash, bell pepper, red onion, and potatoes.
   - Add thyme, salt, pepper, and wine (if using). Stir to combine.
   - Add about 1 cup water or stock to fill the pot, if needed.
3. **Slow Cook**
   - Place the pot in a large pot of simmering water (or use a Dutch oven) and cook on low heat (around 200°F/90°C) for about 30–40 minutes, or until the chicken is tender.
   - Alternatively, use a stovetop pot with a lid to cook the meat and vegetables together, simmering until the meat is cooked through.
4. **Finish & Serve**
   - Remove the pot from heat and let it rest for 10–15 minutes to allow flavors to meld.
   - Stir in fresh herbs (like rosemary or parsley) if desired.
   - Serve warm with crusty bread or on the plate as is.
---
### **Tips**
- **Meat Variations**: Use duck or other meats if you don't have chicken.
- **Vegetables**: Feel free to swap out any vegetables (e.g., mushrooms, leeks).
- **Liquid**: If the mixture is too dry, add a splash of water or stock.
- **Serving**: Ratatouille is often eaten with bread, so enjoy it with a side of crusty bread or a simple salad.
Enjoy your meal! 🥘
```
As you can see, it's quite simple to create an agent with Koog and Docker Model Runner! 🎉

But we have a problem: I told you I was French, and the ratatouille recipe proposed by Lucy 128k doesn't really suit me: there's no rabbit, chicken, or duck in a ratatouille!!! 🤬🤬🤬 Let's see how to fix that.
Let's add superpowers to our Koog agent with the Docker MCP Gateway
What I'd like to do now is have my application first search for information about ratatouille ingredients, and then have the Koog agent use this information to improve the recipe. For this, I'd like to use the DuckDuckGo MCP server that's available on the Docker MCP Hub. And to make my life easier, I'm going to use the Docker MCP Gateway to access this MCP server.
Configuring the Docker MCP Gateway in Docker Compose
To use the Docker MCP Gateway, I'll first modify the `compose.yaml` file to add the gateway configuration.

Configuring the gateway in the compose.yaml file

Here's the configuration I added for the gateway in the `compose.yaml` file:
```yaml
  mcp-gateway:
    image: docker/mcp-gateway:latest
    command:
      - --port=8811
      - --transport=sse
      - --servers=duckduckgo
      - --verbose
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
```
This configuration will create an `mcp-gateway` service that listens on port `8811` and uses the `sse` (Server-Sent Events) transport to expose the MCP servers it manages.

Important:

- with `--servers=duckduckgo`, I can filter the available MCP servers to only use the DuckDuckGo server
- the MCP Gateway will automatically pull the available MCP servers from the Docker MCP Hub
The MCP Gateway is an open source project that you can find here: https://github.com/docker/mcp-gateway
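By the way, if you want to observe what the gateway does while debugging, a variation of mine (not required by this project) is to also publish the port on the host; you can then peek at the event stream with `curl -N http://localhost:8811/sse`:

```yaml
  mcp-gateway:
    image: docker/mcp-gateway:latest
    command:
      - --port=8811
      - --transport=sse
      - --servers=duckduckgo
      - --verbose
    ports:
      - "8811:8811"   # hypothetical addition: expose the gateway to the host for debugging only
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
```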
Next, I'll modify the `koog-app` service so it can communicate with the gateway, by adding the `MCP_HOST` environment variable that points to the gateway URL, as well as the dependency on the `mcp-gateway` service:

```yaml
    environment:
      MCP_HOST: http://mcp-gateway:8811/sse
    depends_on:
      - mcp-gateway
```
I'll also modify the system prompt and user message:

```yaml
    environment:
      SYSTEM_PROMPT: |
        You are a helpful cooking assistant.
        Your job is to understand the user prompt and decide if you need to use tools to run external commands.
      AGENT_INPUT: |
        Search for the ingredients to cook a ratatouille, max result 1
        Then, from these found ingredients, generate a yummy ratatouille recipe
        Do it only once
```
So here's the complete `compose.yaml` file, with the MCP Gateway configuration and the modifications made to the `koog-app` service:

```yaml
services:
  koog-app:
    build:
      context: .
      dockerfile: Dockerfile
    environment:
      SYSTEM_PROMPT: |
        You are a helpful cooking assistant.
        Your job is to understand the user prompt and decide if you need to use tools to run external commands.
      AGENT_INPUT: |
        Search for the ingredients to cook a ratatouille, max result 1
        Then, from these found ingredients, generate a yummy ratatouille recipe
        Do it only once
      MCP_HOST: http://mcp-gateway:8811/sse
    depends_on:
      - mcp-gateway
    models:
      app_model:
        # NOTE: populate the environment variables with the model runner endpoint and model name
        endpoint_var: MODEL_RUNNER_BASE_URL
        model_var: MODEL_RUNNER_CHAT_MODEL

  mcp-gateway:
    image: docker/mcp-gateway:latest
    command:
      - --port=8811
      - --transport=sse
      - --servers=duckduckgo
      - --verbose
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock

models:
  app_model:
    model: hf.co/menlo/lucy-128k-gguf:q4_k_m
```
Now, let's modify the Kotlin code to use the MCP Gateway and search for ratatouille ingredients.
Modifying the Kotlin code to use the MCP Gateway
The modification is extremely simple; you just need to:

1. define the MCP transport (`SseClientTransport`) with the gateway URL:

```kotlin
val transport = McpToolRegistryProvider.defaultSseTransport(System.getenv("MCP_HOST"))
```

2. create the MCP tools registry from the transport:

```kotlin
val toolRegistry = McpToolRegistryProvider.fromTransport(
    transport = transport,
    name = "sse-client",
    version = "1.0.0"
)
```

3. and finally, pass the tools registry to the Koog agent constructor:

```kotlin
toolRegistry = toolRegistry
```

Extremely important: I added `capabilities = listOf(LLMCapability.Completion, LLMCapability.Tools)` to the LLM model definition, because we're going to use its "function calling" capabilities (the tools are defined and provided by the MCP server).
Here's the complete code of the Koog agent, modified to use the MCP Gateway, in the `src/main/kotlin/Main.kt` file:

```kotlin
package kitchen.ratatouille

import ai.koog.agents.core.agent.AIAgent
import ai.koog.agents.mcp.McpToolRegistryProvider
import ai.koog.prompt.executor.clients.openai.OpenAIClientSettings
import ai.koog.prompt.executor.clients.openai.OpenAILLMClient
import ai.koog.prompt.executor.llms.SingleLLMPromptExecutor
import ai.koog.prompt.llm.LLMCapability
import ai.koog.prompt.llm.LLMProvider
import ai.koog.prompt.llm.LLModel

suspend fun main() {
    val transport = McpToolRegistryProvider.defaultSseTransport(System.getenv("MCP_HOST"))

    // Create a tool registry with tools from the MCP server
    val toolRegistry = McpToolRegistryProvider.fromTransport(
        transport = transport,
        name = "sse-client",
        version = "1.0.0"
    )
    println(toolRegistry.tools)

    val apiKey = "nothing" // Docker Model Runner doesn't require a real API key
    val customEndpoint = System.getenv("MODEL_RUNNER_BASE_URL").removeSuffix("/")
    val model = System.getenv("MODEL_RUNNER_CHAT_MODEL")

    val client = OpenAILLMClient(
        apiKey = apiKey,
        settings = OpenAIClientSettings(customEndpoint)
    )

    val promptExecutor = SingleLLMPromptExecutor(client)

    val llmModel = LLModel(
        provider = LLMProvider.OpenAI,
        id = model,
        capabilities = listOf(LLMCapability.Completion, LLMCapability.Tools)
    )

    val agent = AIAgent(
        executor = promptExecutor,
        systemPrompt = System.getenv("SYSTEM_PROMPT"),
        llmModel = llmModel,
        temperature = 0.0,
        toolRegistry = toolRegistry
    )

    val recipe = agent.run(System.getenv("AGENT_INPUT"))
    println("Recipe:\n $recipe")
}
```
Launching the project with the MCP Gateway
Let's launch the project again with the command:

```shell
docker compose up --build --no-log-prefix
```
And after a while, you should get a new ratatouille recipe, but this time the LLM will have relied on the search results returned by the DuckDuckGo MCP server (via the MCP Gateway) to improve the recipe. The response time will be a bit longer, because the LLM first queries the MCP server to get the ratatouille ingredients, then generates the recipe. The DuckDuckGo MCP server searches for links and then retrieves the content of those links (indeed, it exposes 2 tools: `search` and `fetch_content`).
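For the curious, here's roughly what happens under the hood when Koog builds the tool registry: MCP is a JSON-RPC based protocol, and tool discovery goes through its `tools/list` method. This is a sketch of the request (the method name comes from the MCP specification; the framing over SSE is handled for you by the Koog client):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list"
}
```

The gateway answers with a `result.tools` array (here, one entry for `search` and one for `fetch_content`, each with a JSON Schema describing its parameters), which Koog turns into the tool registry the agent can call.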
Here's an example of what you might get with an improved and more "authentic" ratatouille recipe:
```
Recipe:
Here's a **complete and easy-to-follow version** of **Ratatouille**, based on the recipe you provided, with tips and variations to suit your preferences:
---
### 🔍 **What Is Ratatouille?**
A classic French vegetable stew, traditionally made with eggplant, tomatoes, zucchini, bell peppers, onions, and mushrooms. It's often seasoned with herbs like parsley, thyme, or basil and paired with crusty bread or a light sauce.
---
### 🍲 **Ingredients** (for 4 servings):
- **1/2 cup olive oil** (divided)
- **2 tbsp olive oil** (for the skillet)
- **3 cloves garlic**, minced
- **1 eggplant**, cubed
- **2 zucchinis**, sliced
- **2 large tomatoes**, chopped
- **2 cups fresh mushrooms**, sliced
- **1 large onion**, sliced
- **1 green or red bell pepper**, sliced
- **1/2 tsp dried parsley**
- **Salt to taste**
- **1/2 cup grated Parmesan cheese** (or pecorino, as you mentioned)
---
### 🍳 **How to Make Ratatouille**
**Preheat oven** to 350°F (175°C).
1. **Prepare the dish**: Coat a 1½-quart casserole dish with 1 tbsp olive oil.
2. **Cook the base**: In a skillet, sauté garlic until fragrant (about 1–2 minutes). Add eggplant, parsley, and salt; cook for 10 minutes until tender.
3. **Layer the vegetables**: Spread the eggplant mixture in the dish, then add zucchini, tomatoes, mushrooms, onion, and bell pepper. Top with Parmesan.
4. **Bake**: Cover and bake for 45 minutes. Check for tenderness; adjust time if needed.
**Cook's Note**:
- Add mushrooms (optional) or omit for a traditional flavor.
- Use fresh herbs like thyme or basil if preferred.
- Substitute zucchini with yellow squash or yellow bell pepper for color.
---
### 🍽️ **How to Serve**
- **Main dish**: Serve with crusty French bread or rice.
- **Side**: Pair with grilled chicken or fish.
- **Guilt-free twist**: Add black olives or a sprinkle of basil/others for a lighter version.
---
```
Conclusion
This blog post perfectly illustrates the modern containerized AI ecosystem that Docker is building. By combining Docker Model Runner, Agentic Compose, Docker MCP Gateway and the Koog framework (but we could of course use other frameworks), we were able to create an "intelligent" agent quite simply.
Docker Model Runner allowed us to use an AI model locally.
Agentic Compose simplified the integration of the model into our application, by automatically injecting the necessary environment variables.
The Docker MCP Gateway transformed our little agent into a system capable of interacting with the outside world.
The Koog framework allowed us to orchestrate these components in Kotlin.
In the near future, I'll go deeper into the MCP Gateway and how to use it with your own MCP servers, and not just with Koog. And I'll continue my explorations with Koog and Docker Model Runner.
Acknowledgments: a big thank you to David Gageot and Guillaume Lours for their attentive listening and especially their patience 😉.
The entire source code of this project is available here: https://github.com/Short-Compendium/docker-model-runner-with-koog/tree/main/ratatouille
Written by Philippe Charrière
