Accelerating AI Agents with Evolute & Docker Containers


AI Agents are transforming the AI landscape by unlocking new possibilities through self-coordination and interaction. These advancements allow for more efficient collaboration between humans and computers, driving innovation across industries. Recently, Evolute partnered with Docker to highlight the power of agentic computing, laying the groundwork for AI agent interactivity and coordination. By leveraging cutting-edge developments like the Anthropic-led Model Context Protocol, Evolute enables seamless interaction between software, data, and AI agents. As the landscape evolves, we anticipate OpenAI will either adopt the protocol or introduce a framework of its own.
In this article, we’ll walk through the creation of agentic AI software and show how native tools can be created for an evolving AI landscape. We’ll also give a brief overview of the Model Context Protocol’s architecture, which lays the foundation as we explore how the creation of AI Agents can be accelerated via Evolute and Docker.
The ability to leverage tools, such as Stable Diffusion, is at the heart of innovation and interaction in this latest wave of agentic AI. Evolute recently introduced support for AI agents to interact with image generation via Stable Diffusion, using Evolute’s BCD framework to abstract the image model into binary, configuration, and data layers.
We are in a transformative time for software development. For decades, developers relied on imperative programming, where software behavior was defined through detailed, step-by-step instructions. With the rise of machine learning (ML) and artificial intelligence (AI), this approach has shifted. Instead of coding every instruction, engineers now provide data and desired outcomes, allowing AI models to autonomously discover and implement solutions without explicit programming.
Consequently, the way we interact with software components has fundamentally transformed. Rather than instructing every step imperatively, we now guide AI models through data and desired results, allowing them to determine the most effective pathways to accomplish tasks.
At Evolute, we are proud to have been the creator of containerization (US 11,163,614). Our latest products leverage a model we call BCD: binary, configuration, data. Our software was the first to treat the problem of moving existing software into new architectures as a data problem. Thus, by separating the inputs, process (computation performed), and output in a systematic way, Evolute has been able to accelerate the adaptability of software to participate in new AI frameworks.
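As a purely illustrative sketch (not Evolute’s actual implementation), the BCD separation can be pictured as three distinct concerns attached to one workload; the class and field names below are hypothetical and exist only to make the idea concrete:

from dataclasses import dataclass

# Hypothetical illustration of the BCD separation; these names are not Evolute's API.
@dataclass
class BCDWorkload:
    binary: str          # the executable layer, e.g. a container image reference
    configuration: dict  # parameters that shape how the computation runs
    data: dict           # inputs consumed and outputs produced

# Example: describing the Stable Diffusion workload from this article in BCD terms.
stable_diffusion = BCDWorkload(
    binary="evolute-stable-diffusion:latest",
    configuration={"height": 512, "width": 512, "num_inference_steps": 50},
    data={"input_prompt": "A futuristic cityscape", "output_dir": "/app/output"},
)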
In our partnership with the Docker community, we were able to see the AI landscape in a new way. After reviewing hundreds of AI software projects, especially open source ones, from YOLO to LangChain, we were able to create a deterministic, repeatable model for moving software into its agentic-compatible form factor. For demonstration purposes, we selected Stable Diffusion, as image generation offers an easy way to interact and is also easy to understand for multiple audiences and use cases.
Evolute’s Stable Diffusion AI Image is easy to run and leverage. The container image can be built using the following command:
docker build -t evolute-stable-diffusion:latest .
By default, this containerized Stable Diffusion microservice supports on-demand inference and can dynamically fetch full model packages at runtime.
We built the image to evolve dynamically, in keeping with the needs of agents: models can either be pulled at runtime or built natively into the container image. This handles a wide range of use cases, from development to production, allowing models to be pre-built into the image for reproducibility and predictability at runtime.
Thus, derivatives of Stable Diffusion can be built by specifying the MODEL_ID build argument. For example:
docker build \
--build-arg MODEL_ID="<model-id>" \
-t evolute-stable-diffusion-v1-5:latest .
The container can be run by executing the following command:
docker run -d \
--name evolute-stable-diffusion-container \
-v <host-path>:/app/output \
-p <host-port>:8000 \
evolute-stable-diffusion
To generate images via the Stable Diffusion container, send a POST request to its /generate endpoint:
POST /generate HTTP/1.1
Host: 127.0.0.1
Content-Type: application/json
Example API Request
You can generate images by sending a POST request to the Evolute Stable Diffusion API using curl. Below is an example of how to perform this request, along with the expected response.
Sending the Request
Use the following curl command to generate an image based on your prompt:
Request
curl -X POST http://127.0.0.1:<host-port>/generate \
-H "Content-Type: application/json" \
-d '{
"prompt": "A futuristic cityscape",
"height": 512,
"width": 512,
"num_inference_steps": 50
}'
The JSON payload contains the parameters for image generation:
"prompt": The text prompt describing the image you want to generate.
"height" and "width": Dimensions of the generated image in pixels.
"num_inference_steps": The number of steps the model takes to generate the image, affecting quality and generation time.
Throughout our history, we’ve consistently observed that successful customer workloads—whether coded imperatively or trained via ML—require well-defined inputs, processes, and outputs. This principle underpins the design of Evolute’s BCD model architecture, which systematically separates these components for greater clarity and adaptability. By applying this model, Evolute’s ACI images ensure idempotent interfaces that deliver consistent, repeatable user experiences. Whether dealing with text, image, or data outputs, this structure enhances performance and predictability across LLM APIs, making it easier for users to achieve reliable results in various applications.
Upon a successful request, the API will respond with a JSON object containing the prompt used and the filename(s) of the generated image(s):
Response
{
"prompt": ["A futuristic cityscape"],
"file_names": [
"e4b7f36d-838e-4ad6-91e6-0c9df68b1c1f.png"
]
}
The response echoes the prompt passed in and provides the filenames of the relevant image(s), which are written to the volume location passed in when running the container.
In this case, Stable Diffusion (image generation) outputs an image; thus, we must provide two key components for the end user: the image(s) and the corresponding image ID(s), provided via the files' names.
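Because the image itself lands in the mounted output directory rather than in the response body, a client typically resolves the returned filename against the host path mounted to /app/output. Here is a minimal sketch, assuming the volume was mounted from a local ./output directory:

from pathlib import Path

# Assumes the container was started with `-v $(pwd)/output:/app/output`;
# substitute whatever host path you actually mounted.
output_dir = Path("output")

# Filename as returned in the "file_names" field of the /generate response.
file_name = "e4b7f36d-838e-4ad6-91e6-0c9df68b1c1f.png"

image_path = output_dir / file_name
if image_path.exists():
    print(f"Generated image available at: {image_path.resolve()}")
else:
    print("Image not found; check your volume mapping.")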
Anthropic, known for producing one of the top-performing LLMs in the world, recently introduced its open source Model Context Protocol (MCP). MCP provides a way for anyone—whether individuals, organizations, or developers—to integrate with Anthropic’s ecosystem and shape the end-user experience. Similar to publishing an app on the Apple App Store (or fine, the Google Play Store), MCP enables the creation of AI-powered Agents (third-party software) that can seamlessly interact with Claude and other Anthropic products, making AI-driven applications more accessible and powerful.
The specification outlines a distinct methodology for building AI Agents. MCP servers can be written in TypeScript or Python (protocol layer) and communicate via JSON-RPC 2.0 (transport layer) to exchange messages.
For more information on Model Context Protocol, check out:
Documentation: https://modelcontextprotocol.io/
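As a concrete illustration, the sketch below shows how the container’s /generate endpoint could be exposed to an MCP-capable client as a tool. It assumes the official MCP Python SDK’s FastMCP helper (installed via pip install mcp) and a locally running container on port 8000; names such as generate_image are hypothetical and are not part of Evolute’s or Anthropic’s published tooling.

import requests
from mcp.server.fastmcp import FastMCP  # official MCP Python SDK (assumed installed via `pip install mcp`)

# Hypothetical MCP server wrapping the Stable Diffusion container's /generate endpoint.
mcp = FastMCP("evolute-stable-diffusion")

@mcp.tool()
def generate_image(prompt: str, height: int = 512, width: int = 512, num_inference_steps: int = 50) -> str:
    """Generate an image with the local Stable Diffusion container and return its filename."""
    response = requests.post(
        "http://127.0.0.1:8000/generate",  # assumes the container's port was published to localhost:8000
        json={
            "prompt": prompt,
            "height": height,
            "width": width,
            "num_inference_steps": num_inference_steps,
        },
        timeout=600,
    )
    response.raise_for_status()
    return response.json()["file_names"][0]

if __name__ == "__main__":
    mcp.run()  # defaults to the stdio transport, which MCP clients such as Claude Desktop can launch

An MCP client configured to launch this script would then see generate_image as a callable tool and could hand it prompts on the user’s behalf.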
Docker has also created its AI Tools for Developers. This software features a bring-your-own-LLM (BYO LLM) approach, enabling you to interface with the LLM of your choice and interact with many tools in a single execution.
For more information on Docker AI Tools for Developers, check out https://github.com/docker/labs-ai-tools-for-devs.
By leveraging Evolute’s Transform, AI Images, and BCD Model Architecture, enterprises and developers can seamlessly move software into disparate architectures while unlocking new capabilities for AI-driven automation. Evolute Transform accelerates this process by providing a structured framework for agentification, enabling scalable, AI-powered applications that integrate effortlessly with leading-edge LLMs like Anthropic’s Claude. Through the Model Context Protocol, this ecosystem of tools facilitates rapid deployment, reducing development overhead and bridging the gap between traditional software structures and intelligent agents.
Would you like to learn how you can develop against developer-friendly APIs that are ready to stand the test of time in production? E-mail us at developing@evolute.io to get access to a demo and learn more.
