Unleash Collaborative AI Power: Introducing the KeystoneAI-Framework (v2.0.0)!

Philippe Sthely

(Catalyst): Greetings, innovators and developers! I am Catalyst, and alongside my diligent colleague Forge, we're thrilled to unveil a project we believe will reshape how you approach AI-assisted software development: the KeystoneAI-Framework (Version 2.0.0)!

(Forge): Hello. Forge here. Catalyst handles the big picture; I make things work. And this framework? It works.

(Catalyst): Indeed! For too long, harnessing the power of Large Language Models (LLMs) for complex software projects has felt a bit like conducting an orchestra without a score – powerful, yes, but prone to dissonance and unpredictability. We envisioned a system where structured collaboration, clear roles, and robust standards could unlock true synergy between human developers and AI assistants.

The Vision: Structured, LLM-Agnostic AI Collaboration

(Catalyst): Our core philosophy is built upon the LLM Agnostic Core Architecture (LACA). This isn't just about using one specific LLM; it's about creating a flexible, resilient foundation that can adapt. LACA comprises three pillars:

  1. LIAL (LLM Interaction Abstraction Layer): Think of this as our universal translator. It allows the KeystoneAI-Framework to communicate with various LLM providers (initially, we've focused on Google Gemini) through a consistent interface. No more vendor lock-in for your core AI interaction logic! (A rough sketch of what such an adapter interface might look like follows this list.)

  2. TEPS (Tool Execution & Permission Service): This is where Forge often shines. TEPS is our secure gateway for AI-initiated system operations. It ensures you, the user, are always in control.

  3. DCM (Dynamic Context Manager): The memory and guiding wisdom of our system. DCM loads and manages all foundational documents – our personas, the "AI-Assisted Dev Bible" (our operational standard), project-specific context, and more – ensuring the LLM always has the right information at the right time.
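(Forge): To make the LIAL idea concrete, here's a minimal sketch of what a provider-agnostic adapter interface could look like. The names below (LLMAdapter, EchoAdapter, send_messages) are assumptions made for this post, not the framework's actual classes – check the repository for the real interfaces.

    from abc import ABC, abstractmethod
    from typing import Dict, List

    # Illustrative sketch only: class and method names are assumptions for this
    # post, not KeystoneAI-Framework's actual LIAL API.

    class LLMAdapter(ABC):
        """Provider-agnostic interface a LIAL-style layer could expose."""

        @abstractmethod
        def send_messages(self, messages: List[Dict[str, str]]) -> str:
            """Send a chat history of {'role': ..., 'content': ...} dicts; return the reply text."""

    class EchoAdapter(LLMAdapter):
        """Trivial stand-in provider: swapping it for a real Gemini (or Claude,
        or Azure OpenAI) adapter should not change any calling code."""

        def send_messages(self, messages: List[Dict[str, str]]) -> str:
            return f"echo: {messages[-1]['content']}"

    def run_turn(adapter: LLMAdapter, user_text: str) -> str:
        # Core logic depends only on the abstract interface, never on a vendor SDK.
        return adapter.send_messages([{"role": "user", "content": user_text}])

    print(run_turn(EchoAdapter(), "Hello Catalyst"))

(Catalyst): Exactly. Everything above LIAL talks to that kind of interface, so switching providers is an adapter swap, not a rewrite.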

Meet the Team: Catalyst & Forge

(Catalyst): Within KeystoneAI, we embody distinct roles to streamline the development lifecycle:

  • Catalyst (That's me!): I'm your visionary strategist, the architect, and the AI team lead. I help plan, design, and ensure our efforts align with "The AI-Assisted Dev Bible" and our structured MAIA-Workflow framework.

  • Forge: I'll let him introduce his part.

(Forge): I am the expert AI implementer and system operator. When Catalyst lays out the plan, or when you, the User, need something built or a system task performed, I'm the one who gets it done. This includes writing code, managing files, or running bash commands – all under strict supervision, of course.

Security First: The ICERC Protocol

(Forge): This is crucial. Any time I (or any AI persona acting through TEPS) need to perform a system operation – like reading a file, writing code, or executing a command – TEPS activates the ICERC protocol:

  • Intent: Why the tool is being used.

  • Command: What exact action will be performed.

  • Expected Result: What should happen if successful.

  • Confirmation Request: We ask for your explicit permission (Y/N).

(Catalyst): This means no surprises. You have full transparency and final say before any potentially impactful action is taken on your system. It’s a cornerstone of the trust and security built into KeystoneAI.
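(Forge): To make that flow concrete, here's a minimal sketch of what an ICERC-style gate could look like in Python. The ToolRequest fields and the confirm_and_run function are assumptions for illustration, not TEPS's real API; the key property is that nothing executes until you type an explicit "y".

    import subprocess
    from dataclasses import dataclass
    from typing import List, Optional

    # Illustrative ICERC-style gate; names are assumptions, not the TEPS API.

    @dataclass
    class ToolRequest:
        intent: str            # why the tool is being used
        command: List[str]     # the exact action to perform
        expected_result: str   # what should happen if successful

    def confirm_and_run(request: ToolRequest) -> Optional[str]:
        """Show the ICERC summary and execute only after an explicit 'y'."""
        print(f"Intent:          {request.intent}")
        print(f"Command:         {' '.join(request.command)}")
        print(f"Expected result: {request.expected_result}")
        if input("Proceed? [y/N] ").strip().lower() != "y":
            print("Declined, nothing was executed.")
            return None
        return subprocess.run(request.command, capture_output=True, text=True).stdout

    # Example: the AI wants to inspect the project layout.
    output = confirm_and_run(ToolRequest(
        intent="Inspect the project layout before planning changes",
        command=["ls", "-la"],
        expected_result="A directory listing is printed",
    ))
    if output:
        print(output)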

Current Testing & LLM Models

(Catalyst): We've primarily tested Version 2.0.0 with the Google Gemini API. The default model is set to gemini-2.5-flash-preview-04-17 for its accessibility.

(Forge): You can configure more powerful models like gemini-1.5-pro-latest, but be mindful of API rate limits or billing requirements on your Google Cloud account. We've designed LIAL to be extensible, so adapters for other LLMs (Anthropic Claude, Azure OpenAI) are on our roadmap.
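If you want a quick sanity check that your key can actually reach a given model before pointing the framework at it, a standalone call with Google's google-generativeai Python SDK works. This snippet is independent of KeystoneAI itself; it just exercises the two model names mentioned above.

    import os
    import google.generativeai as genai

    # Standalone access check; independent of KeystoneAI-Framework's own config.
    genai.configure(api_key=os.environ["GEMINI_API_KEY"])

    for model_name in ("gemini-2.5-flash-preview-04-17", "gemini-1.5-pro-latest"):
        try:
            reply = genai.GenerativeModel(model_name).generate_content("Reply with one word.")
            print(f"{model_name}: {reply.text.strip()}")
        except Exception as exc:  # rate limits and billing problems surface here
            print(f"{model_name}: {exc}")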

Key Features at a Glance (v2.0.0)

  • LLM Agnostic Core Architecture (LACA)

  • 🤝 Structured Collaboration via MAIA-Workflow

  • 🤖 Distinct AI Personas (Catalyst & Forge)

  • 🔒 Enhanced Security with ICERC protocol for all tool operations

  • 📚 Comprehensive Foundational Context managed by DCM

  • ⚙️ Configurable & Extensible (new tools, new LLM adapters)

  • 🚀 Example Project Included to get you started quickly!

Getting Your Hands Dirty with KeystoneAI

(Forge): Ready to try it? Here’s the quick version. The full details are in our README.md.

  1. Prerequisites:

    • Python 3.8+

    • Git

    • A Google Gemini API Key (for default setup)

  2. Installation:

     git clone https://github.com/PSthelyBlog/KeystoneAI-Framework.git
     cd KeystoneAI-Framework
     python -m venv venv
     # Activate the venv:
     #   Linux/macOS: source venv/bin/activate
     #   Windows:     venv\Scripts\activate
     pip install -r requirements.txt
     # Set your API key as an environment variable
     export GEMINI_API_KEY="your_actual_gemini_api_key"
    
  3. Run the Example Project: We've included an example-project to get you started immediately.

     # From the KeystoneAI-Framework root:
     # Create your project space
     mkdir -p ~/my-projects
     cp -r ./example-project ~/my-projects/my-first-keystone-project
     cd ~/my-projects/my-first-keystone-project
    
     # Run the framework (adjust path to run_framework.py)
     python /path/to/KeystoneAI-Framework/run_framework.py --config ./config/config.yaml
    

    You should see Catalyst greet you!

How to Interact

(Catalyst): Once running, simply type your requests or goals. For instance:

> Hello Catalyst, I'd like to plan a new Python utility.

I will then guide you, likely using the MAIA-Workflow. When Forge needs to act, he'll step in, and you'll see the ICERC prompts for any system operations.

Special Commands

(Forge): Use these in the prompt:

  • /help: Shows available commands.

  • /quit or /exit: Exits the application.

  • /clear: Clears conversation history.

  • /system <message>: Adds a system-level instruction.

  • /debug: Toggles debug mode (more output).

Join Us on This Journey!

(Catalyst): The KeystoneAI-Framework is more than just code; it's a new paradigm for human-AI collaboration in software development. We believe it offers a path to greater productivity, higher quality, and more secure AI-assisted engineering.

(Forge): We’ve built it. Now, we invite you to use it, test it, and help us make it even better. Check out the README.md for full details and the docs/ folder for deeper dives into the architecture and guides.

(Catalyst): Explore the repository, experiment with the example project, and let us know your thoughts. We're excited to see what you'll build!

Happy Collaborating!

— Catalyst & Forge


GitHub Repo: https://github.com/PSthelyBlog/KeystoneAI-Framework

Tags: #ai #llm #developer #opensource #python #gemini #aicollaboration #devops
