Getting Started with Microsoft Semantic Kernel: A Simple Guide to Prompts and Plugins in Python

Ever wished your app could think a little more like a human? Microsoft’s Semantic Kernel bridges the gap between traditional code and natural language. Let's talk about what it is, how it works, and how you can start building your own AI agents today with just a bit of C#, Python, or even Java.

🧠 What is Semantic Kernel?

Semantic Kernel is an open-source SDK by Microsoft that lets you combine code (your logic and plugins) with language models (like OpenAI's GPT or Azure OpenAI) to build AI-first applications.

Think of it like this:

Code gives structure, LLM gives context.

With SK, you can do things like:

  • Chain prompts and functions together (like little workflows).

  • Inject plugins that are traditional code (think: math, file I/O, DB access).

  • Give memory to your app so it can remember stuff.

  • Easily swap between OpenAI, Azure OpenAI, and HuggingFace models.

Let’s get started! 🏊‍♂️

In this blog, I’ll walk you through how to:

✅ Set up Semantic Kernel in Python
📦 Load semantic prompts from files
🧠 Optionally wire up memory
🎨 Use it to summarize content and generate poetry

➡️ Hands-on time!

Create a new project folder called Ms-Semantic-Kernel and use this structure:

(I have created a sample GitHub repo with all the code and prompt files mentioned in this blog)
👉 (Check out the repo here)

Ms-Semantic-Kernel/
├── app.py
├── kernel_config.py
├── .env
├── prompts/
│   └── summarize.txt
├── plugins/
│   └── time_plugin.py
├── memory/
│   └── memory_store.py
├── requirements.txt
└── README.md

🔑 .env File

Inside .env, place your OpenAI API key:

OPENAI_API_KEY=sk-your-openai-key
MODEL=gpt-3.5-turbo

📦 Installation

Install dependencies:

pip install -r requirements.txt
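The requirements.txt contents aren't shown above; a minimal version (package names inferred from the imports used in this post) would be:

```text
semantic-kernel
python-dotenv
```

The OpenAI client library is pulled in as a dependency of semantic-kernel's OpenAI connector, so it doesn't need to be listed separately.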

🧠 Code Walkthrough

📌 kernel_config.py:

This configures the core Kernel with OpenAI as the chat completion service. (Note: Semantic Kernel's Python API has changed across releases; the snippets in this post follow the 1.x API.)

import os

from semantic_kernel.kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

def build_kernel():
    kernel = Kernel()
    # Register OpenAI as a chat completion service on the kernel
    kernel.add_service(
        OpenAIChatCompletion(
            service_id="openai-gpt",
            api_key=os.getenv("OPENAI_API_KEY"),
            ai_model_id=os.getenv("MODEL", "gpt-3.5-turbo"),
        )
    )
    return kernel

📝 prompts/summarize.txt

A semantic prompt that tells the AI to summarize content into bullet points.

{{$input}}

Summarize the above content in 5 bullet points.

🔌 plugins/time_plugin.py

A native plugin that returns today’s date:


🧠 Optional Memory (if you want to use it later)

memory/memory_store.py:

In current Semantic Kernel versions, SemanticTextMemory also needs an embeddings generator to vectorize text, for example an OpenAI embedding service:

import os

from semantic_kernel.connectors.ai.open_ai import OpenAITextEmbedding
from semantic_kernel.memory import VolatileMemoryStore
from semantic_kernel.memory.semantic_text_memory import SemanticTextMemory

def get_memory():
    # In-memory store; swap for a persistent store in production
    return SemanticTextMemory(
        storage=VolatileMemoryStore(),
        embeddings_generator=OpenAITextEmbedding(
            ai_model_id="text-embedding-3-small",
            api_key=os.getenv("OPENAI_API_KEY"),
        ),
    )

🚀 Main Application: app.py

This ties everything together:

import asyncio
from dotenv import load_dotenv
import os

from kernel_config import build_kernel
from plugins.time_plugin import TimePlugin

load_dotenv()

async def main():
    kernel = build_kernel()

    # Load the semantic prompt from file
    with open("prompts/summarize.txt") as f:
        prompt = f.read()

    summarize = kernel.add_function(
        plugin_name="summarizer",
        function_name="summarizeText",
        prompt=prompt,
    )

    # Register the native plugin under the name "time"
    kernel.add_plugin(TimePlugin(), plugin_name="time")

    text = """
    Microsoft Semantic Kernel is an SDK that integrates AI with conventional programming.
    It enables prompt chaining, memory, and native plugin support.
    With SK, you can define prompt templates in text files,
    inject native Python functions as plugins, and build full workflows.
    """

    summary = await kernel.invoke(summarize, input=text)
    print("\n Summary:")
    print(summary)

    # Prompt that calls the native plugin from the template (dynamic content)
    poetic_prompt = """
    Today’s date is: {{time.get_current_date}}

    Write a short poem about the beauty of today.
    """
    poet = kernel.add_function(
        plugin_name="poetry",
        function_name="writePoem",
        prompt=poetic_prompt,
    )
    result = await kernel.invoke(poet)
    print("\n Poem:")
    print(result)

if __name__ == "__main__":
    asyncio.run(main())

✅ Output Example

Here’s the output from running the app:

🔍 Summary:

• Microsoft Semantic Kernel is an SDK.
• It blends AI with traditional programming.
• Enables prompt chaining and plugin support.
• Offers memory and prompt templating.
• Helps build intelligent workflows with LLMs.

🎨 Poem:

Today’s date is: 2025-05-16

The sky is bright, the breeze is light,  
Everything feels perfectly right.  
Nature whispers, colors gleam,  
Today unfolds like a waking dream.  
A lovely day in every way,  
Enjoy the gift, don't let it stray.

💡 Tips for Using Semantic Kernel Effectively

  1. Keep prompts simple and clean – Break them into functions if they’re long.

  2. Name your plugins and functions clearly – You’ll thank yourself later.

  3. Combine native + semantic – LLMs hallucinate, code doesn’t.

  4. Use memory for continuity – Great for conversations or document chains.

  5. Experiment and iterate – Prompt engineering is more art than science.

🧵 Final Thoughts

Semantic Kernel is not just another AI wrapper. It’s a thoughtful, extensible framework for building next-gen applications where AI is a first-class citizen.

If you're a developer who's ever thought:

“I wish GPT could just talk to my code.”

This is your toolkit.

❤️ Let’s Chat!

If you're building with Semantic Kernel or thinking about it, let’s connect! I’d love to see what you're working on, or help debug your prompts!

Follow me on X (Twitter) or drop a comment below.
Let’s make our apps smarter together.


Written by

Harshal Rembhotkar