MCP Server with Azure Functions (SQL edition): Part 3 Plus Bonus

Mustahid Ahmed

In Part 1 of this series, we covered the basics of deploying Azure Functions both locally and in the cloud, and introduced the Model Context Protocol (MCP) inspector. In Part 2, we went a step further and described how to define an MCP tool that takes arguments as an Azure Function. Now, in Part 3, we'll dive into how to build a set of MCP tools on Azure Functions that interact with a PostgreSQL database. We will also discuss how PostgreSQL needs to be configured so that the MCP tools running on Azure Functions can access it smoothly.

These tools can be used by an external system, such as an AI agent or the MCP inspector, to query and understand the structure and data of a PostgreSQL database hosted on Azure or any compatible Postgres instance. All the code is available in this GitHub repo. Clone it and use it as the working directory to follow along with this article.


🧠 What is the Model Context Protocol (MCP)?

The Model Context Protocol (MCP) is a specification designed to enable communication between language models and backend systems. It allows tools to expose capabilities via standardized interfaces, which can be discovered and invoked programmatically.

In our case, the Azure Function App acts as an MCP server, and each function exposes a specific capability for interacting with a PostgreSQL database:

  • Get list of databases

  • Get schema information

  • Execute read-only queries

  • Retrieve keys and constraints

Each function includes metadata like toolName, description, and toolProperties so that the MCP inspector can discover and use them.


🛠️ Azure Function Python Model V2 Overview

This implementation uses the Azure Functions Python v2 programming model, in which functions are defined with decorators, a more Pythonic approach than the v1 model.

We define generic triggers of type "mcpToolTrigger" to represent each tool. These triggers accept a context string containing input parameters and return JSON-formatted responses.
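For reference, the tool examples later in this post assume an app object created with the v2 model. A minimal sketch of that setup (Part 1 walks through the full project scaffolding):

import azure.functions as func

# All tools below are registered on this app object via @app.generic_trigger.
app = func.FunctionApp()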


📦 Tools Overview

Here's a summary of the available tools:

  • get_databases_tool: Retrieves all non-template databases from the PostgreSQL server.

  • get_schemas_tool: Gets column-level schema info for tables in the public schema.

  • query_data_tool: Executes read-only SQL queries provided by the user.

  • get_all_keys_tool: Retrieves key and constraint information for tables in the public schema.

🔧 Key Components

1. Parsing Context Arguments

To handle dynamic inputs (the context argument passed via the MCP inspector), we use _parse_context_args():

def _parse_context_args(context_str: str, expected_arg_names: list) -> dict:
    ...

It parses the incoming JSON string, validates required fields, and returns extracted arguments.
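The full implementation lives in the repo; as a rough sketch, assuming the trigger passes tool inputs nested under an "arguments" key in the context JSON (as the Azure Functions MCP samples do), it might look like this:

import json


def _parse_context_args(context_str: str, expected_arg_names: list) -> dict:
    """Parses the MCP context JSON and extracts the expected tool arguments."""
    try:
        payload = json.loads(context_str)
    except json.JSONDecodeError:
        return {"error": "Context is not valid JSON."}

    # Assumption: the MCP tool trigger nests tool inputs under "arguments".
    args = payload.get("arguments", {})

    missing = [name for name in expected_arg_names if name not in args]
    if missing:
        return {"error": f"Missing required argument(s): {', '.join(missing)}"}

    return {name: args[name] for name in expected_arg_names}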

2. Executing Queries

The _execute_query() function handles actual interaction with the database:

def _execute_query(query: str) -> str:
    ...

It connects to the PostgreSQL database using a shared connection manager (db_manager), executes the query, and formats the result as JSON.

✅ This ensures consistent handling of connections, execution, and logging.
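Again, the repo has the full version; a minimal sketch, reusing the db_manager module covered in the bonus section below, could look like this:

import json
import logging

import db_manager


def _execute_query(query: str) -> str:
    """Executes a query on a pooled connection and returns the rows as JSON."""
    conn = None
    try:
        conn = db_manager.get_db_connection()
        with conn.cursor() as cur:
            cur.execute(query)
            # Build a list of {column: value} dicts from the result set.
            columns = [desc[0] for desc in cur.description]
            rows = [dict(zip(columns, row)) for row in cur.fetchall()]
        return json.dumps({"rows": rows}, default=str)  # default=str handles dates etc.
    except Exception as error:
        logging.error(f"Query execution failed: {error}")
        return json.dumps({"error": str(error)})
    finally:
        if conn:
            db_manager.release_db_connection(conn)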


💡 Example: get_databases_tool

This function retrieves all non-template databases:

@app.generic_trigger(
    arg_name="context",
    type="mcpToolTrigger",
    toolName="get_databases_tool",
    description="Gets the list of all databases...",
    toolProperties="[]"
)
def get_databases_tool(context: str) -> str:
    query = "SELECT datname FROM pg_database WHERE datistemplate = false;"
    return _execute_query(query)

Note: toolProperties="[]" means it doesn’t require any input parameters.


💬 Example: query_data_tool

This tool allows users to run custom SELECT queries. These queries can be generated by the LLM:

_QUERY_SQL_PROPERTY = "sql_query"

tool_properties_query_data_object = [
    ToolProperty(_QUERY_SQL_PROPERTY, "string", "The SQL SELECT query to execute.")
]
tool_properties_query_data_json = json.dumps([prop.to_dict() for prop in tool_properties_query_data_object])

@app.generic_trigger(
    arg_name="context",
    type="mcpToolTrigger",
    toolName="query_data_tool",
    description="Runs read-only SQL queries...",
    toolProperties=tool_properties_query_data_json,
)
def query_data_tool(context: str) -> str:
    args = _parse_context_args(context, [_QUERY_SQL_PROPERTY])
    if "error" in args:
        return json.dumps(args)

    sql_query = args[_QUERY_SQL_PROPERTY]

    if not sql_query.strip().upper().startswith("SELECT"):
        return json.dumps({"error": "This tool is for SELECT queries only..."})

    return _execute_query(sql_query)

🔐 Security: We enforce that only SELECT queries are allowed, to prevent accidental mutations. Note that this prefix check is a lightweight safeguard rather than a complete defense; for production use, consider pairing it with a read-only database role.
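One more piece worth showing: ToolProperty is a small helper class defined in the project rather than something imported from the Azure Functions SDK. A minimal sketch in the style of the Azure Functions MCP samples:

class ToolProperty:
    """Describes one input parameter of an MCP tool."""

    def __init__(self, property_name: str, property_type: str, description: str):
        self.propertyName = property_name
        self.propertyType = property_type
        self.description = description

    def to_dict(self) -> dict:
        return {
            "propertyName": self.propertyName,
            "propertyType": self.propertyType,
            "description": self.description,
        }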


🧪 Local Testing & Debugging

You can find the details of local deployment in Part 1 (Deploying the MCP Server Code Locally). To test these functions locally, run:

source .venv/bin/activate
func start

Then invoke the endpoints manually or via a local MCP inspector client. MCP inspector usage is covered in Part 1.
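Note that db_manager (shown in the bonus section) reads POSTGRES_CONNECTION_STRING from the environment. When running locally, that variable typically goes in the Values section of local.settings.json, for example (placeholder values):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "POSTGRES_CONNECTION_STRING": "host=<host> port=5432 dbname=<db> user=<user> password=<password>"
  }
}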


☁️ Cloud Deployment

Refer to Part 1: Deploy to Azure Function (Cloud) for general cloud deployment.

Ensure your Function App has access to the PostgreSQL database and that environment variables are configured properly in the Azure portal. If you are using a connection string for PostgreSQL, make sure it has been added under Settings > Environment variables > App settings on your Function App's page.


🔎 MCP Inspector

Now we can view the deployed tools in the MCP inspector by pointing it at the deployed URI.

We can write our SQL query in the input cell, click `Run Tool`, and observe the result.

📚 Bonus: Database Connection Manager

When building serverless applications like Azure Functions, it’s crucial to manage external resources such as database connections efficiently. The following shared module sets up a connection pool to a PostgreSQL database and provides utilities to get and release connections. This ensures that we reuse connections effectively, which is especially important in a stateless, event-driven environment like Azure Functions.

✅ Full Code Overview

import logging
import os
import psycopg2
from psycopg2 import pool

db_pool = None


def init_db_pool():
    global db_pool
    if db_pool is None:
        try:
            conn_str = os.environ.get("POSTGRES_CONNECTION_STRING")
            if not conn_str:
                logging.error(
                    "DB_CONNECTION_STRING environment variable not set."
                )
                raise ValueError(
                    "Database connection string is not configured."
                )

            db_pool = pool.SimpleConnectionPool(1, 20, conn_str)
            logging.info("Database connection pool initialized successfully.")
        except (Exception, psycopg2.Error) as error:
            logging.error(
                f"Error while connecting to PostgreSQL or initializing pool: {error}"
            )
            db_pool = None
    return db_pool


def get_db_connection():
    """Gets a connection from the pool."""
    global db_pool
    if db_pool is None:
        init_db_pool()
        if db_pool is None:
            raise ConnectionError(
                "Database pool is not initialized. Check logs for errors."
            )
    try:
        conn = db_pool.getconn()
        if conn:
            logging.debug("Retrieved a connection from the pool.")
            return conn
        else:
            logging.error(
                "Failed to get connection from pool, pool might be exhausted or broken."
            )
            raise ConnectionError("Failed to get connection from pool.")
    except (Exception, psycopg2.Error) as error:
        logging.error(f"Error getting connection from pool: {error}")
        raise


def release_db_connection(conn):
    """Releases a connection back to the pool."""
    global db_pool
    if db_pool and conn:
        try:
            db_pool.putconn(conn)
            logging.debug("Released a connection back to the pool.")
        except (Exception, psycopg2.Error) as error:
            logging.error(f"Error releasing connection to pool: {error}")


init_db_pool()

📌 Key Components

1. db_pool – Global Connection Pool

A single instance of a connection pool is created and reused across function invocations. This helps reduce the overhead of establishing a new connection every time the function runs.

2. init_db_pool() – Initializes the Pool

  • Checks if a pool already exists.

  • If not, reads the POSTGRES_CONNECTION_STRING from environment variables.

  • Creates a SimpleConnectionPool with a minimum of 1 and maximum of 20 connections.

  • Logs success or failure accordingly.

⚠️ Important: Never hardcode credentials. Always use secure environment variables for secrets like connection strings.

3. get_db_connection() – Acquire a Connection

This function retrieves a connection from the pool. It ensures that the pool is initialized before trying to fetch a connection. If no connection can be retrieved, it raises an appropriate error.

4. release_db_connection(conn) – Return a Connection

After using a database connection, it must be returned to the pool using this function. Failing to do so may cause the pool to become exhausted over time.
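Putting it together, the typical pattern (used inside _execute_query() earlier) is to acquire a connection, do the work, and release it in a finally block so it always goes back to the pool:

conn = get_db_connection()
try:
    with conn.cursor() as cur:
        cur.execute("SELECT version();")
        print(cur.fetchone())
finally:
    # Always return the connection, even if the query raised an exception.
    release_db_connection(conn)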


🔁 Why Use a Connection Pool with Azure Functions?

Azure Functions are stateless, short-lived, and often run concurrently. Each invocation may need to access the database. Without a connection pool:

  • Every function call opens and closes a new database connection.

  • Opening too many connections can exhaust database resources.

  • Frequent connection setup/teardown increases latency and decreases performance.

With a connection pool, you can:

  • Reuse existing connections instead of creating new ones.

  • Reduce latency by avoiding repeated TCP handshakes and authentication.

  • Prevent hitting connection limits on the database side.

  • Improve scalability and reliability under concurrent load.

💡 Tip: Since Azure Functions may scale out to multiple instances, ensure your database allows enough concurrent connections to handle the expected number of function instances.


🧰 Future Enhancements

  • Add support for INSERT/UPDATE/DELETE operations via separate tools

  • Implement pagination and filtering for large datasets

  • Integrate authentication and role-based access control

  • Use OpenTelemetry for tracing and monitoring


✅ Conclusion

By building these MCP-compliant tools with Azure Functions (Python v2 model), you're enabling seamless integration between your PostgreSQL database and intelligent agents or inspectors that follow the Model Context Protocol.

Thank you for following along! If you found this series helpful, consider liking and sharing it with your team or community. Stay tuned — there’s more cloud, Python, and serverless content coming your way soon!

🚀 Happy coding!

