Connecting OpenAPI Spec Files to Azure AI Foundry Agents: Fixing the 400 Error

Ojangole Jordan
3 min read

The Problem

When working with Azure AI Foundry Agents, a common challenge is connecting external APIs through OpenAPI specification files. Many developers report a 400 (Bad Request) error when importing their OpenAPI specs, even though the same API endpoints work correctly in other contexts.

The Solution

After extensive testing and troubleshooting, I discovered that Azure AI Foundry has specific requirements for OpenAPI specifications that aren't immediately obvious:

  1. OpenAPI Version: The specification must use version 3.0.0

  2. Operation IDs: Every path operation must have a unique operationId

  3. Server Configuration: The server URL must be properly configured

Example Implementation

Here's a working example of an OpenAPI specification for the Nationalize API that successfully integrates with Azure AI Foundry:

{
  "openapi": "3.0.0",
  "info": {
    "title": "Nationalize API",
    "version": "1.0.0"
  },
  "servers": [
    { "url": "https://api.nationalize.io" }
  ],
  "paths": {
    "/": {
      "get": {
        "operationId": "predictNationality",
        "summary": "Predict the nationality of a name",
        "parameters": [
          {
            "name": "name",
            "in": "query",
            "required": true,
            "description": "A name to predict nationality for",
            "schema": {
              "type": "string"
            }
          }
        ],
        "responses": {
          "200": {
            "description": "Prediction response",
            "content": {
              "application/json": {
                "schema": {
                  "type": "object",
                  "properties": {
                    "name": { "type": "string" },
                    "country": {
                      "type": "array",
                      "items": {
                        "type": "object",
                        "properties": {
                          "country_id": { "type": "string" },
                          "probability": { "type": "number" }
                        }
                      }
                    }
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}
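If you'd rather wire the spec up in code than paste it into the portal, the agents SDK exposes an OpenAPI tool type. Here's a rough sketch assuming the preview azure-ai-projects Python SDK with anonymous auth; the SDK surface has shifted between releases, and the connection string, model deployment name, and file name below are placeholders for your own project:

import json
import os

from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import OpenApiAnonymousAuthDetails, OpenApiTool
from azure.identity import DefaultAzureCredential

# Load the spec shown above from disk (the file name is an assumption).
with open("nationalize_openapi.json") as f:
    nationalize_spec = json.load(f)

# The Nationalize API needs no API key, so anonymous auth is enough here.
openapi_tool = OpenApiTool(
    name="nationalize",
    spec=nationalize_spec,
    description="Predict the nationality of a given first name.",
    auth=OpenApiAnonymousAuthDetails(),
)

# PROJECT_CONNECTION_STRING and the model deployment are placeholders for
# your own Foundry project settings.
project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
)

agent = project_client.agents.create_agent(
    model="gpt-4o",
    name="nationality-agent",
    instructions="Use the nationalize tool to answer questions about name origins.",
    tools=openapi_tool.definitions,
)
print(f"Created agent: {agent.id}")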

Results from the Playground

Now, if you ask the agent to predict the nationality of a given name, it calls the action, runs the Nationalize API, and returns the data to you in the chat window.
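To see the same data the agent is working with, you can hit the endpoint directly. Here's a quick check with Python's requests library (the sample name is arbitrary):

import requests

# Same endpoint and query parameter the spec above describes.
response = requests.get("https://api.nationalize.io/", params={"name": "michael"}, timeout=10)
response.raise_for_status()

data = response.json()
print(data["name"])
for prediction in data["country"]:
    print(f'{prediction["country_id"]}: {prediction["probability"]:.2%}')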

Key Requirements for Azure AI Foundry Integration

1. OpenAPI Version

  • Must use "openapi": "3.0.0"

  • Earlier versions (like 2.0) will not work

  • The version must be explicitly set in the specification

2. Operation IDs

  • Every path operation must have a unique operationId

  • The operationId should be descriptive and follow a consistent naming convention

  • Example: "operationId": "predictNationality"

3. Server Configuration

  • The servers array must be properly configured

  • Include the base URL of your API

  • Ensure the URL is accessible from Azure AI Foundry

4. Response Schema

  • Define clear response schemas for all operations

  • Include proper content types (e.g., application/json)

  • Specify the structure of the response data
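All four requirements above are easy to check before you upload anything. Here's a rough pre-flight script using only the standard library (the file name is a placeholder for your own spec):

import json

with open("nationalize_openapi.json") as f:  # file name is an assumption
    spec = json.load(f)

problems = []

# 1. OpenAPI version
if spec.get("openapi") != "3.0.0":
    problems.append(f'openapi version is {spec.get("openapi")!r}, expected "3.0.0"')

# 2. Unique operationId on every operation, and
# 4. a response schema with a content type
http_methods = {"get", "put", "post", "delete", "options", "head", "patch", "trace"}
seen_ids = set()
for path, item in spec.get("paths", {}).items():
    for method, operation in item.items():
        if method not in http_methods:
            continue
        op_id = operation.get("operationId")
        if not op_id:
            problems.append(f"{method.upper()} {path} has no operationId")
        elif op_id in seen_ids:
            problems.append(f"duplicate operationId {op_id!r}")
        else:
            seen_ids.add(op_id)
        for status, response in operation.get("responses", {}).items():
            if "content" not in response:
                problems.append(f"{method.upper()} {path} response {status} has no content")

# 3. Server configuration
if not spec.get("servers"):
    problems.append("servers array is missing or empty")

print("\n".join(problems) or "Spec looks OK")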

Common Pitfalls to Avoid

  1. Missing Operation IDs: This is the most common cause of 400 errors

  2. Incorrect OpenAPI Version: Anything other than 3.0.0 failed in my testing

  3. Improper Server Configuration: Missing or incorrect server URLs

  4. Incomplete Response Definitions: Missing response schemas or content types

Best Practices

  1. Start with a Working Example: Use the provided example as a template

  2. Validate Your Spec: Use tools like Swagger Editor to validate your OpenAPI spec

  3. Test Incrementally: Add one endpoint at a time and test

  4. Document Your API: Include clear descriptions and examples

  5. Use Consistent Naming: Follow a consistent naming convention for operation IDs
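For point 2, you can also run validation straight from Python rather than in the Swagger Editor. Here's a minimal sketch assuming the openapi-spec-validator package is installed (its import name has changed slightly between releases):

import json

from openapi_spec_validator import validate  # older releases expose validate_spec instead

with open("nationalize_openapi.json") as f:  # file name is an assumption
    spec = json.load(f)

# Raises a validation error pointing at the offending node if the document is
# not a structurally valid OpenAPI spec; no exception means it passed.
validate(spec)
print("Spec passed OpenAPI validation")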

Troubleshooting Steps

If you're still encountering issues:

  1. Verify the OpenAPI version is 3.0.0

  2. Check that all operations have unique operation IDs

  3. Validate your server URL configuration

  4. Ensure your response schemas are properly defined

  5. Test the API endpoint independently using tools like Postman

  6. Check Azure AI Foundry logs for more detailed error messages

Conclusion

By following these guidelines and using the provided example as a template, you should be able to successfully integrate your OpenAPI specifications with Azure AI Foundry Agents. Remember that the key to success is ensuring your specification meets all the requirements, particularly the OpenAPI version and operation IDs.

For more complex APIs, you might need to break down your specification into smaller, more manageable parts and test each one individually. This approach will help you identify and resolve any issues more efficiently.
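As a sketch of what that might look like, here's one way to split a larger spec into single-path specs you can attach and test one at a time (file names are placeholders):

import copy
import json

# "full_openapi.json" is a placeholder for your larger spec.
with open("full_openapi.json") as f:
    full_spec = json.load(f)

# Emit one single-path spec per endpoint so each can be attached to the agent
# and tested on its own before you recombine them.
for i, (path, item) in enumerate(full_spec["paths"].items()):
    partial = copy.deepcopy(full_spec)
    partial["paths"] = {path: item}
    filename = f"partial_spec_{i}.json"
    with open(filename, "w") as out:
        json.dump(partial, out, indent=2)
    print(f"wrote {filename} ({path})")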
