🚀 Unleash the Power of Your API Gateway with AI: a simple guide to creating your first AI Gateway 🤖

Thomas Delafaye
5 min read

In today’s world, AI is revolutionizing the way we interact with technology.

One of its most powerful applications is in API gateways, where AI can enhance functionality, security, and the user experience.

In this new article, we’ll walk you through the steps to create your first AI provider and context entities, enabling you to leverage AI in your API management.

Step-by-Step Guide to Creating Your AI Provider and Context Entities

1. Create your Serverless Project

First, you need to sign up on Cloud APIM and create your first project to use our AI plugins.

If you haven't created one yet, head to your Cloud APIM dashboard and create a new Serverless project before continuing.

2. Clone or Fork our Git Repository

Go to our AI plugins template repository: https://github.com/cloud-apim/otoroshi-llm-extension-serverless-example

Clone it or fork it to your personal repository.

git clone https://github.com/cloud-apim/otoroshi-llm-extension-serverless-example.git

This repository will host all your Serverless project's configuration 😁
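Once cloned, move into the project directory. The entities/ai.json file referenced in the next steps lives inside this repository (a quick sketch, assuming you cloned it with the command above):

cd otoroshi-llm-extension-serverless-example
# the AI entities from steps 5 and 6 are defined in this file
cat entities/ai.json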

3. Choose Your AI Provider

Cloud APIM supports multiple AI providers through its plugins:

  • OpenAI

  • Azure OpenAI

  • Ollama

  • Mistral

  • Anthropic

Get an API token from your chosen provider in order to configure your AI Provider entity in your Cloud APIM Serverless project.
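If you picked OpenAI, for example, you can quickly check that the token works before wiring it into your project. This is only an optional sanity check against the provider's own API, using a placeholder token:

curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer sk-your-openai-token"

A successful response lists the models available to your account.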

4. Set up your environment variables

Now let's secure your AI Provider API Token.

To do this, we will store it in an environment variable defined as a secret for your project.

Go to your project's dashboard and click on 'Create a new environment variable'. Name it, for example, AI_PROVIDER_TOKEN, and paste your AI provider token into the value box below the name.

And just save it!

5. Set Up Your AI Provider Entity

Now, navigate to your entities/ai.json file.

Open it; you will need to set the provider name and its associated token.

Update the "provider" property, by default we set up "provider" : "openai"

Next, update the "connection" property with the API base URL and a "token" property that references your environment variable.

For example : "token": "${environment.AI_PROVIDER_TOKEN}"

Finally, define the model you would like to use by updating the "options" property.

Example with the GPT-3.5 Turbo model: "model": "gpt-3.5-turbo"

Here is an example of the 'AiProvider' entity:

{
    "_loc": {
        "tenant": "default",
        "teams": ["default"]
    },
    "id": "provider-openai",
    "name": "OpenAI",
    "description": "",
    "metadata": {},
    "tags": [],
    "provider": "openai",
    "connection": {
        "base_url": "https://api.openai.com",
        "token": "${environment.AI_PROVIDER_TOKEN}",
        "timeout": 30000
    },
    "options": {
        "model": "gpt-3.5-turbo",
        "frequency_penalty": null,
        "logit_bias": null,
        "logprobs": null,
        "top_logprobs": null,
        "max_tokens": null,
        "n": 1,
        "presence_penalty": null,
        "response_format": null,
        "seed": null,
        "stop": null,
        "stream": false,
        "temperature": 1,
        "top_p": 1,
        "tools": null,
        "tool_choice": null,
        "user": null
    },
    "kind": "AiProvider"
}

Do not forget to create the environment variable that holds your AI provider's token, and update its name in the entity if you chose a different one.

In this demonstration, we've chosen AI_PROVIDER_TOKEN as the environment variable name.

You could also store the API base URL (such as https://api.openai.com for OpenAI) or the model name in environment variables.
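For example, assuming you create variables named AI_PROVIDER_BASE_URL and AI_PROVIDER_MODEL (these names are only suggestions), you would reference them the same way: "base_url": "${environment.AI_PROVIDER_BASE_URL}" and "model": "${environment.AI_PROVIDER_MODEL}".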

6. Create the Context Entity

In your entities/ai.json file, add an 'AiPromptContext' entity. It holds the messages, here a system message describing Cloud APIM, that will be added to the prompts going through your route.

Here is an example of an 'AiPromptContext' entity:

{
    "_loc": {
        "tenant": "default",
        "teams": ["default"]
    },
    "id": "ai-context",
    "name": "AI Context",
    "description": "",
    "metadata": {},
    "tags": {},
    "messages": [
        {
            "role": "system",
            "content": "Cloud APIM is a french company that provides API Management as a Service through managed Otoroshi instances, Serverless projects, etc"
        }
    ],
    "kind": "AiPromptContext"
}

7. Create your API endpoint

Don't forget to add the LLM proxy plugin to your Context route!

Here is an OpenAPI example that wires the 'AiLlmProxy' and 'AiPromptContext' plugins to a /context route.

{
    "openapi": "3.1.0",
    "info": {
        "title": "AI Plugins for Cloud-APIM Serverless Projects",
        "version": "1.0.0",
        "description": "An easy getting started project template to use AI in your Cloud-APIM Serverless projects",
        "contact": {
            "name": "Cloud APIM Default contact address",
            "url": "https://www.cloud-apim.com",
            "email": "contact@cloud-apim.com"
        },
        "x-logo-none": {
            "url": "https://www.cloud-apim.com/assets/logo/cloud-apim-logo-inverted.png"
        }
    },
    "tags": [],
    "paths": {
        "/context": {
            "post": {
                "tags": [],
                "summary": "",
                "operationId": "getAIContext",
                "x-cloud-apim-backend": {
                    "$ref": "#/components/x-cloud-apim-backends/mirror"
                },
                "x-cloud-apim-plugins": {
                    "$ref": "#/components/x-cloud-apim-plugins/context"
                }
            }
        }
    },
    "components": {
        "x-cloud-apim-plugins": {
            "context": [
                {
                    "enabled": true,
                    "debug": false,
                    "plugin": "cp:otoroshi.next.plugins.OverrideHost",
                    "include": [],
                    "exclude": [],
                    "config": {}
                },
                {
                    "enabled": true,
                    "debug": false,
                    "plugin": "cp:otoroshi_plugins.com.cloud.apim.otoroshi.extensions.aigateway.plugins.AiLlmProxy",
                    "include": [],
                    "exclude": [],
                    "config": {
                        "ref": "provider-openai"
                    }
                },
                {
                    "enabled": true,
                    "debug": false,
                    "plugin": "cp:otoroshi_plugins.com.cloud.apim.otoroshi.extensions.aigateway.plugins.AiPromptContext",
                    "include": [],
                    "exclude": [],
                    "config": {
                        "ref": "ai-context"
                    }
                }
            ]
        },
        "x-cloud-apim-backends": {
            "mirror": {
                "targets": [
                    {
                        "hostname": "mirror.otoroshi.io",
                        "port": 443,
                        "tls": true
                    }
                ],
                "root": "/",
                "rewrite": false
            }
        }
    }
}
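Once your project is live with these changes, you can try the route. The exact request body depends on your AiLlmProxy configuration; the sketch below assumes an OpenAI-style chat payload with a messages array, and <your-project-domain> is a placeholder for your Serverless project's domain:

curl -X POST https://<your-project-domain>/context \
  -H 'Content-Type: application/json' \
  -d '{"messages": [{"role": "user", "content": "What is Cloud APIM?"}]}'

Thanks to the 'AiPromptContext' plugin, the system message defined in step 6 is added to the conversation, so the answer should reflect the description of Cloud APIM you provided.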

8. Go further

You can go further by adding API keys to secure your AI endpoints and by using many more plugins provided by Cloud APIM.
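For instance, once an API key plugin is enabled on the route, clients would have to present their credentials on every call. A minimal sketch, assuming Otoroshi's default apikey headers and the same placeholder domain as above:

curl -X POST https://<your-project-domain>/context \
  -H 'Content-Type: application/json' \
  -H 'Otoroshi-Client-Id: <your-client-id>' \
  -H 'Otoroshi-Client-Secret: <your-client-secret>' \
  -d '{"messages": [{"role": "user", "content": "What is Cloud APIM?"}]}'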

Customization of your API endpoints is unlimited.

✨ Conclusion

Creating your first AI provider entity and context entity can significantly enhance the capabilities of your API gateway.

By integrating AI, you can provide smarter, more secure, and personalized experiences for your users.

Follow the steps outlined in this guide to get started on your AI journey and unlock the full potential of your API management.

To help you use our new set of plugins, we've created a YouTube playlist containing all our AI Gateway videos.

🚀 Get Started Now

Ready to get started with Cloud APIM Serverless?

Sign up now and take the first step towards secure and efficient API management!

📡 Stay Connected

Follow our blog for the latest updates, tips, and best practices for Cloud APIM Serverless and API management.

🏢 About Cloud APIM

Cloud APIM provides cutting-edge, managed solutions for API management, enabling businesses to leverage the full power of their APIs with ease and efficiency. Our commitment to innovation and excellence drives us to offer the most advanced tools and services to our customers, empowering them to achieve their digital transformation goals.


Written by

Thomas Delafaye