AI Function Calling with APEX

Jon Dixon
9 min read

Introduction

One of the upcoming features described by the APEX Development team at KSCOPE25 was Custom Tools (also known as Functions). Although this feature is not yet available, this post describes how LLM providers like OpenAI utilize functions and how we can implement them in our APEX Apps today.

Before I begin, I would like to clarify the distinction between Tools and Functions in the context of AI. LLMs utilize tools to perform external actions that assist the LLM in answering a question. Functions are a type of Tool that allows you to run custom code and return the results to the LLM.

How Functions Work

💡
The purpose of using functions is to allow the LLM to utilize your business logic and data when answering a user’s question.

When I first looked into functions, I thought that the LLM called the function directly. This put me off because, for most business use cases, you would not want a public LLM to be able to access your business data via an API call.

It was at KSCOPE that I realized that functions actually work like this:

  1. Pass the LLM the user’s question and a list of potential functions.

  2. The LLM returns a list of functions (and their corresponding parameter values) that it wants you to run to help it answer the user’s question.

  3. You run the requested function(s) and call the LLM a second time with the user’s question and the results of running the function(s).

  4. The LLM uses the results from the function calls (and its general knowledge) to answer the question.

The diagram below shows an example process flow:

Diagram showing how AI Function Calling Works
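In PL/SQL terms, the round trip can be sketched as follows. This is a simplified outline; `build_payload`, `call_llm`, and `run_requested_functions` are hypothetical helpers standing in for the `apex_web_service` and JSON-handling code shown later in this post.

```sql
DECLARE
  l_request  CLOB;
  l_response CLOB;
  l_results  VARCHAR2(32767);
BEGIN
  -- 1. Send the user's question plus the list of function definitions.
  l_request  := build_payload(p_question => 'what is the status of order 1');
  l_response := call_llm(l_request);

  -- 2/3. If the LLM proposed function calls, run them server-side
  --      and collect the results.
  l_results := run_requested_functions(l_response);

  -- 4. Call the LLM again with the question plus the function results;
  --    this second response contains the final answer.
  l_response := call_llm(build_payload
                 (p_question => 'what is the status of order 1',
                  p_results  => l_results));
END;
```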

💡
Note that functions don’t have to be about returning data. You can create transactions, start workflows, and call web services. Anything you can do from PL/SQL, you can expose via a function.
💡
The LLM does not directly invoke any code. Instead, it proposes function calls. Your server-side code must interpret these proposals, validate inputs, execute the logic, and return the results to the LLM.

Why not use RAG?

Retrieval Augmented Generation (RAG) involves searching for and sending a subset of your data to the LLM to use for context when answering a question.

Functions offer an alternative to RAG solutions, enabling LLMs to act on your business data. RAG solutions often rely on passing a significant amount of your data to the LLM for context. Functions can significantly reduce the amount of data (and therefore the cost) by allowing the LLM to request only the data it needs to answer the question.

Example

For my example, I would like to request information about a Sales Order. I will be using the OpenAI Responses API; function calling is covered in detail in the OpenAI documentation.

PL/SQL Function

I created a simple PL/SQL function that returned information about a Sales Order. The function was compiled in my database.

CREATE OR REPLACE FUNCTION get_order_info(p_order_id NUMBER, p_info_type VARCHAR2) RETURN VARCHAR2 IS
    l_order_id        sales_orders.order_id%TYPE;
    l_order_date      sales_orders.order_date%TYPE;
    l_customer_number sales_orders.customer_number%TYPE;
    l_total_amount    sales_orders.total_amount%TYPE;
    l_status          sales_orders.status%TYPE;
BEGIN
    SELECT order_id, order_date, customer_number, total_amount, status
      INTO l_order_id, l_order_date, l_customer_number, l_total_amount, l_status
      FROM sales_orders
     WHERE order_id = p_order_id;

    IF LOWER(p_info_type) = 'order_id' THEN
        RETURN l_order_id;
    ELSIF LOWER(p_info_type) = 'order_date' THEN
        RETURN TO_CHAR(l_order_date, 'YYYY-MM-DD');
    ELSIF LOWER(p_info_type) = 'customer_number' THEN
        RETURN l_customer_number;
    ELSIF LOWER(p_info_type) = 'total_amount' THEN
        RETURN TO_CHAR(l_total_amount);
    ELSIF LOWER(p_info_type) = 'status' THEN
        RETURN l_status;
    ELSE
        RETURN 'Invalid info_type';
    END IF;
EXCEPTION WHEN NO_DATA_FOUND THEN
  RETURN 'Order not found.';
END;
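Assuming a `sales_orders` row with `order_id = 1` exists (hypothetical test data), the function can be smoke-tested directly from SQL before wiring it up to the LLM:

```sql
-- Quick sanity check; returns e.g. 'Pending' or 'Order not found.'
SELECT get_order_info(p_order_id => 1, p_info_type => 'status') AS order_status
  FROM dual;
```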

End-To-End Example in JSON

Let’s walk through an example illustrating the JSON that is passed back and forth at each step. Note: In the JSON examples, I have omitted parts of the payload (e.g., model, temperature) that are not explicitly related to functions.

For each of the calls to the LLM, we can use apex_web_service.make_rest_request.

  -- l_json is a CLOB holding the request payload; l_response is a CLOB for the reply.
  apex_web_service.set_request_headers 
   (p_name_01   => 'Content-Type', 
    p_value_01  => 'application/json',
    p_name_02   => 'Authorization', 
    p_value_02  => 'Bearer YOURAPIKEYGOESHERE',
    p_reset     => TRUE);

  l_response := apex_web_service.make_rest_request
   (p_url         => 'https://api.openai.com/v1/responses',
    p_http_method => 'POST',
    p_body        => l_json);

JSON for Initial LLM Call with Function Definitions (Call 1)

{
  "input": [
    {
      "role": "system",
      "content": [
        {
          "type": "input_text",
          "text": "Answer the users question"
        }
      ]
    },
    {
      "role": "user",
      "content": [
        {
          "type": "input_text",
          "text": "what is the status of order 1"
        }
      ]
    }
  ],
  "tools": [
    {
      "type": "function",
      "name": "get_order_info",
      "description": "Retrieves order information based on the provided order ID and information type.",
      "parameters": {
        "type": "object",
        "required": [
          "p_order_id",
          "p_info_type"
        ],
        "properties": {
          "p_order_id": {
            "type": "number",
            "description": "The unique identifier for the order"
          },
          "p_info_type": {
            "type": "string",
            "description": "The type of information to retrieve (e.g., order_id, order_date, customer_number, total_amount, status)"
          }
        },
        "additionalProperties": false
      },
      "strict": true
    }
  ]
}
  • The input array contains the system prompt and the user’s question.

  • The tools array contains a list of the functions I want the LLM to consider when answering the user’s question. The tool type in this case is function.

  • Function definitions require a specific JSON format, which is described in the OpenAI function-calling documentation. The documentation also specifies best practices for defining a function, which I advise you to read.

💡
The more detail you include in the function and parameter descriptions, the easier it is for the LLM to understand the function, determine when to use it, and determine which parameters to pass.
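If you prefer not to hand-craft the payload string, the call 1 JSON above can be assembled with Oracle's `JSON_OBJECT_T`/`JSON_ARRAY_T` API. This is a sketch: the tool definition is parsed from a literal for brevity, and the finished body would be passed as `p_body` to `apex_web_service.make_rest_request`.

```sql
DECLARE
  l_body  JSON_OBJECT_T := JSON_OBJECT_T();
  l_input JSON_ARRAY_T  := JSON_ARRAY_T();

  -- Helper to build one {role, content:[{type:"input_text", text}]} message.
  FUNCTION message (p_role IN VARCHAR2, p_text IN VARCHAR2) RETURN JSON_OBJECT_T IS
    l_msg     JSON_OBJECT_T := JSON_OBJECT_T();
    l_content JSON_ARRAY_T  := JSON_ARRAY_T();
    l_part    JSON_OBJECT_T := JSON_OBJECT_T();
  BEGIN
    l_part.put('type', 'input_text');
    l_part.put('text', p_text);
    l_content.append(l_part);
    l_msg.put('role', p_role);
    l_msg.put('content', l_content);
    RETURN l_msg;
  END;
BEGIN
  l_input.append(message('system', 'Answer the users question'));
  l_input.append(message('user',   'what is the status of order 1'));
  l_body.put('input', l_input);

  -- The tool definition rarely changes, so parsing a JSON literal keeps it simple.
  l_body.put('tools', JSON_ARRAY_T.parse(
    '[{"type":"function","name":"get_order_info",'
    || '"description":"Retrieves order information based on the provided order ID and information type.",'
    || '"parameters":{"type":"object","required":["p_order_id","p_info_type"],'
    || '"properties":{"p_order_id":{"type":"number","description":"The unique identifier for the order"},'
    || '"p_info_type":{"type":"string","description":"The type of information to retrieve"}},'
    || '"additionalProperties":false},"strict":true}]'));

  -- l_body.to_clob can now be sent as p_body in the REST call.
END;
```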

Response from Call 1

{
  "output": [
    {
      "id": "fc_6858c0ab059c819b89594dc4d64dd4df03a1897b9ddea35b",
      "type": "function_call",
      "status": "completed",
      "arguments": "{\"p_order_id\":1,\"p_info_type\":\"status\"}",
      "call_id": "call_fyixqaqJGHwVsOada86TkvTs",
      "name": "get_order_info"
    }
  ],
  "parallel_tool_calls": true,
  "tool_choice": "auto",
  "tools": [
    {
      "type": "function",
      "description": "Retrieves order information based on the provided order ID and information type.",
      "name": "get_order_info",
      "parameters": {
        "type": "object",
        "required": [
          "p_order_id",
          "p_info_type"
        ],
        "properties": {
          "p_order_id": {
            "type": "number",
            "description": "The unique identifier for the order"
          },
          "p_info_type": {
            "type": "string",
            "description": "The type of information to retrieve (e.g., order_id, order_date, customer_number, total_amount, status)"
          }
        },
        "additionalProperties": false
      },
      "strict": true
    }
  ]
}
  • If the LLM finds a function that it wants you to run, it will be listed in the output array. This is signified by the output type of function_call.

  • In the above example, it is asking us to run a function with the name get_order_info and pass parameters as follows:

    • p_order_id = 1

    • p_info_type = status

Call our Function

All we need to do now is run the function:


  -- Parse the response from call 1.
  l_response_obj := JSON_OBJECT_T.parse(l_response);
  l_output_arr   := l_response_obj.get_Array('output');
  l_output_obj   := JSON_OBJECT_T(l_output_arr.get(0));
  l_name         := l_output_obj.get_String('name');
  -- Parse the arguments string once, then read both parameters from it.
  l_args_obj     := JSON_OBJECT_T.parse(l_output_obj.get_String('arguments'));

  -- Call the requested function.
  IF l_name = 'get_order_info' THEN
    l_order_info := get_order_info
     (p_order_id  => l_args_obj.get_Number('p_order_id'),
      p_info_type => l_args_obj.get_String('p_info_type'));
  END IF;

JSON Payload for the Second LLM Call with Function Results (Call 2)

{
  "input": [
    {
      "role": "system",
      "content": [
        {
          "type": "input_text",
          "text": "Answer the users question"
        }
      ]
    },
    {
      "role": "user",
      "content": [
        {
          "type": "input_text",
          "text": "what is the status of order 1"
        }
      ]
    },
    {
      "role": "user",
      "content": [
        {
          "type": "input_text",
          "text": "The result from tool get_order_info is: Pending"
        }
      ]
    }
  ]
}
  • I included the result from the function call as a second user message in the input array.

  • When we call the LLM with this JSON, it now has everything it needs to answer the user’s question.
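Building that second payload is mostly a matter of re-sending the input array from call 1 with one extra user message appended. A minimal sketch (for a standalone example, `l_input` and `l_order_info` are initialized inline; in practice they would carry over from the first call):

```sql
DECLARE
  l_input      JSON_ARRAY_T   := JSON_ARRAY_T();  -- in practice, the array from call 1
  l_order_info VARCHAR2(4000) := 'Pending';       -- in practice, the function result
  l_body       JSON_OBJECT_T  := JSON_OBJECT_T();
  l_msg        JSON_OBJECT_T  := JSON_OBJECT_T();
  l_content    JSON_ARRAY_T   := JSON_ARRAY_T();
  l_part       JSON_OBJECT_T  := JSON_OBJECT_T();
BEGIN
  -- Append the function result as an additional user message.
  l_part.put('type', 'input_text');
  l_part.put('text', 'The result from tool get_order_info is: ' || l_order_info);
  l_content.append(l_part);
  l_msg.put('role', 'user');
  l_msg.put('content', l_content);
  l_input.append(l_msg);

  l_body.put('input', l_input);
  -- POST l_body.to_clob to the /v1/responses endpoint exactly as in call 1.
END;
```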

Response from Call 2

{
  "output": [
    {
      "id": "msg_6858c2e3439c8199a71400ac3c3b38ba020aa24e30c9a174",
      "type": "message",
      "status": "completed",
      "content": [
        {
          "type": "output_text",
          "annotations": [],
          "text": "The status of order 1 is currently pending."
        }
      ],
      "role": "assistant"
    }
  ]
}
  • You can see the answer to the question in the output.content.text field.

Additional Thoughts

Multiple Tool Calls

As I alluded to in the example, the LLM will request multiple function calls if it requires them. For example, if the user asks the question ‘what is the status of order 1 and what is its value’, then we can expect the LLM to ask us to call the function twice with different parameters. The following JSON is an excerpt from the response, where the LLM instructs us to call the function once to retrieve the status and again to get the amount.

"output": [
  {
    "id": "fc_685997def9b88199977de8d1d817431603004e30121b785e",
    "type": "function_call",
    "status": "completed",
    "arguments": "{\"p_order_id\":1,\"p_info_type\":\"status\"}",
    "call_id": "call_X38rKT3SQhefM5561hvxKlTk",
    "name": "get_order_info"
  },
  {
    "id": "fc_685997df3c7c819993541e73df119cf503004e30121b785e",
    "type": "function_call",
    "status": "completed",
    "arguments": "{\"p_order_id\":1,\"p_info_type\":\"total_amount\"}",
    "call_id": "call_lwcVs312CMVjz1TRex9To09i",
    "name": "get_order_info"
  }
]

This means your code must:

  • Loop through and call all of the requested functions.

  • Concatenate responses from the tool calls and send them back in the 2nd user message of the second call to the LLM, e.g., ‘Function Name: get_order_info - status = Pending \n Function Name: get_order_info - total_amount = 250 \n’.
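Putting those two requirements together, the single-call parsing shown earlier becomes a loop over the output array. This sketch assumes `l_response` holds the raw JSON from call 1 and reuses the `get_order_info` function defined above:

```sql
DECLARE
  l_response     CLOB;  -- populated by the first LLM call
  l_response_obj JSON_OBJECT_T;
  l_output_arr   JSON_ARRAY_T;
  l_output_obj   JSON_OBJECT_T;
  l_args_obj     JSON_OBJECT_T;
  l_results      VARCHAR2(32767);
BEGIN
  l_response_obj := JSON_OBJECT_T.parse(l_response);
  l_output_arr   := l_response_obj.get_Array('output');

  -- Loop through every entry; only act on proposed function calls we recognize.
  FOR i IN 0 .. l_output_arr.get_size - 1 LOOP
    l_output_obj := JSON_OBJECT_T(l_output_arr.get(i));
    IF l_output_obj.get_String('type') = 'function_call'
       AND l_output_obj.get_String('name') = 'get_order_info' THEN
      l_args_obj := JSON_OBJECT_T.parse(l_output_obj.get_String('arguments'));
      -- Concatenate one result line per tool call.
      l_results  := l_results
                    || 'Function Name: get_order_info - '
                    || l_args_obj.get_String('p_info_type') || ' = '
                    || get_order_info
                        (p_order_id  => l_args_obj.get_Number('p_order_id'),
                         p_info_type => l_args_obj.get_String('p_info_type'))
                    || chr(10);
    END IF;
  END LOOP;

  -- l_results is now ready to send as the 2nd user message in call 2.
END;
```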

Security

Of course, security is a primary concern here. Assuming you are calling the LLM from an APEX Application, then you should provide that user context to your function calls to make sure the user has access to the data in question.

Other security considerations:

  • Make sure you are OK with the results of your function calls being consumed by the LLM and potentially ending up in remote log files, etc.

  • Validate parameters received from the LLM against database constraints before calling functions.

  • Log each function invocation with inputs and user context for auditing.

  • Avoid exposing high-sensitivity operations (e.g., finance approvals) directly via function calls.
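As a concrete example of the validation point, a simple guard can reject any parameter value the LLM invents before the database is touched. This is a hypothetical sketch; in a real app you would also verify that the current APEX user is allowed to see the order, e.g., by filtering `sales_orders` on the user's customer number.

```sql
DECLARE
  l_info_type VARCHAR2(100) := 'status';  -- value proposed by the LLM
BEGIN
  -- Whitelist check: only the documented info types are allowed through.
  IF LOWER(l_info_type) NOT IN
     ('order_id', 'order_date', 'customer_number', 'total_amount', 'status') THEN
    raise_application_error(-20001,
      'Rejected unexpected info_type from LLM: ' || l_info_type);
  END IF;
  -- Safe to call get_order_info here, after the user-access check.
END;
```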

Additional Use Cases

I purposely chose a simplified example for this post, but there are many other potential use cases for functions. Here are three to get you thinking:

  • Customer-Facing ERP Chatbot

    • Provide users of your customer portal with a way to get their order status using natural language. If the customer wants to change or cancel an order, initiate an APEX workflow to obtain approval for and take action on the change.
  • Project Status Assistant

    • Let project managers ask “Show me open risks for Project Delta” or “What tasks are overdue for Milestone 3?” with dynamic responses from your project tables.
  • AI-Assisted Form Auto-Fill

    • On a data-entry screen, users can say, “Pre-fill this form using the last order I created,” and a function retrieves and injects historical values into the current APEX session state.

Conclusion

As you can see, incorporating functions in your apps is not straightforward. You must provide a robust wrapper to handle zero or multiple function call requests, invalid parameters, route to the correct PL/SQL code, etc. That being said, function calling provides a structured approach to integrating business logic with LLMs, allowing for the safe handling of sensitive data. By handling function execution server-side and passing only the necessary results to the LLM, this approach strikes a balance between capability and control.

💡
While still emerging, this method enables APEX developers to build secure, intelligent assistants that can efficiently interact with enterprise data and processes.

Written by

Jon Dixon

Hi, thanks for stopping by! I am focused on designing and building innovative solutions using the Oracle Database, Oracle APEX, and Oracle REST Data Services (ORDS). I hope you enjoy my blog.