OpenAI Agents SDK | Context

What is an Agent? (in the context of the Agents SDK)

In its new Agents SDK, OpenAI provides standalone Agents. These offer flexibility: imagine an LLM instance configured once to perform a specific kind of task, backed by a particular model, and allowed to use a set of predefined tools (including other Agents used as tools).


What is Context?

The official OpenAI SDK documentation describes two types of context:

| Type | Who uses it? | Example |
| --- | --- | --- |
| LLM Context | The AI model (GPT) | Instructions, tools, user messages |
| Local Context | Your Python code/tools | User info, config, helper functions |

LLM context is the data the model actually sees and uses to generate a response: its instructions (system prompt), the available tool definitions, and the conversation history.

Local context is data your own code uses while running — the AI doesn’t see it directly, but your tools (functions) can access it.
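To make the split concrete, here is a plain-Python sketch with made-up names (no SDK involved): local context stays inside your process, while only the LLM-context text and tool return values would ever reach the model.

```python
from dataclasses import dataclass

# Local context: lives only in your process; the model never receives it,
# so secrets and user records can safely live here.
@dataclass
class LocalContext:
    api_key: str
    user_name: str

# LLM context: the text that would actually be sent to the model.
llm_context = {
    "instructions": "You are a helpful assistant.",
    "user_message": "Can you suggest a place to eat?",
}

def my_tool(ctx: LocalContext) -> str:
    # Tools read local context directly; only their *return value*
    # can end up in the conversation the model sees.
    return f"Result for {ctx.user_name}"

print(my_tool(LocalContext(api_key="secret", user_name="Dan")))
# Result for Dan
```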


Example:

Let’s start by defining the local context, which will hold the user’s food preferences.

@dataclass
class UserPreferences:
    name: str
    favorite_cuisine: str

Here the @dataclass decorator defines a class for storing data about the user: their name and favorite cuisine. The AI doesn’t see it directly — it’s for your tools to use.
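A dataclass instance behaves like any plain Python object, so you can create and inspect it directly (the names below match the example above):

```python
from dataclasses import dataclass

@dataclass
class UserPreferences:
    name: str
    favorite_cuisine: str

# Instantiate with keyword arguments; @dataclass generates __init__ and __repr__.
prefs = UserPreferences(name="Dan", favorite_cuisine="Ukrainian")
print(prefs)  # UserPreferences(name='Dan', favorite_cuisine='Ukrainian')
```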

Next, we are going to create a tool that the Agent will use to access the user’s preferences.

from agents import function_tool, RunContextWrapper

@function_tool
async def recommend_restaurant(wrapper: RunContextWrapper[UserPreferences]) -> str:
    user = wrapper.context
    recommendations = {
        "ukrainian": "Borscht House",
        "mexican": "Mexico City",
        "japanese": "Sakura Sushi",
        "indian": "Mumbai",
        "chinese": "Golden Dragon",
    }

    cuisine = user.favorite_cuisine.lower()
    restaurant = recommendations.get(cuisine)

    if restaurant:
        return f"Hi {user.name}! Since you like {cuisine.title()}, I recommend '{restaurant}'."
    else:
        return f"Hi {user.name}! I couldn't find a match for '{cuisine}', but how about trying something new today?"

Now we have defined a tool for our Agent: the Agent can call this function, and the function reads the local context through the wrapper.
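The tool’s core logic is a plain dictionary lookup with a fallback, which you can test without the SDK at all. This sketch reuses the same names as the tool above but drops the wrapper:

```python
from dataclasses import dataclass

@dataclass
class UserPreferences:
    name: str
    favorite_cuisine: str

def pick_restaurant(user: UserPreferences) -> str:
    # Same lookup-with-fallback logic as the tool, minus the SDK wrapper.
    recommendations = {"ukrainian": "Borscht House", "mexican": "Mexico City"}
    restaurant = recommendations.get(user.favorite_cuisine.lower())
    if restaurant:
        return (f"Hi {user.name}! Since you like "
                f"{user.favorite_cuisine.title()}, I recommend '{restaurant}'.")
    return f"Hi {user.name}! How about trying something new today?"

print(pick_restaurant(UserPreferences("Dan", "Ukrainian")))
# Hi Dan! Since you like Ukrainian, I recommend 'Borscht House'.
```

Keeping the logic SDK-free like this makes it easy to unit-test before wrapping it with @function_tool.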

Let’s set up the Agent:

from agents import Agent

agent = Agent[UserPreferences](
    name="Restaurant Recommender",
    instructions="You're a friendly assistant that helps users find places to eat based on their preferences.",
    tools=[recommend_restaurant],
)

This is LLM context — what the AI sees: its name, tools, and instructions.
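The Agent[UserPreferences] syntax is ordinary Python generics. As a minimal stand-in (illustrative only, not the SDK’s actual implementation), here is how a class can be parameterized by a context type so that type checkers know what .context holds:

```python
from dataclasses import dataclass
from typing import Generic, TypeVar

T = TypeVar("T")

@dataclass
class Wrapper(Generic[T]):
    # A minimal stand-in for a generic container like RunContextWrapper.
    context: T

@dataclass
class UserPreferences:
    name: str
    favorite_cuisine: str

# Type checkers now know w.context is a UserPreferences instance.
w: Wrapper[UserPreferences] = Wrapper(UserPreferences("Dan", "Ukrainian"))
print(w.context.name)  # Dan
```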

Let’s finally test it:

from agents import Runner

user_data = UserPreferences(name="Dan", favorite_cuisine="Ukrainian")

result = await Runner.run(
    starting_agent=agent,
    input="Can you suggest a place to eat?",
    context=user_data
)

print(result.final_output)

Output:

Hi Dan! Since you like Ukrainian, I recommend trying out 'Borscht House' downtown. 🍝

Full Code:

import asyncio
from dataclasses import dataclass

from agents import Agent, Runner, function_tool, RunContextWrapper

@dataclass
class UserPreferences:
    name: str
    favorite_cuisine: str


@function_tool
async def recommend_restaurant(wrapper: RunContextWrapper[UserPreferences]) -> str:
    user = wrapper.context
    recommendations = {
        "ukrainian": "Borscht House",
        "mexican": "Mexico City",
        "japanese": "Sakura Sushi",
        "indian": "Mumbai",
        "chinese": "Golden Dragon",
    }

    cuisine = user.favorite_cuisine.lower()
    restaurant = recommendations.get(cuisine)

    if restaurant:
        return f"Hi {user.name}! Since you like {cuisine.title()}, I recommend '{restaurant}'."
    else:
        return f"Hi {user.name}! I couldn't find a match for '{cuisine}', but how about trying something new today?"

async def main():
    agent = Agent[UserPreferences](
        name="Restaurant Recommender",
        instructions="You're a friendly assistant that helps users find places to eat based on their preferences.",
        tools=[recommend_restaurant],
    )

    user_data = UserPreferences(name="Dan", favorite_cuisine="Ukrainian")

    result = await Runner.run(
        starting_agent=agent,
        input="Can you suggest a place to eat?",
        context=user_data
    )

    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())

If you’re interested in learning more, take a look at the official documentation: Agent SDK Documentation

Written by Yelyzaveta Dymchenko