Integrate Azure OpenAI with Blazor Maps for Smarter, AI-Powered Mapping


TL;DR: This blog post walks you through building an AI-powered mapping app using Azure OpenAI and Syncfusion Blazor Maps. It covers setting up credentials, creating a data service, and adding features like search and tooltips for a rich user experience.
Welcome to our Weekly Data Visualization blog series!
In today’s fast-paced digital world, combining advanced technologies like artificial intelligence (AI) with interactive mapping solutions is key to building dynamic and engaging web applications. One integration that can significantly elevate your mapping applications is pairing Azure OpenAI with Syncfusion Blazor Maps.
This blog will explore how to enhance a mapping application by integrating Microsoft Azure OpenAI with Syncfusion Maps in a Blazor application. The process begins with collecting user input through an input box, where users can specify locations or types of places, such as “Hospitals in New York.” Based on this input, a prompt is generated and sent to Azure OpenAI, which returns a JSON response containing location data. This data is then deserialized and used to dynamically set the marker data source for the Map. When the map is loaded, the markers are placed at the retrieved locations. As users hover over each marker, additional information such as images and place descriptions appears in the tooltip, offering a rich, interactive user experience.
Prerequisites
Before getting started, ensure you have access to Azure OpenAI and have set up a deployment in the Azure portal. You will need the following credentials for the integration:
Azure OpenAI API key
Model name (We’ll be using gpt-4o-mini for this example).
Endpoint URL
Additionally, make sure the Syncfusion Blazor Maps component is installed and configured in your Blazor application, as shown in the sketch below.
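If the Maps component isn’t set up yet, a minimal setup sketch (assuming a Blazor Web App or Blazor Server project, with the Syncfusion.Blazor.Maps NuGet package plus Syncfusion.Blazor.Inputs and Syncfusion.Blazor.Spinner for the text box and spinner used later) is to install the packages and register the Syncfusion Blazor service in Program.cs:
// Program.cs
using Syncfusion.Blazor;

var builder = WebApplication.CreateBuilder(args);

// Register Syncfusion Blazor components for the whole application.
builder.Services.AddSyncfusionBlazor();
Also add the relevant namespaces (for example, Syncfusion.Blazor.Maps, Syncfusion.Blazor.Inputs, and Syncfusion.Blazor.Spinner) to _Imports.razor so the components can be used in your pages.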
Let’s get started!
Step 1: Configure Azure OpenAI
To integrate Azure OpenAI with your Blazor application, you first need to configure your Azure OpenAI credentials. These will be added to your appsettings.json file to allow secure communication between your application and Azure OpenAI. Add the following configuration, replacing placeholders with your actual Azure OpenAI credentials:
{
  "AzureOpenAI": {
    "Endpoint": "your-endpoint-url",
    "ApiKey": "your-api-key",
    "DeploymentId": "model_name"
  }
}
Replace your-endpoint-url, your-api-key, and model_name with your actual Azure OpenAI endpoint URL, API key, and deployed model name.
Step 2: Create a service to connect Azure OpenAI
Now, let’s create a service that will establish a connection with Azure OpenAI to retrieve AI-generated responses based on user input. We have created a new class called AISampleService, which acts as this service. This service will take the user’s query, frame a detailed prompt internally to generate the required JSON data in a specified format, and send it to the Azure OpenAI API. Once the response is received, the service will process and return the data, which can be used to dynamically update the Map with relevant markers and information.
using System.Net.Http.Json; // For JsonContent.Create
using System.Text.Json;

public class AISampleService
{
    private readonly HttpClient _httpClient;
    private readonly IConfiguration _configuration;
    private readonly string _endpoint;
    private readonly string _apiKey;
    private readonly string _deploymentId;

    public AISampleService(HttpClient httpClient, IConfiguration configuration)
    {
        _httpClient = httpClient;
        _configuration = configuration;
        _endpoint = _configuration["AzureOpenAI:Endpoint"];
        _apiKey = _configuration["AzureOpenAI:ApiKey"];
        _deploymentId = _configuration["AzureOpenAI:DeploymentId"];
    }

    public async Task<string> GetAIResponse(string prompt)
    {
        // Build the chat completion request body with the user's prompt.
        var requestBody = new
        {
            messages = new[]
            {
                new { role = "user", content = prompt }
            },
            max_tokens = 2000
        };

        var request = new HttpRequestMessage(HttpMethod.Post, $"{_endpoint}/openai/deployments/{_deploymentId}/chat/completions?api-version=2023-07-01-preview")
        {
            Content = JsonContent.Create(requestBody)
        };
        request.Headers.Add("api-key", _apiKey);

        var response = await _httpClient.SendAsync(request);
        if (response.IsSuccessStatusCode)
        {
            // Extract the assistant's message text from the first choice.
            var responseJson = await response.Content.ReadAsStringAsync();
            using var doc = JsonDocument.Parse(responseJson);
            return doc.RootElement.GetProperty("choices")[0].GetProperty("message").GetProperty("content").GetString();
        }
        else
        {
            var errorContent = await response.Content.ReadAsStringAsync();
            return $"Error: {response.StatusCode} - {errorContent}";
        }
    }
}
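Before the page can use this service (the later snippets call it via OpenAIService), it must be registered with dependency injection. A minimal sketch, assuming a standard Program.cs, might look like this:
// Program.cs: register AISampleService as a typed HttpClient client.
builder.Services.AddHttpClient<AISampleService>();
The service can then be injected into the page that hosts the map, for example with @inject AISampleService OpenAIService at the top of the Razor file.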
Step 3: Fetch location data from Azure OpenAI
We will create a method to fetch location data for the map markers based on the user’s input. The prompt sent to Azure OpenAI asks for details about important places, including latitude, longitude, place names, and more.
Here is the method that fetches location data:
public async Task GetMarkerData(string value)
{
    if (!string.IsNullOrEmpty(value))
    {
        // Ask Azure OpenAI for a flat JSON list of places matching the user's query.
        string result = await OpenAIService.GetAIResponse("Generate " + value + " for 15 important cities as a JSON object, with fields such as 'city_name', 'place_name', 'latitude', 'longitude', 'place_details', and 'address'. Provide the simple address. Provide information about the place with a minimum of 150 characters. Strictly provide a flat JSON list without nested objects. Strictly provide the information in the English language.");
        if (result.Contains("```json"))
        {
            // Strip the Markdown code fence that wraps the JSON in the response.
            string cleanedResponseText = result.Split("```json")[1].Trim();
            result = cleanedResponseText.Split("```")[0].Trim();
            if (!string.IsNullOrEmpty(result))
            {
                // Requires Newtonsoft.Json (JsonConvert) and System.Collections.ObjectModel.
                MarkerCollection = JsonConvert.DeserializeObject<ObservableCollection<Markers>>(result);
            }
        }
        else
        {
            MarkerCollection.Clear();
        }
    }
    else
    {
        MarkerCollection.Clear();
    }
}
This method sends a query to Azure OpenAI to generate a list of places and their details (such as latitude, longitude, address, and information about each place) in JSON format. Once the response is received, it parses the JSON and stores the result in an ObservableCollection<Markers> called MarkerCollection. The Markers model class is sketched below.
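The Markers model class isn’t shown in the snippets above. A minimal sketch that matches the fields requested in the prompt, assuming Newtonsoft.Json attributes to map the snake_case field names, could look like this:
using Newtonsoft.Json;

public class Markers
{
    [JsonProperty("city_name")]
    public string CityName { get; set; }

    [JsonProperty("place_name")]
    public string PlaceName { get; set; }

    [JsonProperty("latitude")]
    public double Latitude { get; set; }

    [JsonProperty("longitude")]
    public double Longitude { get; set; }

    [JsonProperty("place_details")]
    public string PlaceDetails { get; set; }

    [JsonProperty("address")]
    public string Address { get; set; }
}
The Latitude and Longitude properties supply the coordinates for the markers, and PlaceDetails is what the tooltip template in Step 6 binds to.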
Step 4: Display markers in the Maps component
The Blazor Maps component must be initialized to display the retrieved list of objects as markers. To initialize the Maps component, follow the steps provided in the Blazor Maps Getting Started documentation.
After initializing the Maps component, the AI response will be received and converted into a data source for the markers. This marker data is assigned to the DataSource property of the MapsMarker in the Maps component. You can find this step in the Blazor Maps Markers documentation.
Here’s how to load the marker data when the Maps component is initialized:
<SfMaps Height="600px">
    <SfSpinner @bind-Visible="@SpinnerVisibility"></SfSpinner>
    <MapsEvents Loaded="Loaded"></MapsEvents>
    <MapsZoomSettings Enable="true" MaxZoom="19" ZoomFactor="1" ShouldZoomInitially="true">
        <MapsZoomToolbarSettings>
            <MapsZoomToolbarButton ToolbarItems="new List<ToolbarItem>() { ToolbarItem.Zoom, ToolbarItem.ZoomIn, ToolbarItem.ZoomOut, ToolbarItem.Pan, ToolbarItem.Reset }">
            </MapsZoomToolbarButton>
        </MapsZoomToolbarSettings>
    </MapsZoomSettings>
    <MapsLayers>
        <MapsLayer UrlTemplate="https://a.tile.openstreetmap.org/level/tileX/tileY.png" TValue="string">
            <MapsMarkerSettings>
                @if (MarkerCollection.Count > 0)
                {
                    <MapsMarker TValue="Markers" Visible="true" DataSource="MarkerCollection" AnimationDuration="0" Width="30" Height="30" Shape="Syncfusion.Blazor.Maps.MarkerType.Image" ImageUrl="map_pin.png">
                    </MapsMarker>
                }
            </MapsMarkerSettings>
        </MapsLayer>
    </MapsLayers>
</SfMaps>

@code {
    public bool SpinnerVisibility { get; set; } = true;
    ObservableCollection<Markers> MarkerCollection = new ObservableCollection<Markers>();
    private string SearchQuery { get; set; } = "Hospitals in New York";

    private async Task Loaded()
    {
        if (MarkerCollection.Count == 0)
        {
            await GetMarkerData(SearchQuery);
        }
    }
}
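Note that SpinnerVisibility starts as true and the snippets shown here never set it back to false. One way to handle this, sketched below rather than taken verbatim from the sample, is to hide the spinner once the marker data has been fetched:
private async Task Loaded()
{
    if (MarkerCollection.Count == 0)
    {
        await GetMarkerData(SearchQuery);
    }

    // Hide the loading spinner once the marker data has been retrieved.
    SpinnerVisibility = false;
}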
Step 5: Add a Search Box for user queries
To allow users to interact with the Maps dynamically, we will add a search box to the page using the Syncfusion Blazor TextBox component, where they can type their queries (e.g., “Hospitals in New York”).
Here is the markup and CSS code for the search box:
<style>
    #search-container {
        position: fixed;
        top: 90px;
        left: 300px;
        z-index: 10;
        background: transparent;
        padding: 5px;
        border-radius: 5px;
    }
</style>

<div id="search-container">
    <SfTextBox @ref="TextBox" Created="AddSearchIcon" Width="200px" Value="@SearchQuery" ValueChanged="ValueChanged" ShowClearButton="true"></SfTextBox>
</div>

@code {
    SfTextBox TextBox;

    public async Task AddSearchIcon()
    {
        // Append a search icon to the text box; "fas fa-search" assumes Font Awesome is referenced in the app.
        await TextBox.AddIconAsync("append", "fas fa-search");
    }
}
This search box enables users to search for places. Once they submit a query, the map updates with new markers based on their input, as shown in the handler sketch below.
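The ValueChanged handler wired up in the markup isn’t shown above. A minimal sketch, assuming it simply stores the new query and re-runs the marker fetch while showing the spinner, might look like this:
private async Task ValueChanged(string value)
{
    // Store the new query and fetch fresh marker data for it.
    SearchQuery = value;
    SpinnerVisibility = true;
    await GetMarkerData(SearchQuery);
    SpinnerVisibility = false;
}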
Step 6: Customize map markers with tooltips
To enhance the user experience, we will display additional information about each place when users hover over the map markers. Tooltips will display details of the place, including an image and description.
Here’s how to customize the map markers with tooltips:
<SfMaps Height="600px">
    <SfSpinner @bind-Visible="@SpinnerVisibility"></SfSpinner>
    <MapsEvents Loaded="Loaded"></MapsEvents>
    <MapsZoomSettings Enable="true" MaxZoom="19" ZoomFactor="1" ShouldZoomInitially="true">
        <MapsZoomToolbarSettings>
            <MapsZoomToolbarButton ToolbarItems="new List<ToolbarItem>() { ToolbarItem.Zoom, ToolbarItem.ZoomIn, ToolbarItem.ZoomOut, ToolbarItem.Pan, ToolbarItem.Reset }">
            </MapsZoomToolbarButton>
        </MapsZoomToolbarSettings>
    </MapsZoomSettings>
    <MapsLayers>
        <MapsLayer UrlTemplate="https://a.tile.openstreetmap.org/level/tileX/tileY.png" TValue="string">
            <MapsMarkerSettings>
                @if (MarkerCollection.Count > 0)
                {
                    <MapsMarker TValue="Markers" Visible="true" DataSource="MarkerCollection" AnimationDuration="0" Width="30" Height="30" Shape="Syncfusion.Blazor.Maps.MarkerType.Image" ImageUrl="map_pin.png">
                        <MapsMarkerTooltipSettings Visible="true">
                            <TooltipTemplate>
                                @{
                                    Markers Data = context as Markers;
                                    if (SearchQuery.Contains("Hospital"))
                                    {
                                        string selectedImage = hospitalImageList[_random.Next(hospitalImageList.Length)];
                                        <div style="position: relative;width:220px;padding: 20px;background-color: white;border-radius: 10px;box-shadow: 0 4px 6px rgba(0, 0, 0, 0.1);text-align: center;">
                                            <img src="@selectedImage" alt="Hospital" style="width: 100%;height: auto;border-radius: 5px;margin-bottom: 10px;" />
                                            <p style="margin: 0; font-size: 14px; line-height: 1.5">@Data.PlaceDetails</p>
                                        </div>
                                    }
                                    else
                                    {
                                        <div style="position: relative;width:220px;padding: 20px;background-color: white;border-radius: 10px;box-shadow: 0 4px 6px rgba(0, 0, 0, 0.1);text-align: center;">
                                            <p style="margin: 0; font-size: 14px; line-height: 1.5">@Data.PlaceDetails</p>
                                        </div>
                                    }
                                }
                            </TooltipTemplate>
                        </MapsMarkerTooltipSettings>
                    </MapsMarker>
                }
            </MapsMarkerSettings>
        </MapsLayer>
    </MapsLayers>
</SfMaps>
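The hospitalImageList and _random fields used in the tooltip template aren’t defined in the snippets above. A minimal sketch, assuming hypothetical image files placed in the application’s wwwroot folder, could be:
@code {
    // Hypothetical image paths; replace with images available in your wwwroot folder.
    private readonly string[] hospitalImageList = new[] { "hospital-1.png", "hospital-2.png", "hospital-3.png" };

    // Used to pick a random image for each tooltip.
    private readonly Random _random = new Random();
}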
Refer to the following output image.
Integrating Azure OpenAI with Maps in Blazor
GitHub reference
For more details, refer to the Integrating Azure OpenAI with Syncfusion Maps in Blazor GitHub demo.
Conclusion
By following the steps outlined in this blog, you have successfully integrated Azure OpenAI with the Syncfusion Blazor Maps component in a Blazor application. This integration lets you generate dynamic, AI-powered map markers based on user queries, with each marker displaying detailed information such as place names, addresses, and images. The result is an interactive map experience that opens up endless possibilities for building intelligent, AI-powered maps in your Blazor applications.
For existing Syncfusion® customers, the newest version of Essential Studio® is available from the License and Downloads page. If you are not a customer, try our 30-day free trial to check out these new features.
If you need new components or new features in our existing components, you can contact us through our support forum, support portal, or feedback portal. As always, we are happy to assist you!