Introduction to Event Hub with Node.js (Beginner-Friendly Guide)


Azure Event Hub is a big data streaming platform and event ingestion service—capable of receiving and processing millions of events per second. Think of it as a high-speed highway for streaming data, ideal for telemetry, logs, and real-time analytics.
In this article, you’ll learn:
- What Event Hub is and how it works
- How to set it up in the Azure Portal
- How to send and receive messages using Node.js
- How to store the event stream in Blob Storage using Capture
Let’s break it down step-by-step.
🧠 What is Azure Event Hub?
Azure Event Hub works like a message pipeline where:
- Producers send messages (called events)
- Event Hub receives and stores them temporarily
- Consumers process those messages in real time
It is ideal for:
- IoT telemetry data (see the example event below)
- Application logs
- Clickstream or app usage analytics
- Real-time monitoring pipelines
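For example, an event sent through the Node.js SDK is just an object with a `body` property, which can hold any JSON-serializable payload. A minimal sketch (the field names here are purely illustrative):

```js
// Hypothetical IoT telemetry event; only the `body` property is meaningful to the SDK
const telemetryEvent = {
  body: {
    deviceId: "sensor-42",                // illustrative device name
    temperatureC: 21.7,                   // illustrative reading
    recordedAt: new Date().toISOString(), // when the reading was taken
  },
};
```

You’ll see exactly this shape passed to `tryAdd()` in the sending example later.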
🛠️ Step-by-Step Setup in Azure Portal
1. Create an Event Hub Namespace
This is like a container for your Event Hubs.
- Go to the Azure Portal → search for Event Hubs
- Click + Create
- Choose your subscription, resource group, and region
- Create a namespace (e.g. `event-hub-testing-01`)
- Choose a pricing tier (Basic is fine to start, but pick Standard if you plan to use Capture later)

Prefer scripting it? A CLI sketch follows.
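A rough Azure CLI equivalent (assuming you’ve already run `az login`; the resource group and region below are placeholders):

```bash
# Sketch: create the namespace from the CLI; resource group and region are placeholders
az eventhubs namespace create \
  --resource-group my-resource-group \
  --name event-hub-testing-01 \
  --location eastus \
  --sku Basic
```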
2. Create an Event Hub (under that namespace)
- After the namespace is deployed, click + Event Hub
- Name it something like `eventhub01`
- Leave the default partition count (2)
- Click Create

The CLI equivalent is sketched below.
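Under the same assumptions as the namespace sketch:

```bash
# Sketch: create the Event Hub inside the namespace created above
az eventhubs eventhub create \
  --resource-group my-resource-group \
  --namespace-name event-hub-testing-01 \
  --name eventhub01 \
  --partition-count 2
```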
3. Get the Connection String
You’ll need this to send events.
- Go to the Shared Access Policies tab
- Click on RootManageSharedAccessKey
- Copy the Connection string–primary key

You can also pull it from the CLI, as shown below.
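A sketch of the CLI version (same placeholder names as before):

```bash
# Sketch: print the primary connection string for the default policy
az eventhubs namespace authorization-rule keys list \
  --resource-group my-resource-group \
  --namespace-name event-hub-testing-01 \
  --name RootManageSharedAccessKey \
  --query primaryConnectionString -o tsv
```

Avoid hardcoding the string in source files; reading it from an environment variable (`process.env`) is a safer habit.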
💻 Sending Events with Node.js
Install the package:

```bash
npm install @azure/event-hubs
```
send.js

```js
const { EventHubProducerClient } = require("@azure/event-hubs");

const connectionString = "<YOUR_CONNECTION_STRING>";
const eventHubName = "eventhub01";

async function main() {
  const producer = new EventHubProducerClient(connectionString, eventHubName);
  const eventDataBatch = await producer.createBatch();
  eventDataBatch.tryAdd({ body: "Hello from Node.js!" });
  await producer.sendBatch(eventDataBatch);
  console.log("✅ Event sent!");
  await producer.close();
}

main().catch((err) => {
  console.error("❌ Error sending event:", err);
});
```
Run it:

```bash
node send.js
```
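One thing the example glosses over: a batch has a size limit, and `tryAdd()` returns `false` once the batch is full. Here’s a sketch of handling that when sending many events (it reuses the `producer` from `send.js`, so it belongs inside `main()`; the messages are illustrative):

```js
// Sketch: send several events, starting a new batch whenever the current one fills up
const messages = ["reading-1", "reading-2", "reading-3"];

let batch = await producer.createBatch();
for (const msg of messages) {
  if (!batch.tryAdd({ body: msg })) {
    await producer.sendBatch(batch);      // flush the full batch
    batch = await producer.createBatch(); // start a fresh one
    batch.tryAdd({ body: msg });          // re-add the event that didn't fit
  }
}
await producer.sendBatch(batch);          // send whatever remains
```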
📥 Receiving Events with Node.js
receive.js

```js
const { EventHubConsumerClient } = require("@azure/event-hubs");

const connectionString = "<YOUR_CONNECTION_STRING>";
const eventHubName = "eventhub01";
const consumerGroup = "$Default"; // or your custom one

async function main() {
  const client = new EventHubConsumerClient(consumerGroup, connectionString, eventHubName);

  const subscription = client.subscribe({
    processEvents: async (events, context) => {
      for (const event of events) {
        console.log("📩 Received:", event.body);
      }
    },
    processError: async (err, context) => {
      console.error("❌ Error:", err);
    },
  });

  // Stop after 30s
  setTimeout(async () => {
    console.log("⏹ Stopping subscription...");
    await subscription.close();
    await client.close();
  }, 30000);
}

main().catch(console.error);
```
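The consumer above starts from scratch on every run. For anything long-lived you’d normally persist progress with a checkpoint store, so a restarted consumer resumes where it left off. A minimal sketch, assuming two extra packages (`@azure/eventhubs-checkpointstore-blob` and `@azure/storage-blob`) and an existing blob container named `checkpoints`:

```js
const { EventHubConsumerClient } = require("@azure/event-hubs");
const { ContainerClient } = require("@azure/storage-blob");
const { BlobCheckpointStore } = require("@azure/eventhubs-checkpointstore-blob");

// Assumes a container named "checkpoints" already exists in your storage account
const containerClient = new ContainerClient("<STORAGE_CONNECTION_STRING>", "checkpoints");
const checkpointStore = new BlobCheckpointStore(containerClient);

const client = new EventHubConsumerClient(
  "$Default",
  "<YOUR_CONNECTION_STRING>",
  "eventhub01",
  checkpointStore
);

client.subscribe({
  processEvents: async (events, context) => {
    for (const event of events) {
      console.log("📩 Received:", event.body);
    }
    // Record progress so a restart resumes after the last processed event
    if (events.length > 0) {
      await context.updateCheckpoint(events[events.length - 1]);
    }
  },
  processError: async (err, context) => {
    console.error("❌ Error:", err);
  },
});
```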
📦 Enable Capture to Store Events in Blob Storage
This stores a copy of the stream for backup or later analysis. Note: Capture requires the Standard tier or higher; it isn’t available on Basic.
Steps:
- Go to your Event Hub instance
- Click on Capture in the left menu
- Choose:
  - Capture Provider: Azure Storage Account
  - A valid Storage Account + Container (create one if needed)
  - The default file name format templates
⚠️ If you're getting an error like “Unknown capture destination type”, it means you haven't created a proper Storage Account yet.
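Capture can also be enabled from the CLI. A sketch, assuming a Standard-tier namespace and an existing storage account and container (both names are placeholders):

```bash
# Sketch: enable Capture via CLI; storage account and container names are placeholders
az eventhubs eventhub update \
  --resource-group my-resource-group \
  --namespace-name event-hub-testing-01 \
  --name eventhub01 \
  --enable-capture true \
  --destination-name EventHubArchive.AzureBlockBlob \
  --storage-account mystorageaccount \
  --blob-container eventhub-capture
```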
🔄 Verify Event Flow
You can check:
- Incoming events under the Metrics tab
- Stored `.avro` files in the Blob Storage container

To convert `.avro` files to JSON, you can use avro-tools or online tools, or decode them directly in Node.js, as sketched below.
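Here’s the Node.js route, using the `avsc` package (`npm install avsc`) and assuming you’ve downloaded one of the captured files locally as `capture.avro`:

```js
const avro = require("avsc");

// Each record in a Capture file carries metadata plus the original payload in `Body`
avro.createFileDecoder("./capture.avro")
  .on("data", (record) => {
    console.log(record.Body.toString()); // Body is the raw event payload (bytes)
  })
  .on("error", console.error);
```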
✅ Summary
You’ve now:
- Created and configured an Event Hub
- Sent and received events using Node.js
- Captured events to Azure Blob Storage
This gives you a solid foundation for building real-time analytics pipelines, telemetry systems, or log processors.