Designing and Sizing a Global Scale Telemetry Platform on Azure Event Hubs



Building a global-scale telemetry pipeline used to be the preserve of the largest internet companies, but today even moderately sized businesses commonly receive high volumes of telemetry from around the world. The cloud and the Internet of Things are making this even more common. This session covers building a global-scale telemetry pipeline on Azure Event Hubs, including how to size and plan your platform. We will show and discuss how we see customers successfully create telemetry pipelines that handle billions of events per day, all around the world. Speaker: Dan Rosanova
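As a rough illustration of the sizing exercise the session describes, the arithmetic below estimates throughput units (TUs) for a hypothetical workload. The workload numbers (2 billion events/day, 512-byte events, 3x peak factor) are assumptions for illustration, and the per-TU limits (roughly 1 MB/s or 1,000 events/s of ingress on the Standard tier) should be checked against current Azure documentation:

```python
import math

# Hypothetical workload assumptions -- adjust to your own telemetry profile.
EVENTS_PER_DAY = 2_000_000_000   # 2 billion events/day
AVG_EVENT_BYTES = 512            # average payload size
PEAK_FACTOR = 3                  # peak traffic vs. the daily average

SECONDS_PER_DAY = 86_400
avg_events_per_sec = EVENTS_PER_DAY / SECONDS_PER_DAY
peak_events_per_sec = avg_events_per_sec * PEAK_FACTOR
peak_mb_per_sec = peak_events_per_sec * AVG_EVENT_BYTES / 1_000_000

# A throughput unit is exhausted by whichever ingress limit is hit first:
# ~1,000 events/s or ~1 MB/s (Standard-tier figures; verify against Azure docs).
tus_by_events = peak_events_per_sec / 1_000
tus_by_bytes = peak_mb_per_sec / 1.0
tus_needed = math.ceil(max(tus_by_events, tus_by_bytes))

print(f"average: {avg_events_per_sec:,.0f} events/s")
print(f"peak:    {peak_events_per_sec:,.0f} events/s, {peak_mb_per_sec:.1f} MB/s")
print(f"throughput units at peak: {tus_needed}")
```

For this workload the event-rate limit, not the byte-rate limit, dominates: a 3x peak over ~23,000 events/s average lands at roughly 70 TUs. Always size for peak, not average, and leave headroom for regional failover.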




The Discussion


    A very valuable and thorough presentation on key aspects of Event Hubs (and Stream Analytics). I like the work-stealing algorithm implemented in the EventProcessorHost API - there's a lot of coordination going on, which is usually an obstacle for a large group of developers.

    I'm wondering if one can pipe events from the Event Hub to Azure Storage and use the EventProcessorHost in parallel (over the same stream of events). So computation and storage at the same time - using an Azure-provided service for storage, and a custom implemented EventProcessorHost.


    Indeed! Event Hubs is designed to have multiple groups of readers (consumer groups) each read the entire stream. This way you can have different paths for your data running in parallel, each at its own pace.
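    The consumer-group idea above can be sketched with a toy model (this is not the Azure SDK, just an illustration): each group keeps its own offset into the same immutable stream, so a fast "hot path" processor and a slow archiver can both read the full stream without interfering with each other. The group names and batch sizes are made up for the example:

    ```python
    # Toy model of consumer groups over one partition's immutable event log.
    stream = [f"event-{i}" for i in range(10)]

    # Each consumer group checkpoints its own offset independently.
    offsets = {"hot-path": 0, "archive": 0}

    def receive(group, max_batch):
        """Return the next batch for a group, advancing only that group's offset."""
        start = offsets[group]
        batch = stream[start:start + max_batch]
        offsets[group] = start + len(batch)
        return batch

    # The fast consumer races ahead; the slow one lags without blocking it.
    fast = receive("hot-path", 8)
    slow = receive("archive", 2)
    assert fast == [f"event-{i}" for i in range(8)]
    assert slow == ["event-0", "event-1"]
    assert offsets == {"hot-path": 8, "archive": 2}
    ```

    In the real service this is exactly why computation (EventProcessorHost) and storage (e.g. Event Hubs Capture to Azure Storage) can run side by side: each path reads from its own consumer group.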
