Send data from Logstash to Azure Event Hubs

In this short tutorial I will summarize the few simple steps needed to configure Logstash to emit data to Azure Event Hubs. Before starting, just a few words to describe the two components we will work on:

Photo: “brown tree log lot” by Dimitri Tyan on Unsplash

Logstash — a data processing pipeline that ingests data from a multitude of sources simultaneously, as rapidly as it can, (slightly) transforms it, and then sends it to your favorite “stash”. It is designed to read logs, wherever they are hosted, and forward them as fast as possible to the broker that will store, manipulate, and display them. The general advice is to keep the pipeline as light as possible (e.g. coarse-grained filtering, cleaning up date formats) by delegating the heavy lifting (e.g. regexes, correlations between different log lines) to dedicated components, for example Elasticsearch or Azure Event Hubs.

Azure Event Hubs — a streaming platform and event ingestion service capable of receiving and processing millions of events per second. It is a PaaS offering on Microsoft Azure that scales well with a pay-as-you-go model.

Event Hubs is similar to Apache Kafka.

The problem we intend to solve in this tutorial is to connect Logstash to Event Hubs. The task is pretty straightforward, but there are at least two ways of doing it: (a) use the Event Hubs HTTP endpoint; (b) use the (at the time of writing) experimental Kafka interface of Event Hubs. The general idea is to keep this simple, so we go for (a). The next picture represents the (simple) architecture of the system:

Architecture of the system
  • Let's start from Event Hubs: the first step is to create an Event Hubs namespace and, within it, an event hub. A namespace can collect different hubs, and this hierarchy is translated into a proper URL for each hub according to the following schema: https://{yournamespace}.servicebus.windows.net/{yourhub}
  • As with many other Azure PaaS components, access to the service is governed by randomly generated keys. These can be found on the SAS (Shared access policies) page of your Event Hubs namespace. The keys will come in handy when you have to generate the URL to be called from the Logstash configuration. The next image shows the SAS Policy page with the (redacted) keys for my namespace.
SAS Policy page
  • To verify that your installation works and is reachable from your network (not so obvious considering enterprise proxies), you can use the Service Bus Explorer. Using the connection string from the previous step, you will be able to connect to the event hub and verify that everything is working.
  • The next step is to generate the SAS token, the actual chunk of bytes needed to connect to the REST interface. To do so, go to this page, choose your language, and generate the SAS token. The process is pretty straightforward: it just takes the data you can find on the SAS Policy page (previous item) and fills in the fields in the code. As an example, below you find the PowerShell chunk of code to generate the SAS token. The code is pretty simple, but you must take care of some details: replace the URI and the key with the proper strings from your Event Hubs; the $Access_Policy_Name variable contains the name of the policy, and the value you find in the code is fine in the most common case, but if you have a more complex policy schema you have to replace it with the policy you are going to use; the $SASToken variable will contain the SAS token (!) to be used in your Logstash configuration. The token has an expiration date, set in seconds: as stated in the code, modify the 300 value to give the token a longer lifetime.

SAS tokens usually tend to expire the morning of your big demo, so carefully set this value to sometime in the distant future.

[Reflection.Assembly]::LoadWithPartialName("System.Web") | out-null
# Replace the placeholders with the values from your Event Hubs namespace
$URI = "{yournamespace}.servicebus.windows.net/{yourhub}"
$Access_Policy_Name = "RootManageSharedAccessKey"
$Access_Policy_Key = "{yourkey}"
# Token expires now+300 seconds
$Expires = ([DateTimeOffset]::Now.ToUnixTimeSeconds()) + 300
$SignatureString = [System.Web.HttpUtility]::UrlEncode($URI) + "`n" + [string]$Expires
$HMAC = New-Object System.Security.Cryptography.HMACSHA256
$HMAC.key = [Text.Encoding]::ASCII.GetBytes($Access_Policy_Key)
$Signature = $HMAC.ComputeHash([Text.Encoding]::ASCII.GetBytes($SignatureString))
$Signature = [Convert]::ToBase64String($Signature)
$SASToken = "SharedAccessSignature sr=" + [System.Web.HttpUtility]::UrlEncode($URI) + "&sig=" + [System.Web.HttpUtility]::UrlEncode($Signature) + "&se=" + $Expires + "&skn=" + $Access_Policy_Name
  • The token you generated in the previous step is in WRAP format, whatever that is. For the purposes of this article it is enough to know that it will have this format (beware of the placeholders):
SharedAccessSignature sr={yournamespace}{yourhub}&sig={redactedkey}&se={expirationtime}&skn=RootManageSharedAccessKey
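If PowerShell is not at hand, the same token can be generated in any language; here is a minimal Python sketch of the same HMAC-SHA256 recipe used in the PowerShell snippet above (the function name and all argument values are my own placeholders, not part of the original article):

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(uri, policy_name, policy_key, ttl_seconds=300):
    """Build an Event Hubs SharedAccessSignature token.

    uri: namespace + hub path, e.g. "{yournamespace}.servicebus.windows.net/{yourhub}"
    """
    expiry = int(time.time()) + ttl_seconds
    encoded_uri = urllib.parse.quote_plus(uri)
    # String to sign: URL-encoded URI, newline, expiry timestamp
    string_to_sign = f"{encoded_uri}\n{expiry}"
    signature = hmac.new(
        policy_key.encode("utf-8"),
        string_to_sign.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    encoded_sig = urllib.parse.quote_plus(base64.b64encode(signature).decode())
    return (
        f"SharedAccessSignature sr={encoded_uri}"
        f"&sig={encoded_sig}&se={expiry}&skn={policy_name}"
    )
```

As with the PowerShell version, the ttl_seconds value controls how far in the future the token expires.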
  • Now we can work on the Logstash side. Once again, for the purposes of the article I will keep this as simple as possible, so I will just configure Logstash to take input from stdin (your keyboard) and send it to Event Hubs without any further elaboration. Below you find the Logstash configuration file (beware of the placeholders); please note that it is particularly important to properly set the headers of the HTTP request.
input { stdin {} }
filter {}
output {
  stdout { codec => rubydebug }
  http {
    url => "https://{yournamespace}.servicebus.windows.net/{yourhub}/messages"
    content_type => "application/atom+xml;type=entry;charset=utf-8"
    http_method => "post"
    format => "message"
    message => '%{@timestamp} : %{message}'
    headers => {
      "Host" => "{yournamespace}.servicebus.windows.net"
      "Authorization" => "SharedAccessSignature sr={yournamespace}{yourhub}&sig={redactedkey}&se={expirationtime}&skn=RootManageSharedAccessKey"
    }
  }
}
  • Now everything should be set to test the whole thing by running Logstash with this configuration. Please note that we left stdout among the outputs, allowing us to print on the console what we are sending to Event Hubs. If everything is correctly set, you should see something like this every time you type some text and press Enter:
  • In the Service Bus Explorer window you should be able to see the messages coming through as you type them in the console:
Messages finally reaching Event Hub

Written by

Data Masseur, Distributed Systems Sculptor, and Scalability Evangelist
