Azure Event Hub Producer

The Azure Event Hub Producer is available upon request. Contact us at support@xapix.io for more information.

An Azure Event Hub Producer allows you to stream event data from Xapix pipelines into an Azure Event Hub. For background, refer to the Microsoft Azure Event Hubs documentation.

Prerequisites:

  • Microsoft Azure account subscription and portal access

  • Azure Event Hub Namespace and Instance for sending events to

  • Azure Event Hub Connection String for authentication

Configuration
  1. On the Connectors page, select the New Azure Event Hub Producer option.

  2. Enter any desired name for the producer and click Create Event Producer to continue.

  3. Within the Edit Connector page, enter the Azure Event Hub Name. This must match the Event Hub name shown in your Azure portal's Event Hubs Namespace; it is sometimes referred to as the Event Hubs Instance in the portal.

  4. Finally, to help build the pipeline later on, you need to define the structure of the event data that will be sent to your Azure Event Hub and include some sample values. There are two options for doing this:

    • Either use the Parameters section to manually enter the properties and sample values in line with your target schema

    • Or use the Message Sample to paste in a message containing your target schema and sample values, then click Generate schema from sample to auto-populate the Parameters section. Here is a Message Sample to help with this option:

{
  "Machine": "Machine1",
  "Source": "XYZ",
  "Timestamp": "2020-06-23 16:24:16 +0000",
  "Tank_Level_Value": 489.75999999999993,
  "Tank_Level_TagName": "TANK-123"
}
Authentication
  1. To enable Authentication, first go to the Authentication Credentials page. It can be reached from within the Edit Azure Event Hub Producer page by following the relevant link.

  2. On the Authentication Credentials page, select the New Credential Connection String option to enter the credential configuration page.

  3. Enter any desired name for the credential and your Azure Event Hub Connection String. The string itself should follow the template below. More information about Event Hub connection strings can be found in the Microsoft documentation.

Endpoint=sb://<FQDN>/;SharedAccessKeyName=<KeyName>;SharedAccessKey=<KeyValue>
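
If you want to sanity-check the connection string and Event Hub name before entering them in Xapix, one option is to send a test event directly with the azure-eventhub Python package. This is only a sketch under that assumption; the placeholder values mirror the template above and the Message Sample from the Configuration section.

from azure.eventhub import EventHubProducerClient, EventData

# Placeholders: substitute your own connection string and Event Hub (Instance) name
CONNECTION_STR = "Endpoint=sb://<FQDN>/;SharedAccessKeyName=<KeyName>;SharedAccessKey=<KeyValue>"
EVENT_HUB_NAME = "<your Event Hub name>"

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR,
    eventhub_name=EVENT_HUB_NAME,
)
with producer:
    # Send a single test event whose body resembles the Message Sample
    batch = producer.create_batch()
    batch.add(EventData('{"Machine": "Machine1", "Source": "XYZ", "Tank_Level_TagName": "TANK-123"}'))
    producer.send_batch(batch)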
Implementation
  1. Having configured your Event Producer and set up the Authentication, you can now add it to a Xapix Pipeline. On your chosen Pipeline Dashboard, click Add Unit to open the pipeline overlay.

  2. The Event Hub Producer that you configured should now show under the list of available Connectors. Click on the relevant producer to be taken to the credential selection dialogue.

  3. Now, select the relevant Azure Event Hub Connection String which you created earlier and return to the pipeline view.

  4. You should now see the producer on your pipeline dashboard in the form of a connector unit, already connected to a Secure Store unit containing the credential.

  5. You are now ready to use this event producer in your pipeline by connecting it to other units and combining it with the existing Xapix low-code toolset to solve your data integration challenge.

  6. Don't forget to publish your project in order to start sending events to your Azure Event Hub, and use the Insights section of Xapix to verify that requests are going through as desired.
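
In addition to Insights, you can confirm from the Azure side that events are arriving. A minimal sketch, assuming the azure-eventhub Python package and the same placeholder connection string and Event Hub name as above, reads events back from the hub's default consumer group:

from azure.eventhub import EventHubConsumerClient

def on_event(partition_context, event):
    # Print each event body as it arrives from the Event Hub
    print(partition_context.partition_id, event.body_as_str())

client = EventHubConsumerClient.from_connection_string(
    conn_str="Endpoint=sb://<FQDN>/;SharedAccessKeyName=<KeyName>;SharedAccessKey=<KeyValue>",
    consumer_group="$Default",
    eventhub_name="<your Event Hub name>",
)
with client:
    # starting_position="-1" reads from the beginning of each partition; stop with Ctrl+C
    client.receive(on_event=on_event, starting_position="-1")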

Testing in Azure
  1. Once you are sending data to your Azure Event Hub from Xapix, it's easy to automatically forward the streaming data on to an Azure Blob Storage or Azure Data Lake Store account of your choice within Azure. To do so, go to the relevant Event Hub Namespace in your Azure Portal and select the Capture Feature.

  2. Then, on the Azure Event Hubs Capture Page, enable the feature via the on/off toggle and define time/size windows, containers and file formats for the event stream destination.

  3. To test whether events have reached the location in the desired format and to retrieve some examples, go to the relevant Storage Account.

  4. Then, use the Storage Explorer to navigate to the file location, e.g. the Blob Storage Container location. Once you have navigated through the folders, you should see a file in Avro format, e.g. 31.avro.

  5. Download the Avro file and use the strings command in your terminal to view the data within, as shown in the example below.

strings 31.avro
avro.codec
null
avro.schema
{"type":"record","name":"EventData","namespace":"Microsoft.ServiceBus.Messaging","fields":[{"name":"SequenceNumber","type":"long"},{"name":"Offset","type":"string"},{"name":"EnqueuedTimeUtc","type":"string"},{"name":"SystemProperties","type":{"type":"map","values":["long","double","string","bytes"]}},{"name":"Properties","type":{"type":"map","values":["long","double","string","bytes","null"]}},{"name":"Body","type":["null","bytes"]}]}
4294993392*12/3/2020 10:02:53 AM
{"ExtRef":"Machine1","ExtRefSrc":"XYZ","Tank_Level_TagName":"TANK-123","Tank_Level_Timestamp":"2020-12-03 10:02:52 +0000","Tank_Level_Value":824.2399999999999,"Timestamp":"2020-12-03 10:02:52 +0000"}