
Capture Event Hubs data in Delta Lake format with Stream Analytics no-code editor



We are happy to announce the public preview of capturing Event Hubs data in Delta Lake format with the Stream Analytics no-code editor.


What is Delta Lake?


Delta Lake is an open-source storage layer that brings reliability to data lakes. Delta Lake extends Parquet data files with a file-based transaction log to provide ACID transactions, scalable metadata handling, and unified streaming and batch data processing. The Delta Lake transaction log has a well-defined open protocol that any system can use to read the log.
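
To make this concrete, the short Python sketch below uses the open-source deltalake package (delta-rs) to write and read a small local Delta table; the path, column names, and sample values are illustrative assumptions and are not produced by Stream Analytics.

# A minimal local Delta Lake sketch (assumes: pip install deltalake pyarrow pandas).
# The path, schema, and values below are illustrative only.
import pyarrow as pa
from deltalake import DeltaTable, write_deltalake

events = pa.table({"device_id": ["d1", "d2"], "temperature": [21.5, 22.1]})

# Each write adds Parquet data files plus a JSON commit file under _delta_log/,
# which is what provides the ACID, versioned behavior described above.
write_deltalake("./events_delta", events)                 # creates version 0
write_deltalake("./events_delta", events, mode="append")  # creates version 1

dt = DeltaTable("./events_delta")
print(dt.version())    # latest table version, e.g. 1
print(dt.files())      # Parquet data files referenced by that version
print(dt.to_pandas())  # read the table back as a pandas DataFrame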


What is the Stream Analytics no-code editor?


The Stream Analytics no-code editor is a drag-and-drop design tool that helps customers develop Stream Analytics jobs without writing a single line of code. The experience provides a canvas that lets you connect to input sources and quickly see your streaming data. You can then transform and preview the data before writing it to your destination of choice in Azure. To learn more, see No-code stream processing through Azure Stream Analytics | Microsoft Learn.


How to use the no-code editor to capture Event Hubs data in Delta Lake format


To capture your event hub data into ADLS Gen2 storage in Delta Lake format, you will need:


  • An event hub with data available (one way to send sample events is sketched after this list)
  • An Azure Data Lake Storage Gen2 resource, i.e., an ADLS Gen2 account
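
If your event hub is still empty, the Python sketch below shows one possible way to send a few sample JSON events with the azure-eventhub SDK; the connection string, event hub name, and payload fields are placeholders, not values from this article.

# Sketch: send a few sample JSON events so the event hub has data to capture.
# Assumes: pip install azure-eventhub; the connection string and hub name are placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

CONNECTION_STR = "<event-hubs-namespace-connection-string>"
EVENT_HUB_NAME = "<event-hub-name>"

producer = EventHubProducerClient.from_connection_string(
    CONNECTION_STR, eventhub_name=EVENT_HUB_NAME
)

with producer:
    batch = producer.create_batch()
    for i in range(10):
        batch.add(EventData(json.dumps({"device_id": f"d{i}", "temperature": 20 + i})))
    producer.send_batch(batch)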


To access this capability, go to your event hub in the Azure portal, then select Features -> Process data or Capture:


[Image: The data capture entry points.]


You will be asked to provide a Stream Analytics job name to create the job. Once the job is created, you will enter the no-code editor canvas to configure data capture as shown below.


[Image: The no-code editor with tile configuration.]


Select the Azure Data Lake Storage Gen2 tile. On the Azure Data Lake Storage Gen2 configuration page, you can configure the necessary parameters, including the Delta table path, the ADLS Gen2 account, and so on.


Once everything is configured the way you want, you can start the job by selecting "Start" on the ribbon. Shortly afterward, data will begin to be captured into ADLS Gen2 storage in Delta Lake format.
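
If you want to verify the captured data, one option is to read the Delta table back with Apache Spark and the delta-spark package, as in the hedged sketch below; the storage account, container, table path, and account key are placeholders, account-key authentication is only one of several possibilities, and the Spark environment is assumed to have the Hadoop Azure (ABFS) connector available.

# Sketch: read the captured Delta table back from ADLS Gen2 with Spark + Delta Lake.
# Assumes: pip install pyspark delta-spark; all <placeholders> below are your own values.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("verify-eventhubs-capture")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Authenticate to ADLS Gen2 with an account key (one option among several).
spark.conf.set(
    "fs.azure.account.key.<storage-account>.dfs.core.windows.net",
    "<storage-account-key>",
)

table_path = "abfss://<container>@<storage-account>.dfs.core.windows.net/<delta-table-path>"
df = spark.read.format("delta").load(table_path)
df.show(10, truncate=False)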


Next steps


To learn more about the Stream Analytics no-code editor and the Delta Lake output of Stream Analytics, see the documents linked below:


Continue reading...
