
Azure Messaging and Stream Processing updates cover recent releases from the Azure Messaging and Streaming services. In this post we cover releases from Microsoft Fabric event streams, Azure Event Hubs, Azure Event Grid, and Azure Stream Analytics.

 

 

 

Microsoft Fabric event streams

 

 

You can now ingest, capture, transform, and route real-time events to various destinations in Microsoft Fabric with a no-code experience using Microsoft Fabric event streams. It lets customers ingest real-time event data from external event sources into the data stores in Fabric, transforming events into the native format required by each target destination. For example, Microsoft Fabric event streams can transform events into Delta Lake format for the Lakehouse, map them into SQL columns based on the table schema, or filter events so that only homogeneous data is sent to a KQL table. To find out more about event streams, please see this blog.
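As a rough illustration of that filter-and-route idea (this is a conceptual sketch, not the Fabric API; the event shape and destination names are hypothetical):

```python
# Sketch of filter-and-route semantics: events are filtered and reshaped
# per destination, so each sink receives only homogeneous records.
# Event shape and destination names are hypothetical.

def route(events):
    lakehouse, kql = [], []
    for e in events:
        if e.get("kind") == "telemetry":           # filter: only telemetry goes to KQL
            kql.append({"ts": e["ts"], "value": e["value"]})
        lakehouse.append(e)                        # everything lands in the Lakehouse
    return lakehouse, kql

events = [
    {"kind": "telemetry", "ts": 1, "value": 20.5},
    {"kind": "audit", "ts": 2, "user": "alice"},
]
lakehouse, kql = route(events)
print(len(lakehouse), len(kql))  # 2 1
```

The point is that each destination sees data already in its expected shape, so no per-destination cleanup is needed downstream.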

 

 

 

Azure Event Hubs

 

 

Earlier this year, Azure Event Hubs brought its Dedicated self-serve clusters capability to general availability (GA). This capability has been well received and provides high-performance, low-latency event streaming. Now at BUILD, Azure Event Hubs has a number of additional announcements that make the overall platform even better.

 

  • Compaction GA for Kafka and AMQP – When configured, your Event Hub retains only the latest message for each key.
  • Mirror Maker 2 support GA – Use of Mirror Maker 2 is now supported with Azure Event Hubs. This makes it much easier to migrate from your Apache Kafka-compatible service to Event Hubs, and to run hybrid configurations where your source system is Apache Kafka compatible.
  • Kafka Connect support GA – Use of Kafka Connect is now supported with Azure Event Hubs.
  • JSON Schema in schema registry for Kafka – The Event Hubs schema registry now supports JSON Schema.
  • Managed Identity for streaming capture – You can now use Managed Identity when configuring the storage account that captures your streams.
  • Data Generator public preview – This release of the data generator is geared toward early users of Event Hubs and lets you send user-defined or canned events to your Event Hub.
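The keep-latest-per-key behavior of compaction can be sketched independently of any broker (this illustrates the semantics only, not the Event Hubs implementation):

```python
# Log compaction semantics: for each key, only the most recent value survives.
# A value of None models a tombstone, which deletes the key entirely.

def compact(log):
    latest = {}
    for key, value in log:          # later entries overwrite earlier ones
        if value is None:
            latest.pop(key, None)   # tombstone removes the key
        else:
            latest[key] = value
    return latest

log = [
    ("device-1", "v1"),
    ("device-2", "v1"),
    ("device-1", "v2"),   # supersedes device-1's earlier value
    ("device-2", None),   # tombstone: device-2 is removed
]
print(compact(log))  # {'device-1': 'v2'}
```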

 

Each of these items is covered in greater depth in its respective announcement blog:

Apache Kafka related changes

Managed Identity for streaming capture

JSON Schema in schema registry for Kafka

Data generator public preview

 

 

 

Azure Event Grid

 

 

Azure Event Grid now supports bi-directional communication over the MQTT v5 and MQTT v3.1.1 protocols in Public Preview. This makes Azure Event Grid the first MQTT broker of its kind in Azure, enabling users to send billions of messages from millions of simultaneously connected devices. Customers can now leverage the lightweight MQTT protocol to publish and subscribe to messages for telemetry ingestion, client command and control, and broadcast scenarios, building the next generation of applications faster on this fully managed service. MQTT messages can be processed further with powerful analytics services in Azure, such as Azure Stream Analytics and Azure Data Explorer, to derive insights from large volumes of data quickly.
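In MQTT's publish/subscribe model, subscribers use topic filters in which `+` matches exactly one topic level and `#` matches any remaining levels. A minimal matcher for that rule (a sketch of the MQTT topic-matching semantics, not any Azure SDK):

```python
# Minimal MQTT topic-filter matcher: '+' matches exactly one level,
# '#' (only valid as the final filter level) matches all remaining levels.

def matches(topic_filter, topic):
    f_parts = topic_filter.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":
            return True                  # matches the rest of the topic
        if i >= len(t_parts):
            return False                 # filter is longer than the topic
        if f != "+" and f != t_parts[i]:
            return False                 # literal level mismatch
    return len(f_parts) == len(t_parts)  # no trailing unmatched levels

print(matches("devices/+/telemetry", "devices/d1/telemetry"))  # True
print(matches("devices/#", "devices/d1/commands/reboot"))      # True
print(matches("devices/+", "devices/d1/telemetry"))            # False
```

This is the mechanism behind the command-and-control and broadcast scenarios above: a single filter like `devices/+/commands` fans in across every device.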

 


Azure Event Grid has always offered push delivery of these events, through which events are pushed to a destination via an event subscription. Our customers want to process events from highly secure environments without configuring a public endpoint, with control over the rate and volume of messages consumed, much larger throughput, and more out-of-the-box functionality from a single service. Today, we are also announcing the public preview of pull delivery of messages on custom topics. This gives applications more flexibility in handling events and enables the use of private links when receiving events.
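The key difference from push delivery is that the consumer decides when, and how many, events it receives, and then acknowledges them. A rough sketch of that receive/acknowledge loop (the broker here is a local stand-in, not the Event Grid API):

```python
import collections

# Local stand-in for a pull-delivery broker: the consumer requests a batch,
# processes it at its own pace, then acknowledges. Unacknowledged events
# can be released and become available for redelivery.

class PullQueue:
    def __init__(self):
        self._pending = collections.deque()
        self._inflight = {}
        self._next_token = 0

    def publish(self, event):
        self._pending.append(event)

    def receive(self, max_events):
        batch = []
        while self._pending and len(batch) < max_events:
            token = self._next_token
            self._next_token += 1
            event = self._pending.popleft()
            self._inflight[token] = event   # held until acknowledged
            batch.append((token, event))
        return batch

    def acknowledge(self, token):
        self._inflight.pop(token)           # done; will not be redelivered

    def release(self, token):
        self._pending.append(self._inflight.pop(token))  # redeliver later

q = PullQueue()
for i in range(5):
    q.publish(f"event-{i}")

batch = q.receive(max_events=3)             # consumer controls the batch size
for token, event in batch:
    q.acknowledge(token)

remaining = q.receive(max_events=10)
print(len(remaining))  # 2
```

Because the consumer initiates every `receive` call, this model works from behind a private endpoint and naturally throttles to whatever rate the application can sustain.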

 

 

 

Azure Stream Analytics

 

 

In response to valuable feedback from our customers and evolving market demands, we have been diligently enhancing the product to better align with those requirements. Today we are thrilled to announce a new pricing model and exciting new features that significantly improve the productivity, affordability, and security of processing streaming data.

 

 

 

Announcing Up to 80% Pricing Decrease

 

Customers are enjoying excellent performance, high availability, low-code/no-code experiences, and rich development tools. To make streaming analytics with Azure Stream Analytics even easier for our customers, we are happy to announce a new pricing model that offers discounts of up to 80%, with no changes to the full suite of capabilities. The new model offers graduated pricing, where eligible customers receive larger discounts as they grow their usage. This transparent, competitive pricing model supports the team's longtime mission of democratizing stream processing for all. The new pricing model takes effect on July 1, 2023. More information is available in this blog.
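"Graduated pricing" means each band of usage is billed at its own, progressively lower rate. The tier boundaries and rates below are made up purely to illustrate the arithmetic; they are not Azure Stream Analytics prices:

```python
# Tiered (graduated) billing: each band of usage is charged at that band's
# rate. Boundaries and rates are illustrative only, not actual Azure prices.

TIERS = [                   # (units covered by this tier, price per unit)
    (100, 1.00),
    (400, 0.50),
    (float("inf"), 0.20),   # everything beyond 500 units
]

def graduated_cost(units):
    cost, remaining = 0.0, units
    for size, rate in TIERS:
        band = min(remaining, size)     # usage that falls in this band
        cost += band * rate
        remaining -= band
        if remaining <= 0:
            break
    return cost

print(graduated_cost(50))   # 50.0  (all usage in the first tier)
print(graduated_cost(600))  # 100 + 200 + 20 = 320.0
```

The effective per-unit price falls as usage grows, which is what "discounts as they grow their usage" refers to.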

 

 

 

New features

 

  • Public Preview of VNet support: Virtual Network integration is available for Standard jobs. With a few clicks, you can lock down access, take advantage of private endpoints and service endpoints, and reap the benefits of full network isolation by deploying a containerized instance of your Azure Stream Analytics job inside the virtual network.
  • General Availability of Autoscale: The Autoscale capability is highly customizable and lets customers set specific trigger rules based on their own requirements. Azure Stream Analytics optimizes costs by running the right number of resources.
  • Private Preview of Kafka input & output: Customers can connect an Azure Stream Analytics job directly to Kafka clusters to natively ingest and output data. The Kafka adapters are backward compatible (to Kafka 0.10) and support all Kafka versions with the latest client release.
  • General Availability of exactly-once for Event Hubs and ADLS Gen2: End-to-end exactly-once semantics are supported when reading any streaming input and writing to Azure Data Lake Storage Gen2 or Event Hubs, guaranteeing no data loss and no duplicates.
  • Public Preview of Event Hubs Schema Registry integration: With the Event Hubs Schema Registry integration, schemas can be retrieved from the registry and used to deserialize data from an Event Hubs input. This reduces per-message overhead and enables efficient schema validation.
  • General Availability of job simulation in the Azure Stream Analytics VS Code extension: Customers can fine-tune the number of streaming units and simulate the job's running topology with this feature. Editing suggestions are also provided to enhance the query and unlock the job's full performance potential.
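One common way to get the exactly-once guarantee described above is to make the sink idempotent: the sink tracks the highest input offset it has committed, so events replayed after a failure are written at most once. The sketch below illustrates that technique in isolation; it is not Azure Stream Analytics' internal implementation:

```python
# Exactly-once via an idempotent, offset-tracking sink: replayed events
# (e.g. after a restart) are detected by offset and skipped, so each
# input event affects the output exactly once.

class IdempotentSink:
    def __init__(self):
        self.rows = []
        self.committed_offset = -1

    def write(self, offset, row):
        if offset <= self.committed_offset:
            return                      # duplicate from a replay; skip it
        self.rows.append(row)
        self.committed_offset = offset  # commit output and offset together

sink = IdempotentSink()
events = [(0, "a"), (1, "b"), (2, "c")]
for off, row in events:
    sink.write(off, row)

for off, row in events[1:]:             # upstream retry replays offsets 1-2
    sink.write(off, row)

print(sink.rows)  # ['a', 'b', 'c'] — no duplicates despite the replay
```

The crucial detail is that the output row and the committed offset are recorded atomically; if they could diverge, a crash between the two writes would reintroduce duplicates or loss.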

 

 

 

Each of these items is covered in greater depth in its respective announcement blog or documentation:

 

 

The full set of new features of Azure Stream Analytics is listed in Exciting announcements by Azure Stream Analytics at Build 2023.

 
