
Bridging Cloud services and AI solutions at the Edge



Microsoft is excited to announce the open-source release of Azure DeepStream Accelerator (ADA) in collaboration with Neal Analytics and NVIDIA.


Creating an integrated edge and cloud AI solution is difficult. To tackle this challenge, Microsoft partnered with Neal Analytics and NVIDIA to build an open-source solution that bridges the gap between cloud services and AI solutions deployed on the edge, enabling developers to easily build Edge AI solutions with native Azure services integration. Our goal is twofold: 1) help developers already using Azure services leverage the power of computer vision at the edge via DeepStream, and 2) enable DeepStream developers to leverage the power of Azure services in the cloud.

Through ADA we aim to give developers the ability to create NVIDIA DeepStream-based AI solutions and integrate them with a multitude of Azure services such as Blob Storage and Monitor. The open-source project includes tools that ease the developer journey, including a region of interest widget and supplementary developer tools for building, managing, and deploying AI solutions to NVIDIA AGX Orin edge devices and more. Additionally, ADA supports 30+ pre-built AI models out of the box (NVIDIA, ONNX, TensorFlow, Caffe, PyTorch, and Triton models) and lets you bring your own model or container for deployment to IoT Edge devices.


Azure DeepStream Accelerator features

  • Simplified development process

    • Auto selection of the AI model execution and inference provider: one of several execution providers, such as ORT (ONNX Runtime), CUDA, and TensorRT, is selected automatically to simplify the development process (see the first sketch after this list).

  • Customizable Region of Interest (ROI) to enable custom use cases

    • Region of Interest (ROI) configuration widget: a web app widget is included for customizing ROIs to enable event detection for any use case (a minimal ROI check is sketched after this list).

  • Simplified configuration for pre/post-processing

    • Developers can add a Python-based model/parser using a configuration file, instead of hardcoding it into the pipeline.

  • A broad set of pre-built AI model frameworks

    • Support for many of the most common CV models in use today, for example NVIDIA TAO, ONNX, Caffe, UFF (TensorFlow), and Triton.

  • Bring Your Own Model

    • Support for model/container customization, USB/RTSP cameras and pre-recorded video stream(s), event-based video snippet storage in Azure Storage and Alerts, and AI model deployment via Azure IoT Module Twin updates (both are sketched after this list).

  • Multiple trackers

    • Support for the NVIDIA tracker and a first-party (1P) tracker for tracking use cases.

  • Custom models with DeepStream and the TAO Toolkit

    • An example tutorial that illustrates how to build your own custom model using NVIDIA's TAO Toolkit and integrate it with DeepStream. The example uses a MaskRCNN model provided in the TAO Toolkit to perform transfer learning.
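To make the execution-provider selection above concrete, here is a minimal sketch of how a priority-ordered provider list works in plain ONNX Runtime. ADA performs this selection automatically, so this is not ADA's own API; the model path is a placeholder.

    # Minimal sketch: ONNX Runtime falls back through a priority-ordered list of
    # execution providers. ADA automates this choice; "model.onnx" is a placeholder.
    import onnxruntime as ort

    preferred = [
        "TensorrtExecutionProvider",  # fastest when TensorRT is installed
        "CUDAExecutionProvider",      # GPU without TensorRT
        "CPUExecutionProvider",       # always available
    ]
    session = ort.InferenceSession("model.onnx", providers=preferred)
    print("Providers actually in use:", session.get_providers())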

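The ROI widget ultimately yields polygon coordinates that detections are tested against. A minimal, framework-agnostic sketch of such a test follows; the coordinates are made-up pixel values, and this is not ADA's actual ROI schema.

    # Minimal sketch: ray-casting test for whether a detection center lies inside
    # a polygonal ROI. The ROI and detection coordinates below are hypothetical.
    def point_in_polygon(x, y, polygon):
        """Return True if (x, y) is inside the polygon given as (x, y) vertices."""
        inside = False
        n = len(polygon)
        for i in range(n):
            x1, y1 = polygon[i]
            x2, y2 = polygon[(i + 1) % n]
            if (y1 > y) != (y2 > y):
                # x-coordinate where the edge crosses the horizontal line through y
                x_cross = (x2 - x1) * (y - y1) / (y2 - y1) + x1
                if x < x_cross:
                    inside = not inside
        return inside

    roi = [(100, 100), (500, 100), (500, 400), (100, 400)]
    if point_in_polygon(320, 240, roi):
        print("Detection is inside the ROI -> raise an event")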
 
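Deploying a new model via an Azure IoT Module Twin update is a standard IoT Hub pattern. The sketch below uses the azure-iot-hub SDK; the connection string, device ID, module ID, and desired-property name are hypothetical and do not reflect ADA's actual twin schema.

    # Minimal sketch: push a new model URI to an edge module through a module
    # twin update. All identifiers and property names here are hypothetical.
    from azure.iot.hub import IoTHubRegistryManager
    from azure.iot.hub.models import Twin, TwinProperties

    IOTHUB_CONNECTION_STRING = "HostName=...;SharedAccessKeyName=...;SharedAccessKey=..."
    DEVICE_ID = "agx-orin-device"                # hypothetical edge device
    MODULE_ID = "azure-deepstream-accelerator"   # hypothetical module name

    registry = IoTHubRegistryManager(IOTHUB_CONNECTION_STRING)
    patch = Twin(properties=TwinProperties(desired={"modelUri": "<blob-url-of-new-model>"}))

    current = registry.get_module_twin(DEVICE_ID, MODULE_ID)
    registry.update_module_twin(DEVICE_ID, MODULE_ID, patch, current.etag)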

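Event-based snippet storage likewise reduces to uploading a clip to Blob Storage when an ROI event fires. A sketch with the azure-storage-blob SDK; the connection string, container, and file names are placeholders rather than ADA's conventions.

    # Minimal sketch: upload an event-triggered video snippet to Azure Blob Storage.
    # Connection string, container, and blob names are placeholders.
    from azure.storage.blob import BlobServiceClient

    STORAGE_CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
    service = BlobServiceClient.from_connection_string(STORAGE_CONNECTION_STRING)
    container = service.get_container_client("roi-event-snippets")

    with open("event_clip.mp4", "rb") as clip:
        container.upload_blob(name="camera1/event_clip.mp4", data=clip, overwrite=True)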
To check out the full project, visit GitHub.

Our team is partnering with Neal Analytics to support this project, and we can't wait to see what the community builds!


If you are new to DeepStream and want to learn more about it, we recommend checking out NVIDIA's documentation, starting with DeepStream SDK | NVIDIA Developer.

 
