MMS • Steef-Jan Wiggers
Microsoft recently announced the open-source release of Azure DeepStream Accelerator (ADA), built in collaboration with Neal Analytics and NVIDIA, allowing developers to quickly build edge AI solutions with native Azure services integration.
Specifically, the Azure DeepStream Accelerator provides a simplified developer experience for deploying accelerated computer vision workloads at the edge. Microsoft aims to give developers the ability to create NVIDIA DeepStream AI-based solutions and integrate them with a range of Azure services, such as Blob Storage and Azure Monitor.
The Azure DeepStream Accelerator consists of several components:
- An AI Pipeline Container, which ingests USB or RTSP camera streams and applies AI models to the video frames. The model outputs are multiplexed together and sent to the Business Logic Container for user logic to handle.
- A Controller Module responsible for configuring the whole system.
- A Business Logic Container (BLC) where a user’s application logic resides and is configured through the Controller Module.
- A Video Uploader Container responsible for uploading inference results and MP4 video snippets to the cloud.
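To make the flow between these components concrete, the following is a minimal sketch of the kind of logic a Business Logic Container might run: it receives a JSON inference message from the AI Pipeline Container and decides whether a video snippet should be uploaded. The message schema (a `detections` list with `label` and `confidence` fields) and the function name are assumptions for illustration, not ADA's actual interface.

```python
# Hedged sketch of hypothetical Business Logic Container (BLC) logic.
# The payload schema below is an assumption, not ADA's documented format.
import json


def handle_inference(payload: str,
                     min_confidence: float = 0.6,
                     labels_of_interest: tuple = ("person",)) -> bool:
    """Return True if any detection warrants uploading a video snippet."""
    detections = json.loads(payload).get("detections", [])
    return any(
        d.get("label") in labels_of_interest
        and d.get("confidence", 0.0) >= min_confidence
        for d in detections
    )


# Example payload as the AI Pipeline Container might emit it (schema assumed):
msg = json.dumps({"detections": [{"label": "person", "confidence": 0.82}]})
print(handle_inference(msg))  # True: a "person" above the confidence threshold
```

In a real deployment, a decision like this would typically trigger a message to the Video Uploader Container rather than just return a boolean.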
Tony Sampige, a principal program manager at Microsoft, explains in a Tech Community blog post:
The open-source project includes tools to ease the developer journey, including a region of interest widget and supplementary developer tools that developers can leverage to build, manage, and deploy their AI solutions to NVIDIA’s AGX Orin edge devices and more. Additionally, ADA provides support for 30+ pre-built AI models out of the box (Nvidia, ONNX, TF, Caffe, Pytorch, and Triton models) and the ability to bring your own Model/Container for deployment to IoT edge devices.
ADA will give your team a jump start and can save your company tons of time and money. ADA provides a low-code environment and allows any developer to build complex video analytics applications without mastering all the underlying technologies.
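Since ADA's containers are deployed as Azure IoT Edge modules, a deployment might be described with a manifest along the following lines. This is an abridged, illustrative sketch: the module names and image placeholders are assumptions, and a real IoT Edge manifest also specifies module types, restart policies, routes, and runtime settings.

```json
{
  "modulesContent": {
    "$edgeAgent": {
      "properties.desired": {
        "modules": {
          "controller": { "settings": { "image": "<controller-image>" } },
          "ai-pipeline": { "settings": { "image": "<ai-pipeline-image>" } },
          "business-logic": { "settings": { "image": "<your-blc-image>" } },
          "video-uploader": { "settings": { "image": "<video-uploader-image>" } }
        }
      }
    }
  }
}
```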