#MSBuild Announcements: Making AI Easier with Updates to Azure Machine Learning, #MLOps, and #DevOps for Machine Learning

I’ve been using Azure Machine Learning for some time, and I’m excited about the new updates that make it easier for AI novices to build, train, and deploy machine learning models. I believe in the power of AI, and I believe that everyone should get a chance to use it.

Microsoft are helping businesses to have easy access to AI, from conception to modelling through to business value creation and sustainability by making it production-ready. Azure Machine Learning helps data scientists and developers build and train AI models faster, then easily deploy those models to the cloud or the edge. By simplifying AI, Microsoft makes it less risky for businesses to get started with it.

Microsoft have announced a new automated machine learning user interface which is zero-code, meaning that you create your models visually by dragging and dropping. I’m also pleased to be delivering a Build session on Wednesday at 2pm, where I get to show off the new machine learning notebooks for code-first model development.

Right now, these capabilities are available in preview, so why not head over and have a play?

There are also new capabilities to help you to transition your models to production, at scale. New MLOps, or DevOps for Machine Learning, capabilities simplify the end-to-end lifecycle from model creation to deployment. To help businesses manage AI more easily, models can also be monitored through Azure DevOps integration. If you want to know more about MLOps, check out this video with Seth Juarez (Twitter), who is fantastic at explaining things.

At Build, it was also announced that there is now high-speed inferencing from cloud to edge. This enables low-latency, low-cost inferencing with the general availability of hardware-accelerated models that run on FPGAs in Azure. This capability is also available in preview on Azure Data Box Edge. ONNX Runtime support for NVIDIA TensorRT and Intel nGraph enables high-speed inferencing on NVIDIA and Intel chipsets.

To summarise: Microsoft are making AI accessible to businesses from conception through to production, helping data scientists and developers build and train models faster and deploy them to the cloud or the edge. These updates to Azure Machine Learning are all currently available.
