Marek Linka, Senior Software Developer

Operationalizing machine learning models 1/3 – Azure ML Workbench

Introduction

Machine learning and AI are everywhere. More and more people are discovering the advantages that learning algorithms can provide – decreasing costs, improving efficiency, eliminating menial tasks, etc. Knowing at least a little bit about how these algorithms work is quickly becoming a necessity.

While the actual construction, training, and evaluation of ML models are the topic of numerous courses, talks, and tutorials, one crucial area of working with ML is much less discussed – bringing the trained models to production (also known as operationalization).

There are many ways of accomplishing this, depending on your technological stack of choice. A lot of workflows are currently moving to the cloud, and it’s no surprise machine learning is one of them – having on-demand compute makes scaling demanding machine learning models a breeze.

Microsoft Azure is currently testing a new set of capabilities for bringing models to production. Having seen this in action at the WeAreDevelopers conference in Vienna, I wanted to get some hands-on experience with it before it reaches general availability. This series of articles is about what I learned along the way.

The scenario

A customer I’m working for is currently exploring ways to find and extract a piece of information from a photo of an official document. I created a small convolutional neural network for this purpose and reached a very good level of accuracy (a localization error below 3 px).

The model was written in Python using Keras and Google TensorFlow and was trained in-house (not in the cloud) on a single GPU. When saved to disk, the model is about 1.2 MB.
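To make the starting point concrete, here is a minimal sketch of what building and saving such a model could look like in Keras. The architecture, input shape, and the model.h5 file name are assumptions made purely for illustration, not the actual model:

# Purely illustrative sketch; layer sizes, input shape, and file name are assumptions
from keras import layers, models

model = models.Sequential([
    layers.Conv2D(16, 3, activation="relu", input_shape=(128, 128, 1)),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(2),  # regresses the (x, y) position of the information
])
model.compile(optimizer="adam", loss="mse")

# After training, the whole model is serialized into a single small file,
# which is the artifact we will operationalize in the rest of the series.
model.save("model.h5")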

I wanted to operationalize this model using Azure to see how it would perform in the wild.

Enter Azure Machine Learning Workbench

Azure ML Workbench (AMLW), as the name suggests, is a set of tools and libraries for creating, testing, and deploying Azure-based ML models. Since our model is already created and trained, we don’t really need the experimentation features or the Workbench application itself – for us, the most important part of the toolset is the Azure ML CLI.

This CLI contains the necessary commands and integrations that will allow us to take our pre-trained model, expose it via a web service, package the whole thing into a Docker image, and either test it locally or run it in an Azure Kubernetes cluster for production-grade deployment at scale.
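To give an idea of what this looks like in practice, the web service is driven by a small Python scoring script that the CLI packages together with the model, exposing an init() and a run() function. The sketch below only shows the general shape – the file name, the model.h5 path, and the JSON input format are assumptions, and the actual script for our model is covered in the next part:

# score.py: a minimal sketch of the scoring script the Azure ML CLI packages
# with the model; the model path and the input format are assumptions.
import json
import numpy as np
from keras.models import load_model

model = None

def init():
    # Runs once when the service container starts: load the trained model.
    global model
    model = load_model("model.h5")

def run(raw_input):
    # Runs for every request; expects a JSON payload with an "image" field.
    image = np.array(json.loads(raw_input)["image"], dtype="float32")
    prediction = model.predict(image[np.newaxis, ...])  # add a batch dimension
    return json.dumps({"position": prediction[0].tolist()})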

Quite a few steps are required to go from a local model to a Kubernetes-deployed model cluster. I will go over each step in detail and explain the how and why of it.

Getting the required tools

First things first – in order to follow the steps, we need to install the necessary software. If you use an Azure Data Science Virtual Machine, all these tools should already be available, so you can skip to the next section.

Since we are going to use Azure, the first thing you need is an Azure account. A free one should be sufficient, but if you want to do the Kubernetes deployment, a paid subscription works much better.

In the Azure portal, we create a new Machine Learning Experimentation resource, including a Model Management account. We will need to provide names for various resources – keep these at hand as we will need them later. There is a free tier of the Model Management service that should serve well for testing the process.

Next, download and install the Azure Machine Learning Workbench CLI (you need to have Python installed for this to work):
pip install -r https://aka.ms/az-ml-o16n-cli-requirements-file

After the installation finishes, install Docker. Since the final output of the whole process is a Docker image, having Docker locally will help with debugging and testing.

Next, we need to log into the Azure account:

az login

This will display a link for you to follow, where you enter the code shown in the command line to complete the sign-in.

Setting up the CLI environment

We now need to bind the CLI to the Azure account and resource groups:

az provider register -n Microsoft.MachineLearningCompute
az provider register -n Microsoft.ContainerRegistry
az provider register -n Microsoft.ContainerService

Registering these three providers might take a minute or two; we can monitor the progress using

az provider show -n [provider name]

Once all three providers show as “Registered”, we can proceed to the Azure environment setup:

az ml env setup -l westeurope -n amlwenv -g AMLW

If you have more than one Azure subscription available, the tool will ask which one to use. The example command line above will create a new environment called “amlwenv” in the West Europe Azure region and within a resource group called AMLW (it’s the same resource group we used for the Experimentation resource above). Provisioning a new environment will take a minute or two.

Once the provisioning succeeds, we need to activate the environment:

az ml env set -n amlwenv -g AMLW

You should get a message similar to this one:

Setup phase complete!

Conclusion

In this initial part of the series, I introduced the Azure Machine Learning Workbench as a new way of bringing machine learning models to production.

We saw how to set up an Azure Machine Learning Experimentation resource and its Model Management account. We also installed the required CLI tools, and created and activated a new ML environment that we will use for testing the model’s service locally.

In the next part of the series, we will see how to package, describe, containerize, and test the ML service, complete with code examples.

 

Read next article: Operationalizing machine learning models 2/3 – From model to service
