
Valentin Viennot
on 3 December 2021

How to colourise black & white pictures: OpenVINO™ on Ubuntu containers demo (Part 1)


Christmas is coming, but you don’t have a present on hand for your (grand)parents (Mom, Dad, if you’re reading this – I promise this post isn’t drawn from real life!). Looking for a solution? If your loved ones happened to live through the era of monochrome photography, keep reading. You can work some magic with Intel® Distribution of OpenVINO™ Toolkit on Ubuntu containers to give their old pictures new life. Hopefully, this blog will save Christmas!

OpenVINO on Ubuntu containers: making developers’ lives easier

And if you're simply curious about AI/ML and what you can do with OpenVINO on Ubuntu containers, this blog is an excellent read for you too.

Docker image security isn’t only about provenance and supply chains; it’s also about the user experience. More specifically, the developer experience.

Removing toil and friction from your app development, containerisation, and deployment processes keeps developers from turning to untrusted sources or bad practices in the name of getting things done. As AI/ML development often requires complex dependencies, it's the perfect proof point for secure and stable container images.

Why Ubuntu Docker images?

As the most popular container image in its category, the Ubuntu base image provides a seamless, easy-to-set-up experience. From public cloud hosts to IoT devices, the Ubuntu experience is consistent and loved by developers.

One of the main reasons for adopting Ubuntu-based container images is the software ecosystem. More than 30,000 packages are available with one "install" command, with the option to subscribe to enterprise support from Canonical. It just makes things easier.

In the next and final blog (coming soon, stay tuned), you'll see that using Ubuntu Docker images greatly simplifies containerising each component. We even used a prebuilt and preconfigured container image for the NGINX web server from the LTS images portfolio maintained by Canonical for up to 10 years.

Beyond providing a secure, stable, and consistent experience across container images, Ubuntu is a safe choice from bare metal servers to containers. Additionally, it comes with hardware optimisation on clouds and on-premises, including Intel hardware.

Why OpenVINO?

When you’re ready to deploy deep learning inference in production, binary size and memory footprint are key considerations – especially when deploying at the edge. OpenVINO provides a lightweight Inference Engine with a binary size of just over 40MB for CPU-based inference. It also provides a Model Server for serving models at scale and managing deployments.

OpenVINO includes open-source developer tools to improve model inference performance. The first step is to convert a deep learning model (trained with TensorFlow, PyTorch, etc.) to an Intermediate Representation (IR) using the Model Optimizer. Converting from FP32 to FP16 precision at this stage, for instance, halves the model's memory usage. You can unlock additional performance with OpenVINO's low-precision tools: the Post-training Optimisation Tool (POT) and the Neural Network Compression Framework (NNCF) provide quantisation, binarisation, filter pruning, and sparsity algorithms. As a result, throughput increases on Intel CPUs, integrated GPUs, VPUs, and other accelerators.
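As a rough illustration, converting a trained ONNX model into a half-precision IR could look like the sketch below. It assumes the openvino-dev pip package (which provides the mo entry point); the model filename is a placeholder, and flag names vary slightly between OpenVINO releases.

    # Hypothetical example: convert an ONNX model to an FP16 OpenVINO IR.
    # Assumes: pip install openvino-dev
    mo --input_model my_model.onnx \
       --data_type FP16 \
       --output_dir ir/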

Open Model Zoo provides pre-trained models that work for real-world use cases to get you started quickly. Additionally, Python and C++ sample code demonstrates how to interact with the models. More than 280 pre-trained models are available to download, from speech recognition to natural language processing and computer vision.

For this blog series, we will use the pre-trained colourisation models from Open Model Zoo and serve them with Model Server.
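For example, downloading one of those colourisation models and converting it to IR could look like the following sketch, using the Open Model Zoo tools shipped with the openvino-dev pip package. The model name is just an example; you can list the available names with omz_downloader --print_all.

    # Hypothetical example: fetch a pre-trained colourisation model and convert it to IR.
    omz_downloader --name colorization-v2 --output_dir models/
    omz_converter  --name colorization-v2 --download_dir models/ --output_dir models/ir/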

OpenVINO and Ubuntu container images

The Model Server – by default – ships with the latest Ubuntu LTS, providing a consistent development environment and an easy-to-layer base image. The OpenVINO tools are also available as prebuilt development and runtime container images.
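To give an idea of the developer experience, serving a single model locally with the prebuilt Model Server image can be as simple as the sketch below. The model name and paths are placeholders; OVMS expects numbered version sub-directories such as models/colorization/1/.

    # Hypothetical example: serve one model with the OpenVINO Model Server container image.
    docker run -d --rm -p 9000:9000 -p 8000:8000 \
      -v "$(pwd)/models/colorization:/models/colorization" \
      openvino/model_server:latest \
      --model_name colorization --model_path /models/colorization \
      --port 9000 --rest_port 8000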

To learn more, check out the Canonical LTS Docker Images portfolio and the OpenVINO™ documentation.

Neural networks to colourise a black & white image

Now, back to the matter at hand: how will we colourise grandma and grandpa’s old pictures? Thanks to Open Model Zoo, we won’t have to train a neural network ourselves and will only focus on the deployment. (You can still read about it.)

Figure: architecture diagram of the colouriser demo app running on MicroK8s.

Our architecture consists of three microservices: a backend, a frontend, and the OpenVINO Model Server (OVMS) to serve the neural network predictions. The Model Server component hosts two different demonstration neural networks to compare their results (V1 and V2). These components all use the Ubuntu base image for a consistent software ecosystem and containerised environment.
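For reference, serving two models side by side from a single Model Server instance is typically done with a small configuration file passed via --config_path. The sketch below is hypothetical; the model names simply mirror the V1/V2 labels in the diagram.

    {
      "model_config_list": [
        { "config": { "name": "colorization-v1", "base_path": "/models/colorization-v1" } },
        { "config": { "name": "colorization-v2", "base_path": "/models/colorization-v2" } }
      ]
    }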

If you're not familiar with this type of microservices architecture, a few introductory reads on the topic are worth your time before continuing.

gRPC vs REST APIs

The OpenVINO Model Server provides inference as a service via HTTP/REST and gRPC endpoints for serving models in OpenVINO IR or ONNX format. It also offers centralised model management to serve multiple different models or different versions of the same model and model pipelines.

The server offers two sets of APIs to interface with it: REST and gRPC. Both APIs are compatible with TensorFlow Serving and expose endpoints for prediction, checking model metadata, and monitoring model status. For use cases where low latency and high throughput are needed, you’ll probably want to interact with the model server via the gRPC API. Indeed, it introduces a significantly smaller overhead than REST. (Read more about gRPC.)
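To make the gRPC path concrete, a minimal Python client against the Model Server might look like the sketch below. It relies on the grpcio and tensorflow-serving-api packages; the model name, input tensor name (data_l) and shape are placeholders, so check your model's metadata (for example via the REST /v1/models/<name>/metadata endpoint) for the real values.

    # Hypothetical gRPC prediction call against OVMS (TensorFlow Serving compatible API).
    # Assumes: pip install grpcio tensorflow tensorflow-serving-api
    import grpc
    import numpy as np
    from tensorflow import make_tensor_proto
    from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

    channel = grpc.insecure_channel("localhost:9000")
    stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

    request = predict_pb2.PredictRequest()
    request.model_spec.name = "colorization"

    # Lightness (L) channel of a LAB image, batch of 1; the shape is model-specific.
    lab_l = np.zeros((1, 1, 256, 256), dtype=np.float32)
    request.inputs["data_l"].CopyFrom(make_tensor_proto(lab_l, shape=lab_l.shape))

    response = stub.Predict(request, timeout=10.0)
    print(list(response.outputs.keys()))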

OpenVINO Model Server is distributed as a Docker image with minimal dependencies. For this demo, we will use the Model Server container image deployed to a MicroK8s cluster. This combination of lightweight technologies is suitable for small deployments. It suits edge computing devices, performing inferences where the data is being produced – for increased privacy, low latency, and low network usage.
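As an indication of how little setup this requires, standing up a single-node MicroK8s cluster and deploying the server could look like the sketch below; ovms-deployment.yaml is a hypothetical manifest wrapping the openvino/model_server image, and the next blog walks through the actual deployment.

    # Hypothetical example: single-node MicroK8s cluster running the Model Server.
    sudo snap install microk8s --classic
    microk8s status --wait-ready
    microk8s enable dns storage
    microk8s kubectl apply -f ovms-deployment.yaml
    microk8s kubectl get pods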

Ubuntu minimal container images

Since 2019, the Ubuntu base images have been minimal, with no "slim" flavours. While there's room for improvement (stay tuned), the Ubuntu Docker image is less than a 30MB download, making it one of the smallest Linux distributions available as a container image.

In terms of Docker image security, size is one consideration, and reducing the attack surface is a fair investment. However, as is often the case, size isn't everything; maintenance is the most critical aspect. The Ubuntu base image, with its rich and active software ecosystem and community, is usually a safer bet than smaller distributions.

A common trap is to start smaller and install loads of dependencies from many different sources. The end result will have poor performance, use non-optimised dependencies, and not be secure. You probably don’t want to end up effectively maintaining your own Linux distribution… So, let us do it for you.

Colourise black & white pictures: what’s next?

In the next and final blog in this series, we’ll start coding. I promise you’ll come out of reading it with a concrete solution for a Christmas present. You can already begin scanning the old photo books to craft their coloured version.

In the final blog, we will:

  • Prepare the backend and frontend code (source code included!)
  • Craft Dockerfiles based on Ubuntu and LTS images for each component
  • Give an overview of container images best practices using multi-stage builds
  • Deploy the OpenVINO Model Server on MicroK8s
  • Connect all these microservices to complete the whole picture
  • Colourise some black and white photos!

You can also sign up for the on-demand version of our joint OpenVINO demo webinar with Intel.

Keep reading, the next blog is live!
