Anastasia Valti
on 20 July 2020
MicroK8s was first released in late 2018 and has seen significant adoption from developers and enterprises alike ever since. Given the increasing demand and curiosity around the topic, we have already given an introduction to MicroK8s and covered how to deploy MicroK8s locally in previous blog posts. This time, we’ll take a look at MicroK8s’ applied value by examining common MicroK8s use cases, including AI/ML workflows.
What is MicroK8s?
For those who are still new to MicroK8s, let’s start by defining it. MicroK8s is a lightweight, pure-upstream Kubernetes that aims to reduce the barriers to entry for K8s and cloud-native application development. It comes in a single package that installs a single-node (standalone) K8s cluster in under 60 seconds, and you can also use it to create a multi-node cluster with just a few commands. MicroK8s ships all the Kubernetes core components and it is also opinionated. What this means is that many of the add-ons you would typically look for in Kubernetes, such as DNS and the Dashboard, are a single command away.
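For example, on an Ubuntu machine the whole flow looks roughly like this (the add-ons shown are illustrative; run `microk8s status` to see what your version offers):

```bash
# Install MicroK8s from the Snap Store (single-node cluster)
sudo snap install microk8s --classic

# Wait until the cluster is up
sudo microk8s status --wait-ready

# Common add-ons are a single command away
sudo microk8s enable dns dashboard

# Talk to the cluster with the bundled kubectl
sudo microk8s kubectl get nodes
```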
MicroK8s is available for most popular Linux distributions, as well as for Windows and macOS workstations through native installers for both operating systems. On Windows, you also have the option to get MicroK8s on WSL.
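As a rough sketch, the macOS route via Homebrew looks like this at the time of writing (treat the formula name as an assumption and check the MicroK8s docs for your platform):

```bash
# macOS: install the MicroK8s launcher via Homebrew
brew install ubuntu/microk8s/microk8s

# Provision the VM that hosts the cluster
microk8s install

# From here on the commands match the Linux experience
microk8s status --wait-ready
```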
It uses the snap packaging mechanism, which is really convenient because it brings automatic updates. This means that as soon as a new stable Kubernetes version is available upstream, your MicroK8s cluster is automatically updated. You will similarly get all available security patches for your K8s.
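If you want to see or control which Kubernetes track your cluster follows, the standard snap tooling applies. For instance (the channel name below is only an example):

```bash
# List the available MicroK8s tracks and channels
snap info microk8s

# Follow a specific Kubernetes release track instead of the default
sudo snap refresh microk8s --channel=1.18/stable
```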
What can you use MicroK8s for?
So who is MicroK8s for? MicroK8s can be a developer’s friend just as much as the go-to Kubernetes for enterprises. Just need a Kubernetes? MicroK8s is the easiest way to get one. You can use it on your local workstation for development and testing.
Its low resource footprint and support for both ARM and Intel architectures make MicroK8s ideal for the edge, IoT and appliances. We see a lot of people using MicroK8s on their Raspberry Pis, and we also have a few great tutorials for that, which you can access through the ‘attachments’ tab beneath this video.
What is particularly convenient about MicroK8s is the zero-ops experience. Even if you have very little hands-on experience with Kubernetes, you can have your Kubernetes cluster up and running with a single command, get security updates without lifting a finger and run your add-ons without having to worry about configuration. Remember, however: if you want to configure it you CAN – at the end of the day, it’s just a Kubernetes.
The latest improvement MicroK8s brings is HA clustering. Get a fully resilient K8s cluster by joining three or more nodes and have MicroK8s automatically distribute the control plane across them.
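A minimal sketch of forming such a cluster, assuming three machines that can reach each other (the IP address and token below are placeholders printed for you by `microk8s add-node`):

```bash
# On the first node: generate a join token
sudo microk8s add-node
# ...prints a command of the form:
#   microk8s join 10.0.0.1:25000/<token>

# On the second and third nodes: join the cluster
sudo microk8s join 10.0.0.1:25000/<token>

# Back on any node: confirm the cluster is up with all nodes present
sudo microk8s status
sudo microk8s kubectl get nodes
```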
MicroK8s gets full enterprise support from Canonical, with a 10-year commitment for critical deployments. We are actively working with all major clouds to simplify multi-cloud and hybrid-cloud deployments and to ensure our users get a care-free, production-grade Kubernetes.
MicroK8s features
Here you can see the complete set of features and add-ons that are currently supported by MicroK8s.
- Use Juju, JAAS or Helm to automate operations.
- Use DNS, ingress, Flannel and Cilium for your project’s networking.
- Use GPU acceleration for your deep learning projects and, if you are an AI/ML geek, combine it with Kubeflow as your home edition HPC.
- Use Istio or Linkerd for service mesh and MetalLB for load balancing.
- Use Knative to write serverless applications.
And if you need logging and monitoring for your projects, we have a full toolbox that includes Prometheus, Grafana, Elastic, Fluentd, Kibana and Jaeger.
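Each of these ships as an add-on, so turning one on is a one-liner. A few illustrative examples follow; the MetalLB address range is a placeholder you would adapt to your own network, and the exact add-on list varies by MicroK8s version:

```bash
# Networking and ingress
sudo microk8s enable dns ingress

# Service mesh and bare-metal load balancing
sudo microk8s enable istio
sudo microk8s enable metallb:10.64.140.43-10.64.140.49

# Logging, monitoring and tracing
sudo microk8s enable prometheus fluentd jaeger
```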
MicroK8s use case #1: DevOps & CI/CD pipelines
Now that we’ve seen what MicroK8s is, let’s talk a bit about how it can be used by taking a look at some common use cases. The first one is DevOps and setting up CI/CD pipelines. It doesn’t matter whether you need Kubernetes to run your tests in a local or a production CI/CD pipeline: MicroK8s is the easiest way to get Kubernetes. Start by building your containerised apps locally using MicroK8s with, for example, GitLab. Then move your apps to a test environment that is also built on another MicroK8s instance, and finally deploy your containers in production. What I just described is a multi-step process that normally requires a lot of configuration to set up. With MicroK8s that is no longer the case.
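As an illustration only, a local build-and-test loop against MicroK8s could look like the sketch below. The image name, manifest path and test script are assumptions, and the built-in registry add-on (which listens on port 32000) keeps everything on one machine; your Docker daemon may need localhost:32000 listed as an insecure registry.

```bash
# Enable a local registry the cluster can pull from
sudo microk8s enable registry

# Build the candidate image and push it into the cluster's registry
docker build -t localhost:32000/myapp:ci .
docker push localhost:32000/myapp:ci

# Deploy the candidate build and wait for it to roll out
sudo microk8s kubectl apply -f deploy/myapp.yaml
sudo microk8s kubectl rollout status deployment/myapp

# Run whatever test suite you point at the cluster
./run-integration-tests.sh
```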
After you finish with your tests you can, with a single command, either reset your K8s environment or even stop the MicroK8s service to save resources.
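For instance:

```bash
# Throw away everything deployed on the cluster and return to a clean state
sudo microk8s reset

# ...or pause the whole cluster to free CPU and memory until next time
sudo microk8s stop
sudo microk8s start   # when you need it again
```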
MicroK8s use case #2: AI/ML and GPU acceleration
Another significant use case is building AI/ML pipelines on top of MicroK8s. When we look at machine learning teams, data scientists often end up working in silos – meaning they each create different parts of the workflow, resulting in hard-to-share and hard-to-maintain notebooks and scripts. This is because some of the tools they use do not enable multi-user environments.
Further, setting up the right infrastructure for cost-effective AI/ML is cumbersome, and data scientists often lack the necessary know-how to do the setup themselves.
MicroK8s addresses both the silo and the infrastructure challenges, thanks to Kubeflow and GPU acceleration. Kubeflow bundles one of the latest and greatest collections of AI/ML tools, enabling teams to share AI workflows. Running Kubeflow on top of MicroK8s is as easy as “microk8s.enable kubeflow”. GPU acceleration takes care of the compute requirements of model training. Again, enabling GPU acceleration with MicroK8s is just a command away.
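In practice that means something like the following (the dotted `microk8s.enable` form and the bare `microk8s enable` form are interchangeable on Linux; the GPU add-on assumes an NVIDIA GPU on the host, and the Kubeflow add-on’s availability depends on your MicroK8s version):

```bash
# Expose the host's NVIDIA GPU to the cluster for training workloads
sudo microk8s enable gpu

# Deploy Kubeflow on top of the same cluster
sudo microk8s enable kubeflow
```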
You can deploy MicroK8s clusters on top of any infrastructure, such as a laptop, a workstation, a cloud VM or an on-prem server. This abstracts away the complexity of setting up the substrate and lets data scientists focus on their models instead.
Key takeaways and resources on MicroK8s
- Production-ready, fully compatible K8s on rails with selected add-ons
- MicroK8s is available for Linux, Windows and macOS
- Use it for dev & testing, clusters, DevOps, AI/ML, IoT, Edge and appliances
Want a more thorough walkthrough of MicroK8s and a quick demo? Take a look at our Introduction to MicroK8s webinar.