Smooth sailing with RAIN and Mirantis k0s
Originally developed at Google, Kubernetes was released as an open-source software solution for distributed computing orchestration in 2015. Google partnered with the Linux Foundation to form the Cloud Native Computing Foundation (CNCF), which has since maintained Kubernetes.
The concept was groundbreaking: it enables computing workloads to run inside so-called containers on virtually any networked computing infrastructure, even across different physical locations, in a coordinated manner. In today's terms, and in the context of RAIN, that means it can orchestrate distributed computing resources across the edge-to-cloud continuum.
The recent emergence of cloud-native, distributed computing is part of a long evolution in information technology (IT) and operational technology (OT). For more context, please read one of our earlier articles, ‘How to ride the ‘edge-to-cloud continuum’’.
The whole idea is reminiscent of Sun Microsystems' John Gage's famous truism: "The network is the computer."
A proliferation of distros
Kubernetes, or k8s or "Kube" for short, has always been complex software to develop, deploy and configure, and to build applications for. But as it has matured, it has gradually become easier for developers to create their own distributions.
In recent years there has been a proliferation of distros with different flavors and characteristics. Red Hat's OpenShift, SUSE's k3s, and Canonical's MicroK8s are among the better-known examples.
The Big Three hyperscalers offer distributed computing platforms with managed Kubernetes services under the hood: Microsoft’s Azure Kubernetes Service (AKS), Amazon’s Elastic Kubernetes Service (EKS), and Google Kubernetes Engine (GKE).
Many enterprise customers have committed to one or more of the Big Three, which means that they don’t need to worry about Kubernetes distros and versions or how to manage them.
Removing unnecessary complexity
At RAIN we are completely fine with whatever computing infrastructure our customers use and whichever Kube flavor runs on it. But having said that, we’ve lately been testing a fairly new and interesting distribution called k0s (pronounced: “koss”) in our lab and some pilot environments.
Mirantis' k0s was announced in November 2020. Over the years, the company has built a product portfolio with an emphasis on virtualization and containerized computing services.
The 0 in k0s reflects that it's a fairly stripped-down version compared to many other distros. But that's only part of the story. With k0s, Mirantis prides itself on offering a Kubernetes distro with zero friction, zero dependencies, and zero cost.
“We wanted to create something easy to use by removing any unnecessary complexity around Kubernetes, so that our customers can avoid the need for deep technical expertise,” says Mikko Viitanen, Senior Product Manager for k0s at Mirantis.
“It’s only a matter of minutes to get a full Kubernetes cluster up and running,” he says, “after which it is empty, waiting for something to do; waiting for applications.”
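For a sense of what "a matter of minutes" looks like in practice, here is a minimal single-node setup sketched from the k0s quick-start documentation (exact commands and flags may vary between versions, and root access plus network connectivity are assumed):

```shell
# Fetch the k0s binary via the official install script
curl -sSf https://get.k0s.sh | sudo sh

# Install and start k0s as a single-node cluster
# (--single runs a controller that also schedules workloads)
sudo k0s install controller --single
sudo k0s start

# Once the node reports Ready, the cluster is up -- empty, waiting for applications
sudo k0s kubectl get nodes
```

From there, applications are deployed with the bundled kubectl just as on any other Kubernetes cluster.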
“It’s fully open and free to use. Anyone can download it from GitHub. What we sell to our customers is support, professional services, and training. k0s is very easy for customers to try out and when they become more serious about using it, they may purchase a support subscription.”
“It suits us very well”
“Our initial experience with k0s is that it suits us very well,” says RAIN’s Director of Engineering, Mikko Suominen. “We want to spend as little time as possible configuring the Kube layer or managing Kube clusters. Instead, we want to deploy and develop RAIN, our platform on top of Kubernetes, and RAIN’s distributed applications.”
“k0s is light enough to run on our developers’ laptops and then run RAIN on top of that. It appears to have the elasticity to scale both up and down: from large distributed networks to very small infrastructure like a laptop or a tiny Jetson Nano computer.”
“Some of the AI model calculations that we have in RAIN are quite compute-heavy,” he continues. “We want to run those models on GPUs (Graphics Processing Units) rather than CPUs (Central Processing Units). The two types of processors handle data differently because they have different architectures. While CPUs are fast at sequential calculations, GPUs are specifically designed for highly parallel work such as rendering graphics. It turns out that in this context, graphics and AI models have a lot in common. We quite like the fact that k0s is so easy to deploy and that it enables exposing GPUs smoothly for RAIN to run AI models on them.”
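As a rough sketch of how GPU exposure typically works on Kubernetes (k0s included): a device plugin such as NVIDIA's advertises GPUs to the scheduler, and a workload then requests one as an extended resource. The names and image below are hypothetical:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: rain-model-inference            # hypothetical name
spec:
  containers:
    - name: inference
      image: registry.example.com/rain/model:latest   # hypothetical image
      resources:
        limits:
          nvidia.com/gpu: 1   # schedules the pod onto a node with a free GPU
```

This assumes the NVIDIA device plugin has already been deployed to the cluster as a DaemonSet; the container then sees the allocated GPU without further configuration.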
Security and scalability
“Another thing is, when we say that RAIN was designed with security and scalability in mind, the Kubernetes layer underneath RAIN is an important part of that equation. For example, an important security measure is to separate and isolate the control plane from the worker nodes. With some manual configuration this can be achieved in any Kube distro, but in k0s it is a standard feature.”
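In k0s, this separation falls out of how nodes are started: a controller runs only control-plane services by default, with no kubelet and no workloads, and workers join the cluster over a token. A sketch based on the k0s CLI (flags may differ between versions; the token path is illustrative):

```shell
# On the controller machine: control plane only, isolated from workloads
sudo k0s install controller
sudo k0s start

# Create a join token for worker nodes
sudo k0s token create --role=worker > worker.token

# On each worker machine: join the cluster with the token
sudo k0s install worker --token-file /path/to/worker.token
sudo k0s start
```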
Mikko Viitanen: “That is correct. And I’d like to add that k0s is distributed as a single binary file. As there are no dependencies on external components, whenever we have a security patch we can simply update the k0s binary and make it available for k0s users. With the single binary, users can keep the whole Kubernetes stack up to date and their clusters secure.”
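Conceptually, a patch update then amounts to swapping one file. A manual sketch, assuming a service-managed install; the release URL follows k0s's GitHub naming scheme, with the version left as a placeholder:

```shell
# Stop the running k0s service
sudo k0s stop

# Replace the single binary with the patched release
# (<version> and the amd64 architecture below are placeholders)
sudo curl -L -o /usr/local/bin/k0s \
  "https://github.com/k0sproject/k0s/releases/download/<version>/k0s-<version>-amd64"
sudo chmod +x /usr/local/bin/k0s

# Start again; the whole Kubernetes stack is now updated
sudo k0s start
```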
Many of Mirantis’ reference customers are large enterprises that have chosen not to move their business to any of the Big Three public cloud providers. Among them are Volkswagen, Visa, PayPal and several Fortune 500 companies, for example in the finance and telecom sectors. Says Viitanen: “By providing them with cloud-native technology, we give our customers a public cloud-like experience on their own infrastructure.”
Follow the data
At RAIN we see an ever-increasing role for smart data collection, edge AI data reduction, and data availability and control across the edge-to-cloud continuum. Data is our bread and butter.
RAIN is a distributed computing solution that enables manufacturing, utilities, logistics, telecom, retail and hospitality organizations…
- to collect, reduce, fuse, process and control any data across any number of information systems, online or offline, in the cloud and on edge devices;
- to create, test, deploy and manage data applications faster, more affordably, more flexibly and with greater ease of use than anything else on the market today.
Forget data application development ‘projects’. Reduce your time to deployment from months to minutes. Reduce your cost from tens or hundreds of thousands of euros to mere hundreds or even tens of euros per application.
If you’d like to explore how your data could improve your business, feel free to book a call with our CEO Henri Kivioja anytime, with no strings attached.