What a Managed Container Environment Should Offer
How Managed Kubernetes Relieves the IT Team
Kubernetes is considered complex and difficult to use, and adopting it often increases the IT team's workload rather than reducing it. Managed Kubernetes environments can change that – and genuinely relieve the IT department.
A managed Kubernetes solution should, among other things, make it possible to create an initial container infrastructure in a few simple steps.
Kubernetes has become the de facto standard for container orchestration in the cloud in recent years. The open source project helps developers manage, maintain and deploy a growing number of containers in their cloud environments. The community that uses and develops Kubernetes is growing rapidly and is backed by numerous well-known companies.
Thanks to a dedicated community and the backing of many large organizations, Kubernetes has become a powerful but also complex tool in a relatively short time. Numerous ecosystems have since emerged around it, promising additional functionality and interoperability – often at the expense of clarity and ease of operation.
Setting up a Kubernetes environment on short notice can therefore be difficult. Since Kubernetes is open source technology, anyone interested can experiment with it on their own. Trying it out is made easier by the fact that Kubernetes does not depend on a special operating environment and runs on many common Linux distributions as well as on Windows systems.
Fine-tuning and larger environments, however, make working with the system more complicated. Kubernetes offers numerous functions for container orchestration, but provisioning new servers or providing persistent storage requires a corresponding infrastructure in the background. This is especially true when – as in most cases – the number of containers and deployments grows rapidly and they become ever more deeply nested within each other.
The advantages of the cloud
For those who do not want to run Kubernetes in their own data center, numerous cloud providers now offer solutions to fall back on – with all the advantages the cloud brings: the latest version is always available on a managed, freely scalable, highly available infrastructure.
Developers therefore do not have to deal with updates themselves. Companies usually pay only for the cloud resources actually in use, so costs remain largely transparent and predictable.
Because Kubernetes does not require a dedicated operating system environment, the initial risk of vendor lock-in is low: the entire workload can be migrated relatively easily when changing providers. Kubernetes exposes its configuration through an API that can be addressed directly via HTTPS or through the kubectl command line.
Finding the right provider
The range of potential providers is huge, and choosing the right service is not easy because the offers differ greatly. At one end of the spectrum, customers do everything themselves apart from the installation: if Kubernetes' built-in functions are sufficient for the purpose at hand, a pure hosting offer can be enough.
Without additional services or support, however, hosting incurs costs for Kubernetes that would probably have been lower with an on-premises installation. Pure hosting is therefore an option that makes sense only for a few use cases; managed offers are more attractive.
The range here is also wide: from the ecosystems of the large hyperscalers, to container-as-a-service packages, to managed Kubernetes from cloud-native providers. It is therefore worth looking at the details. What should a Kubernetes platform offer?
Easy setup and intuitive operation of the platform
To avoid having to deal with administrative tasks down to the smallest detail, the chosen solution should make it possible to create an initial container infrastructure with just a few clicks. Even if a click-based, intuitive interface does not seem important at the beginning, it proves very helpful over the course of a project.
Adjustments can then be made more easily and quickly, and it is easier to keep an eye on varying costs. In addition, Kubernetes itself has no particularly strong logging and monitoring features. If these are available through the user interface of the Kubernetes platform, developers save a great deal of time that is better spent elsewhere.
Balanced preconfiguration options
Kubernetes offers countless configuration options. A platform for managing Kubernetes should therefore offer both, where possible: the ability to configure every detail manually, and sensible presets. Frequently selected options such as the load balancer, persistent storage and the scheduler, for example, could come usefully preset. Users can then start right away and adjust the configuration step by step as needed.
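To illustrate what such presets mean in practice, the following is a minimal sketch of two manifests that lean on platform defaults. All names are hypothetical; the point is that a Service of type LoadBalancer and a PersistentVolumeClaim without an explicit storage class only work out of the box if the provider has wired up a load balancer integration and a default StorageClass behind the scenes.

```yaml
# Hypothetical example: minimal manifests that rely on a managed
# platform's presets for load balancing and persistent storage.
apiVersion: v1
kind: Service
metadata:
  name: web-frontend
spec:
  type: LoadBalancer          # the platform preset provisions its own load balancer
  selector:
    app: web-frontend
  ports:
    - port: 80
      targetPort: 8080
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: web-data
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 10Gi           # no storageClassName: the platform's default class applies
```

On a bare self-hosted cluster without these presets, the Service would stay in "pending" state and the claim would remain unbound – which is exactly the gap a managed offering closes.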
Automatic provisioning and deprovisioning
Cloud services offer the great advantage of a flexibly scalable infrastructure in the background, and automating things is the essence of a managed service. Automatic provisioning – as scale-in or scale-out – is therefore a basic function that every Kubernetes platform should fulfill. After all, orchestration software such as Kubernetes is used precisely to scale containers and deploy container-based applications in clusters. When Kubernetes scales, it needs IT infrastructure that can be brought into operation and shut down again on demand. Especially in hybrid environments, it can be attractive to add cloud infrastructure when needed.
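On the workload side, this kind of scaling is typically expressed declaratively. As a sketch (deployment name and thresholds are illustrative), a HorizontalPodAutoscaler tells Kubernetes to add or remove pod replicas based on CPU load – while provisioning the underlying nodes for those pods is exactly the part a managed platform's automatic infrastructure scaling should take care of.

```yaml
# Illustrative HorizontalPodAutoscaler: scales a Deployment between
# 2 and 10 replicas to hold average CPU utilization around 70 %.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web-frontend
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web-frontend      # hypothetical deployment name
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```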
Cluster updates and auto-healing
Kubernetes clusters need regular updates. A Kubernetes platform can support this, for example by allowing updates to be scheduled or by performing them automatically. Auto-healing comes into play whenever cloud instances are not fully functional, for example due to maintenance work or degraded performance (such as high latency). Restoring everything running on these instances manually would take considerable effort. Providers that offer auto-healing detect runtime problems and automatically rebuild the affected clusters and applications.
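Provider-level auto-healing of nodes complements the self-healing Kubernetes already performs at the pod level. As a minimal sketch (image, path and timings are illustrative assumptions), a liveness probe lets Kubernetes restart a container that stops responding – the managed platform then covers the cases this cannot, such as a degraded or lost node underneath:

```yaml
# Illustrative Deployment with a liveness probe: if the HTTP check
# fails repeatedly, Kubernetes restarts the container automatically.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-frontend
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web-frontend
  template:
    metadata:
      labels:
        app: web-frontend
    spec:
      containers:
        - name: web
          image: nginx:1.25        # placeholder image
          livenessProbe:
            httpGet:
              path: /healthz       # hypothetical health endpoint
              port: 80
            initialDelaySeconds: 10
            periodSeconds: 15
```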
Integration of IAM
Kubernetes does not provide native identity and access management (IAM): user accounts are not kept in the cluster itself but have to be maintained in a separate source. It is therefore an advantage if the Kubernetes platform offers corresponding functions, or at least deep integration with an identity tool.
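What Kubernetes does offer natively is role-based access control (RBAC), which such an integration typically hooks into. As a sketch (the namespace and the group name supplied by an external identity provider are assumptions), a RoleBinding can grant a team group from an external IAM system edit rights in one namespace:

```yaml
# Illustrative RoleBinding: grants the built-in "edit" ClusterRole,
# scoped to one namespace, to a group asserted by an external
# identity provider (e.g. via OIDC).
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: dev-team-edit
  namespace: staging            # hypothetical namespace
subjects:
  - kind: Group
    name: "oidc:dev-team"       # group claim from the identity provider (assumption)
    apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: ClusterRole
  name: edit
  apiGroup: rbac.authorization.k8s.io
```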
Faster start-up and easier handling with Managed Kubernetes
Installing Kubernetes – and even more so operating it – requires a great deal of specialist know-how. Unless there are compelling reasons against using cloud services, teams working with Kubernetes should take advantage of them.
Managed Kubernetes solutions simplify the user's work through preconfigured basics such as cluster sizing, storage and load balancers. Having an operator run the Kubernetes platform is also more convenient, and the platform scales largely automatically. Choosing the right platform, however, is not easy: the market is extensive and diverse, and the costs involved vary greatly. Alongside the hyperscalers, specialized local cloud providers often offer better support and more flexible solutions.
* Henrik Hasenkamp is the founder and CEO of gridscale.