Google Cloud launched the Anthos platform in April 2019, promising customers a way to run Kubernetes workloads on-premises, in the Google Cloud, and, crucially, in other major public clouds including Amazon Web Services (AWS) and Microsoft Azure.
That crucial last part has taken Google Cloud some time to achieve. The company finally announced Anthos support for AWS in April 2020, while Azure support remains in preview with a select batch of customers for now.
Speaking at Google Cloud Next in San Francisco in 2019, Google CEO Sundar Pichai said the idea behind Anthos is to allow developers to “write once and run anywhere”—a promise to simplify the development, deployment, and operation of applications across hybrid and multiple public clouds by bridging incompatible cloud architectures.
The previously released Google Kubernetes Engine (GKE) and GKE On-Prem allowed for hybrid Kubernetes deployments, yet customers continued to demand a platform that made it simple to span multiple, rival cloud providers as well.
By providing a single platform for the management of all Kubernetes workloads, Google Cloud Anthos allows customers to focus their skills on a single technology, rather than relying on certified experts in a multitude of proprietary cloud technologies.
Similarly, Anthos provides operational consistency across hybrid and public clouds, with the ability to apply common configurations across infrastructures, as well as custom security policies linked to certain workloads and namespaces, regardless of where those workloads are running.
Google Cloud Anthos components
Anthos is the natural evolution of the Cloud Services Platform that Google was building before 2019. It combines the managed Google Kubernetes Engine (GKE) service, GKE On-Prem, and the Anthos Config Management console for unified administration, policies, and security across hybrid and multicloud Kubernetes deployments.
Add Stackdriver for observability, GCP Cloud Interconnect for high-speed connectivity, the Anthos Service Mesh (based on Google’s open source Istio project), and the Cloud Run serverless deployment service (based on the open source Knative) into the mix, and Google Cloud is looking to provide a seamless, one-stop shop for managing Kubernetes workloads regardless of where they reside.
Being based on GKE, Anthos takes care of any Kubernetes updates and security patches automatically as they are released.
GKE On-Prem installation currently requires VMware vSphere, though Google Cloud has announced it will enable GKE On-Prem to run without a third-party hypervisor later this year. At launch, partners VMware, Dell EMC, HPE, Intel, and Lenovo committed to deliver Anthos on hyperconverged infrastructures.
Google Cloud Anthos competitors
Fear of vendor lock-in is very real for enterprise customers. Providing a flexible and open route to the cloud is something of a holy grail for cloud vendors today. But some vendors want to have their cake and eat it too, trapping customers within their own ecosystems once those customers do decide to move workloads to the cloud.
Amazon Web Services finally relented on the hybrid cloud front when it announced AWS Outposts to help customers bridge on-prem and cloud workloads. An extension of the AWS cloud to on-premises data centers, AWS Outposts combines AWS-configured hardware and AWS-managed services and APIs.
Oracle Cloud at Customer and Microsoft Azure Stack are similar hybrid cloud offerings from other major players, while the Red Hat OpenShift and VMware Tanzu platform-as-a-service offerings, both underpinned by Kubernetes, allow containerized enterprise workloads to run in hybrid and public clouds.
In its bid to topple these larger rivals, Google Cloud is making a big bet on Kubernetes being the future of enterprise infrastructure. Of course, Google’s rivals are also pushing aggressively into the managed Kubernetes world, but as the petri dish where Kubernetes was grown, Google has a strong claim to being the best way to run that technology.
Migrate for Anthos
To help customers get started, Google launched Migrate for Anthos off the back of the 2018 acquisition of Velostrata, an Israeli company specializing in cloud migration by cleverly decoupling storage and compute, allowing companies to leave storage on-premises and run compute in the cloud. Migrate for Anthos allows workloads to be converted into containers for Kubernetes directly from physical servers and virtual machines.
How does it work? Migrate for Anthos parses the file system of a server or virtual machine and converts that to a Kubernetes persistent volume. The application containers, service containers, networking, and persistent volumes wind up in a Kubernetes pod, which is a group of containers that are deployed together on the same host.
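As a sketch, the end state of such a conversion might look like the following Kubernetes manifests, where the converted file system backs a persistent volume claim mounted by the application container. All names, the storage size, the mount path, and the container image here are hypothetical illustrations, not output of the actual Migrate for Anthos tooling:

```yaml
# Hypothetical result of a VM-to-container conversion: the VM's
# file system becomes a persistent volume used by the app container.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: migrated-vm-disk        # hypothetical name
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 50Gi             # hypothetical size
---
apiVersion: v1
kind: Pod
metadata:
  name: migrated-app            # hypothetical name
spec:
  containers:
  - name: app
    image: gcr.io/my-project/migrated-app:v1   # hypothetical image
    volumeMounts:
    - name: vm-disk
      mountPath: /migrated      # the converted VM file system
  volumes:
  - name: vm-disk
    persistentVolumeClaim:
      claimName: migrated-vm-disk
```

In practice the tooling groups the application and service containers, networking, and volumes into the pod for you; the point of the sketch is simply that the VM’s disk survives as a persistent volume rather than being baked into the container image.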
For Google Cloud Platform customers, getting started with Anthos is as simple as creating a new GKE cluster, with the Istio service mesh enabled, in the console.
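The same result can be sketched from the command line. The cluster name and zone below are hypothetical, and the Istio add-on flags were in beta at the time of writing, so they may differ in current releases:

```shell
# Create a GKE cluster with the Istio add-on enabled (beta-era flags).
gcloud beta container clusters create anthos-demo \
    --zone=us-central1-a \
    --addons=Istio \
    --istio-config=auth=MTLS_PERMISSIVE
```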
For on-prem customers, the first step to running Anthos involves setting up a GKE On-Prem cluster and migrating over an existing application. Once this cluster is registered with GCP, you can install Istio to achieve workload visibility across all of your clusters. Then, by enabling Anthos Config Management across your GKE clusters, all Kubernetes and Istio policies can be managed in one place.
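A rough sketch of those registration and configuration steps follows. The cluster name, kubectl context, key file, and Git repository are all hypothetical, and the ConfigManagement fields follow the 2020-era API, which may have changed since:

```shell
# Register the on-prem cluster with GCP via GKE Hub
# (names and key path are hypothetical).
gcloud container hub memberships register onprem-cluster \
    --context=onprem-context \
    --service-account-key-file=./hub-sa-key.json

# Point Anthos Config Management at a shared Git policy repo
# by applying a ConfigManagement resource to the cluster.
kubectl apply -f - <<EOF
apiVersion: configmanagement.gke.io/v1
kind: ConfigManagement
metadata:
  name: config-management
spec:
  clusterName: onprem-cluster
  git:
    syncRepo: git@github.com:example/anthos-policy.git
    syncBranch: master
    secretType: ssh
EOF
```

With each cluster registered and synced to the same policy repository, the Kubernetes and Istio policies described above are managed from one place.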
What’s next for Google Cloud Anthos?
Anthos Config Management was given a handy boost of its own in April 2020, when Google Cloud announced that Anthos users could now run the same configuration management for virtual machines on Google Cloud as they used for containers.
Google Cloud is also working on bringing support for applications running on VMs into the Anthos Service Mesh, which would allow for consistent security and policy management across workloads in Google Cloud, on-premises, and in other clouds.
Google Cloud Anthos pricing
Anthos is sold through Google Cloud’s enterprise sales team as a monthly term-based subscription with a minimum one-year commitment. It’s then priced on incremental blocks of 100 vCPUs, starting at $10,000 per block, regardless of where that workload runs.
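Treating that as a simple block-based model makes the arithmetic concrete. This is an illustration of the published pricing structure, not Google Cloud’s actual billing logic, and it assumes the quoted $10,000 starting price applies to every block:

```python
import math

def anthos_monthly_cost(vcpus: int, price_per_block: int = 10_000,
                        block_size: int = 100) -> int:
    """Illustrative cost model: usage is rounded up to whole
    blocks of 100 vCPUs, each billed at the block price."""
    blocks = max(1, math.ceil(vcpus / block_size))
    return blocks * price_per_block

# 250 vCPUs spans three 100-vCPU blocks.
print(anthos_monthly_cost(250))  # 30000
```

Under this model a workload using 101 vCPUs pays for two full blocks, the same as one using 200.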
Copyright © 2020 IDG Communications, Inc.