Integrating serverless container deployment solutions with K8s boosts developer productivity and shortens time to market. In this article series, we’ll explore Cloud Run and learn how it integrates with Google Kubernetes Engine (GKE).

Kentaro Wakayama
23 October 2022

Due to their flexibility and agility, containers have become the preferred deployment option in the cloud. Though it was initially developed at Google, drawing on the company’s experience running its internal Borg platform, Kubernetes has undoubtedly become the market leader among container orchestration solutions.
Today, all leading cloud service providers deliver multiple flavors of K8s and supporting services on their platforms. Customers have no shortage of options: managed K8s services, serverless containers, hosted container registries, specialized security services for containers, and more. These suites of products have made containerized application deployment in the cloud more robust than ever.
Integrating serverless container deployment solutions with K8s takes this one step further. In the first installment of this two-part article series, we’ll explore Cloud Run, the Knative-based serverless deployment solution for containers. In part two, we’ll look at how it integrates with Google Kubernetes Engine (GKE), the native, managed K8s solution on GCP.
Serverless has gained traction over the past few years because it enables organizations to generate value for their core business applications without the overhead of managing complex underlying infrastructure.
Knative is an open-source platform that helps run enterprise-grade serverless applications on Kubernetes. Knative manages the complexities of running these applications, including network configuration, autoscaling, revision tracking, and an eventing framework for cloud-native services.
Organizations turn to Knative for key benefits such as workload portability across clouds and on-premises environments, request-driven autoscaling (including scale to zero), and a simpler developer experience on top of Kubernetes.
Knative consists of two main components: Serving, which handles the deployment, autoscaling, and revision management of stateless services, and Eventing, which routes events between producers and consumers. (A third component, Build, was spun out into the separate Tekton project.)
Knative has an active open-source community and is supported by more than 100 organizations, including industry leaders such as IBM, Red Hat, SAP, and Pivotal. It has gained popularity thanks to the flexibility and portability it offers for serverless deployments.
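To make this concrete, here is a minimal sketch of what deploying with Knative looks like. It assumes a cluster with Knative Serving installed; the service name `hello` and the image path are placeholders:

```shell
# Apply a minimal Knative Service to a cluster that has Knative Serving
# installed. The name "hello" and the image path are placeholders.
kubectl apply -f - <<EOF
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello
spec:
  template:
    spec:
      containers:
        - image: gcr.io/my-project/hello:latest  # placeholder image
EOF
```

Knative Serving turns this manifest into an immutable revision, wires up routing, and scales the service, down to zero when idle, based on traffic.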
Cloud Run in GCP is a serverless container hosting solution based on Knative. It enables workload portability and a seamless developer experience, whether you’re using GKE in the cloud or running the cluster on-premises using Anthos.
Cloud Run can be used for the high-speed deployment of applications in a serverless environment, with the flexibility to use the framework and language of your choice. The service is well integrated with GCP container ecosystem tools and services such as Cloud Build, Artifact Registry, and Cloud Code, to name a few. Cloud Run also integrates with multiple partner solutions, including GitLab, Sentry, HashiCorp, and Datadog.
Developers can quickly deploy and manage applications on Cloud Run from the command-line interface or from GCP’s UI. And because it’s Knative-based, Cloud Run supports fast autoscaling of container instances based on incoming traffic. Cloud Run also leverages GCP’s built-in regional redundancy, replicating services across multiple zones within a region.
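As a sketch of that CLI workflow (the service name, image path, and region below are placeholder assumptions):

```shell
# Deploy a container image to fully managed Cloud Run.
# "my-service", the image path, and the region are placeholders.
gcloud run deploy my-service \
  --image=gcr.io/my-project/my-service:latest \
  --region=us-central1 \
  --allow-unauthenticated   # make the HTTPS endpoint publicly invocable
```

A single command builds no infrastructure by hand: the platform provisions the serving stack and returns the service URL when the deployment completes.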
All applications deployed in managed Cloud Run receive a stable HTTPS endpoint, autoconfigured with TLS termination for secure access. Customers can invoke Cloud Run services and connect to them over protocols including gRPC, WebSockets, HTTP/1.1, and HTTP/2.
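Invoking a service is then a plain HTTPS call. For services that require authentication, an identity token can be attached; the service URL below is a placeholder:

```shell
# Call a (placeholder) Cloud Run HTTPS endpoint. For private services,
# attach an identity token minted by gcloud for the active account.
curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" \
  https://my-service-abc123-uc.a.run.app/
```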
GCP offers two flavors of the service: fully managed Cloud Run and Cloud Run for Anthos. Both use Knative under the hood, and customers can choose between them based on their deployment preference.
Cloud Run is fully managed and serverless by design. It lets organizations deploy containers in a matter of seconds and access their applications through an out-of-the-box HTTPS URL. The platform handles the underlying infrastructure and automatically scales workloads out and in based on ingress traffic.
The fully managed Cloud Run service runs workloads in sandboxed environments with strict container isolation, ensuring the security of critical workloads. Applications deployed in Cloud Run can reach resources in your VPC (such as VM instances or Memorystore instances) over private IPs through a Serverless VPC Access connector.
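Setting that up involves creating a connector and referencing it at deploy time; the connector name, network, region, and IP range below are placeholders:

```shell
# Create a Serverless VPC Access connector in the default network
# (connector name, region, and CIDR range are placeholders).
gcloud compute networks vpc-access connectors create my-connector \
  --region=us-central1 \
  --network=default \
  --range=10.8.0.0/28

# Route the service's outbound traffic into the VPC via the connector.
gcloud run deploy my-service \
  --image=gcr.io/my-project/my-service:latest \
  --region=us-central1 \
  --vpc-connector=my-connector
```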
Cloud Run supports up to 1,000 container instances per service by default, a limit that can be raised through a quota increase request. Billing is pay-per-use: CPU and memory are charged in increments rounded up to the nearest 100 milliseconds, plus a per-request fee.
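As a rough illustration of that rounding (a sketch, not the official pricing formula), the billable duration for a request can be computed like this:

```shell
# Round a measured duration in milliseconds up to the next 100 ms
# increment, mirroring Cloud Run's billing granularity (illustrative only).
billable_ms() {
  local used_ms=$1
  echo $(( (used_ms + 99) / 100 * 100 ))
}

billable_ms 230   # a 230 ms request is billed as 300 ms
billable_ms 1000  # exact multiples are unchanged: 1000 ms
```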
Anthos, GCP’s hybrid/multi-cloud application management solution, extends the power of GKE to an environment of your choice and provides a unified development, management, and operations experience for clusters.
Cloud Run for Anthos takes this one step further and abstracts away K8s concepts so that developers can build and deploy serverless applications without deep K8s expertise. It also helps organizations leverage their existing investments in K8s. Services deployed using Cloud Run for Anthos can benefit from all of the hardware available on the Anthos cluster’s nodes, including GPUs. The cost is included in the GKE cluster usage charges.
Cloud Run-based microservices can coexist with other microservices in the same Anthos cluster and interact with each other over a private network. The services can be made accessible over the internet or published only to a private VPC network. They also become part of the Istio service mesh, which enables unified service management.
Cloud Run for Anthos provides the additional flexibility of modernizing your application right where it is, which is especially useful for legacy applications deployed on-premises. It can be used to deploy serverless applications on your on-premises Anthos clusters, and those applications can later be moved to the fully managed version of Cloud Run in the cloud, or to any other environment compatible with Knative.
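At the CLI level, that portability shows up as the same deploy command pointed at a different target; historically the `--platform` flag selected between fully managed Cloud Run and a GKE/Anthos cluster. The service, cluster, and image names below are placeholders:

```shell
# Deploy the same container image to Cloud Run for Anthos on a GKE
# cluster instead of the fully managed platform (names are placeholders).
gcloud run deploy my-service \
  --image=gcr.io/my-project/my-service:latest \
  --platform=gke \
  --cluster=my-anthos-cluster \
  --cluster-location=us-central1-c
```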
Cloud Run and Cloud Run for Anthos are GCP’s serverless container deployment options. Powered by Knative, they boost developer productivity and shorten time to market by abstracting away infrastructure management.
While Cloud Run is delivered as a fully managed service, Cloud Run for Anthos helps organizations leverage existing K8s investments for serverless deployments. It provides the flexibility to deploy applications without fear of vendor lock-in and integrates seamlessly with the GCP ecosystem to support your application stack. Cloud Run brings together the power of Kubernetes and serverless to deliver the best of both worlds.
In the next part of this blog series, we’ll show you how to get started with Cloud Run to deploy serverless applications.