
Deploying a Multi-Container Application with Kubernetes on Google Cloud

In today's cloud-centric world, deploying applications in a scalable, resilient, and efficient manner is crucial for success. Kubernetes has emerged as a leading orchestration platform, especially for multi-container applications. In this article, we’ll walk through the process of deploying a multi-container application on Google Cloud using Kubernetes. Whether you are a seasoned developer or just starting, this step-by-step guide will provide you with actionable insights, coding examples, and troubleshooting tips.

What is Kubernetes?

Kubernetes (often abbreviated as K8s) is an open-source container orchestration platform that automates the deployment, scaling, and management of containerized applications. It enables you to manage clusters of hosts running Linux containers, providing a robust framework for managing applications in a microservices architecture.

Key Features of Kubernetes

  • Automated Scaling: Adjusts the number of running pods up or down as demand changes.
  • Load Balancing: Distributes traffic across the pods backing a service.
  • Self-Healing: Automatically restarts or replaces containers that fail or become unresponsive.
  • Declarative Configuration: Lets you describe the desired state of your application, which Kubernetes continuously works to maintain.

Why Google Cloud?

Google Cloud Platform (GCP) provides a fully managed Kubernetes service called Google Kubernetes Engine (GKE). GKE simplifies cluster management, integrates seamlessly with other GCP services, and offers auto-scaling and monitoring capabilities out of the box.

Use Case: Deploying a Multi-Container Application

For this tutorial, we'll deploy a simple multi-container application consisting of a frontend and a backend service. The frontend will be a React application, while the backend will be a Node.js Express server.

Prerequisites

Before we begin, ensure you have the following installed:

  • Google Cloud SDK: To interact with GCP.
  • Docker: For building container images.
  • kubectl: The command-line tool for Kubernetes.
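
As a quick sanity check, the following commands simply confirm that each tool is installed and on your PATH:

# Verify the CLI tools are available
gcloud version
docker --version
kubectl version --client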

Step 1: Set Up Your Google Cloud Project

  1. Create a New Project:
    Go to the Google Cloud Console, and create a new project.

  2. Enable Kubernetes Engine API:
    In the API library, enable the Kubernetes Engine API for your project.

  3. Set Up Billing:
    Ensure your account is linked to a billing method.
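
If you prefer the terminal, the same setup can be sketched with gcloud. Here my-gke-project is a placeholder project ID (project IDs must be globally unique), and billing still needs to be linked in the Cloud Console before the Kubernetes Engine API can be enabled and a cluster created.

# Create the project and make it the active one (my-gke-project is a placeholder)
gcloud projects create my-gke-project
gcloud config set project my-gke-project

# Enable the Kubernetes Engine API
gcloud services enable container.googleapis.com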

Step 2: Create a GKE Cluster

Run the following command in your terminal to create a GKE cluster:

gcloud container clusters create my-cluster --num-nodes=3

This command creates a cluster named my-cluster with three nodes.
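
Both gcloud container commands need a location, so if you have not configured a default compute zone, add a --zone flag (for example, --zone=us-central1-a). Once the cluster is up, fetch credentials so kubectl can talk to it and confirm the nodes are ready:

# Configure kubectl to point at the new cluster
gcloud container clusters get-credentials my-cluster

# All three nodes should report a STATUS of Ready
kubectl get nodes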

Step 3: Build Docker Images

Next, we need to create Docker images for both the frontend and backend applications.

  1. Create a Dockerfile for the Backend:
# Backend Dockerfile (backend/Dockerfile)
FROM node:14

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY package*.json ./
RUN npm install

# Copy in the rest of the application source
COPY . .

EXPOSE 3000
CMD ["node", "server.js"]

  2. Create a Dockerfile for the Frontend:
# Frontend Dockerfile (frontend/Dockerfile)
# Stage 1: build the static React bundle
FROM node:14 AS build

WORKDIR /app

COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build

# Stage 2: serve the built assets with nginx
FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]

Step 4: Build and Push Docker Images

  1. Build the Images:

Run the following commands from the directory that contains the backend/ and frontend/ folders, replacing [YOUR_PROJECT_ID] with your Google Cloud project ID:

# For Backend
docker build -t gcr.io/[YOUR_PROJECT_ID]/backend:latest backend/

# For Frontend
docker build -t gcr.io/[YOUR_PROJECT_ID]/frontend:latest frontend/

  2. Push the Images to Google Container Registry:
# For Backend
docker push gcr.io/[YOUR_PROJECT_ID]/backend:latest

# For Frontend
docker push gcr.io/[YOUR_PROJECT_ID]/frontend:latest
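
If the push is rejected with an authentication error, Docker is probably not yet authorized to push to gcr.io; this one-time command registers gcloud as a Docker credential helper:

# Let Docker authenticate to Google Container Registry with your gcloud credentials
gcloud auth configure-docker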

Step 5: Create Kubernetes Deployment Files

  1. Create a Deployment for the Backend:
# backend-deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: backend
spec:
  replicas: 2
  selector:
    matchLabels:
      app: backend
  template:
    metadata:
      labels:
        app: backend
    spec:
      containers:
      - name: backend
        image: gcr.io/[YOUR_PROJECT_ID]/backend:latest
        ports:
        - containerPort: 3000

  2. Create a Deployment for the Frontend:
# frontend-deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: frontend
spec:
  replicas: 2
  selector:
    matchLabels:
      app: frontend
  template:
    metadata:
      labels:
        app: frontend
    spec:
      containers:
      - name: frontend
        image: gcr.io/[YOUR_PROJECT_ID]/frontend:latest
        ports:
        - containerPort: 80
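
Before touching the cluster, you can optionally validate both manifests client-side (the --dry-run=client flag requires kubectl 1.18 or newer):

# Check the manifests for schema errors without creating anything
kubectl apply --dry-run=client -f backend-deployment.yaml -f frontend-deployment.yaml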

Step 6: Deploy to Kubernetes

Run the following commands to deploy your applications:

kubectl apply -f backend-deployment.yaml
kubectl apply -f frontend-deployment.yaml
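
To confirm the rollouts succeeded, wait on each Deployment and then list the pods; you should see two Running pods per Deployment.

# Wait for both Deployments to finish rolling out
kubectl rollout status deployment/backend
kubectl rollout status deployment/frontend

# Each Deployment should show two Running pods
kubectl get pods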

Step 7: Expose Your Services

  1. Expose the Backend Service:
# backend-service.yaml
apiVersion: v1
kind: Service
metadata:
  name: backend
spec:
  type: ClusterIP  # internal-only; reachable by other pods and services in the cluster
  ports:
  - port: 3000
  selector:
    app: backend

  2. Expose the Frontend Service:
# frontend-service.yaml
apiVersion: v1
kind: Service
metadata:
  name: frontend
spec:
  type: LoadBalancer  # provisions an external Google Cloud load balancer with a public IP
  ports:
  - port: 80
  selector:
    app: frontend

Deploy the services:

kubectl apply -f backend-service.yaml
kubectl apply -f frontend-service.yaml
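
Because the backend Service uses ClusterIP, it is reachable only from inside the cluster, at the DNS name backend on port 3000. As a quick connectivity check you can curl it from a throwaway pod; what it returns depends on the routes your Express server actually defines, so treat this as a reachability test rather than a functional one.

# Curl the backend from inside the cluster via its Service DNS name
kubectl run curl-test --rm -it --restart=Never --image=curlimages/curl --command -- curl -s http://backend:3000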

Step 8: Access Your Application

After deploying, you can check the status of your services:

kubectl get services

The frontend service will provide an external IP address that you can use to access your application.
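
The external IP usually takes a minute or two to be assigned and shows as <pending> until then. You can watch for it and then test the frontend (EXTERNAL_IP below is a placeholder for the address kubectl reports):

# Wait until EXTERNAL-IP changes from <pending> to a real address
kubectl get service frontend --watch

# Fetch the React app's index page (replace EXTERNAL_IP with the reported address)
curl http://EXTERNAL_IP/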

Troubleshooting Tips

  • Check Logs: Use kubectl logs [POD_NAME] to check the logs of your pods for any errors.
  • Describe Resources: Use kubectl describe pod [POD_NAME] to get detailed information about specific pods and their states.
  • Scaling: If you need to scale the application, you can run:
kubectl scale deployment frontend --replicas=3
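
If you would rather let Kubernetes scale for you (the Automated Scaling feature mentioned earlier), a Horizontal Pod Autoscaler is one option. Note that CPU-based autoscaling only takes effect once the containers declare CPU resource requests, which the example Deployments above do not yet do, so treat this as a sketch rather than a drop-in command:

# Scale the frontend between 2 and 5 replicas, targeting 80% average CPU utilization
kubectl autoscale deployment frontend --min=2 --max=5 --cpu-percent=80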

Conclusion

Deploying a multi-container application with Kubernetes on Google Cloud using GKE is a powerful way to manage your applications. By following the steps outlined in this article, you can leverage the benefits of containerization and Kubernetes orchestration to create scalable, resilient applications. Whether you're building a microservices architecture or a standalone application, understanding how to deploy with Kubernetes is an essential skill for modern developers. Happy coding!

About the Author

Syed Rizwan is a Machine Learning Engineer with 5 years of experience in AI, IoT, and Industrial Automation.