Haswanth Kondamadugula
End-to-End DevOps Project: Building, Deploying, and Monitoring a Full-Stack Application

Table of Contents
Introduction
Project Overview
Prerequisites
Step 1: Infrastructure Setup on AWS
1.1 Setting Up the VPC and Networking
1.2 Provisioning EC2 Instances
1.3 Setting Up an RDS Database
Step 2: Installing and Configuring Jenkins
2.1 Jenkins Installation
2.2 Configuring Jenkins for GitHub Integration
2.3 Setting Up Jenkins Pipelines
Step 3: Containerizing the Application with Docker
3.1 Writing a Dockerfile
3.2 Building and Pushing Docker Images
3.3 Docker Compose for Local Development
Step 4: Deploying to Kubernetes (Amazon EKS)
4.1 Setting Up the EKS Cluster
4.2 Creating Kubernetes Manifests
4.3 Deploying the Application on EKS
Step 5: Implementing Continuous Monitoring with Prometheus and Grafana
5.1 Installing Prometheus
5.2 Configuring Grafana Dashboards
5.3 Setting Up Alerts
Step 6: Securing the CI/CD Pipeline
6.1 Scanning for Vulnerabilities with Trivy
6.2 Integrating SonarQube for Code Quality
Step 7: Automating Infrastructure with Terraform
7.1 Writing Terraform Scripts
7.2 Managing Infrastructure as Code
7.3 Terraform State Management
Step 8: Implementing Blue-Green Deployments
8.1 Setting Up Blue-Green Deployments
8.2 Automating Traffic Shifts
8.3 Rollback Strategies
Conclusion
Further Reading and Resources
Introduction
DevOps is about automating processes, improving collaboration between development and operations teams, and deploying software more quickly and reliably. This project guides you through the creation of a comprehensive CI/CD pipeline using industry-standard tools. You will deploy a full-stack application on AWS using Jenkins, Docker, Kubernetes (Amazon EKS), Prometheus, Grafana, Trivy, SonarQube, and Terraform. This hands-on experience will help you master key DevOps concepts and tools.

Project Diagram
+------------------------+
| Developer Workstation  |
|                        |
| - Code Repository      |
| - Local Build & Test   |
+-----------+------------+
            |
            v
+------------------------+
|        Jenkins         |
|                        |
| - CI/CD Pipeline       |
| - Build & Test         |
| - Docker Build         |
| - Push Docker Image    |
+-----------+------------+
            |
            v
+------------------------+      +----------------------+
|       Docker Hub       |      |       AWS EKS        |
|                        |      |                      |
| - Docker Image         |      | - Kubernetes Cluster |
+-----------+------------+      +-----------+----------+
            |                               |
            v                               v
+------------------------+      +----------------------+
|  Kubernetes Deployment |      | Prometheus & Grafana |
|                        |      |                      |
| - Deployment           |      | - Monitoring         |
| - Service              |      | - Dashboards         |
+-----------+------------+      +----------------------+
            |
            v
+------------------------+
|       Amazon RDS       |
|                        |
| - MySQL Database       |
+------------------------+
Project Overview
Objectives
Infrastructure Setup: Provision AWS resources including VPC, EC2 instances, and RDS databases.
CI/CD Pipeline: Automate the build, test, and deployment processes with Jenkins.
Containerization: Containerize the application using Docker.
Kubernetes Deployment: Deploy the application on Amazon EKS.
Monitoring: Implement continuous monitoring using Prometheus and Grafana.
Security: Secure the pipeline with Trivy and SonarQube.
Infrastructure as Code: Automate infrastructure management with Terraform.
Blue-Green Deployment: Implement blue-green deployment strategies.
Tools and Technologies
AWS: EC2, VPC, RDS, EKS.
Jenkins: CI/CD automation.
Docker: Containerization.
Kubernetes: Container orchestration.
Prometheus & Grafana: Monitoring and visualization.
Trivy & SonarQube: Security and code quality checks.
Terraform: Infrastructure as code.
Prerequisites
AWS Account: Required for cloud resource provisioning.
Basic Linux Knowledge: For managing EC2 instances.
Docker and Kubernetes Knowledge: For containerization and orchestration.
Familiarity with CI/CD: Understanding basic CI/CD concepts.
GitHub Account: For version control and Jenkins integration.
Step 1: Infrastructure Setup on AWS
1.1 Setting Up the VPC and Networking
Create a VPC:
aws ec2 create-vpc --cidr-block 10.0.0.0/16
Configure subnets:
aws ec2 create-subnet --vpc-id <vpc-id> --cidr-block 10.0.1.0/24 --availability-zone us-east-1a
Set up an Internet Gateway:
aws ec2 create-internet-gateway
aws ec2 attach-internet-gateway --vpc-id <vpc-id> --internet-gateway-id <igw-id>
Create route tables and associate with subnets:
aws ec2 create-route-table --vpc-id <vpc-id>
aws ec2 create-route --route-table-id <route-table-id> --destination-cidr-block 0.0.0.0/0 --gateway-id <igw-id>
aws ec2 associate-route-table --subnet-id <subnet-id> --route-table-id <route-table-id>
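The /16 VPC block and the /24 subnet above are related by simple CIDR arithmetic. A quick sanity check with Python's standard `ipaddress` module shows how the address space divides up:

```python
import ipaddress

# The VPC's 10.0.0.0/16 block splits cleanly into 256 /24 subnets;
# 10.0.1.0/24 (created above) is the second of them.
vpc = ipaddress.ip_network("10.0.0.0/16")
subnets = list(vpc.subnets(new_prefix=24))

print(len(subnets))              # 256 possible /24 subnets
print(subnets[1])                # 10.0.1.0/24
print(subnets[1].num_addresses)  # 256 addresses (AWS reserves 5 per subnet)
```

This makes it easy to plan one subnet per availability zone without overlapping ranges.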
Set Up Security Groups:

Create a security group:
aws ec2 create-security-group --group-name MySecurityGroup --description "Security group for my app" --vpc-id <vpc-id>
Allow SSH, HTTP, and HTTPS:
aws ec2 authorize-security-group-ingress --group-id <sg-id> --protocol tcp --port 22 --cidr 0.0.0.0/0
aws ec2 authorize-security-group-ingress --group-id <sg-id> --protocol tcp --port 80 --cidr 0.0.0.0/0
aws ec2 authorize-security-group-ingress --group-id <sg-id> --protocol tcp --port 443 --cidr 0.0.0.0/0
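Each command above returns an ID that the next one consumes. As a sketch (resource names are illustrative, and configured AWS credentials are assumed), the whole sequence can be chained with `--query` so no ID has to be copied by hand:

```shell
# Capture each resource ID as it is created
VPC_ID=$(aws ec2 create-vpc --cidr-block 10.0.0.0/16 \
  --query 'Vpc.VpcId' --output text)
SUBNET_ID=$(aws ec2 create-subnet --vpc-id "$VPC_ID" --cidr-block 10.0.1.0/24 \
  --availability-zone us-east-1a --query 'Subnet.SubnetId' --output text)
IGW_ID=$(aws ec2 create-internet-gateway \
  --query 'InternetGateway.InternetGatewayId' --output text)
aws ec2 attach-internet-gateway --vpc-id "$VPC_ID" --internet-gateway-id "$IGW_ID"
RT_ID=$(aws ec2 create-route-table --vpc-id "$VPC_ID" \
  --query 'RouteTable.RouteTableId' --output text)
aws ec2 create-route --route-table-id "$RT_ID" \
  --destination-cidr-block 0.0.0.0/0 --gateway-id "$IGW_ID"
aws ec2 associate-route-table --subnet-id "$SUBNET_ID" --route-table-id "$RT_ID"
SG_ID=$(aws ec2 create-security-group --group-name MySecurityGroup \
  --description "Security group for my app" --vpc-id "$VPC_ID" \
  --query 'GroupId' --output text)
# Open SSH, HTTP, and HTTPS in one loop
for port in 22 80 443; do
  aws ec2 authorize-security-group-ingress --group-id "$SG_ID" \
    --protocol tcp --port "$port" --cidr 0.0.0.0/0
done
```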
1.2 Provisioning EC2 Instances
Launch EC2 Instances:

Use the AWS Management Console or CLI:
aws ec2 run-instances --image-id ami-0abcdef1234567890 --count 1 --instance-type t2.micro --key-name MyKeyPair --security-group-ids <sg-id> --subnet-id <subnet-id>
Install Docker and Jenkins on the EC2 instance:
sudo yum update -y
sudo yum install docker -y
sudo service docker start
sudo usermod -a -G docker ec2-user

# Jenkins (recent releases require Java 11 or newer)
sudo yum install java-11-openjdk -y
sudo wget -O /etc/yum.repos.d/jenkins.repo https://pkg.jenkins.io/redhat-stable/jenkins.repo
sudo rpm --import https://pkg.jenkins.io/redhat-stable/jenkins.io.key
sudo yum install jenkins -y
sudo systemctl start jenkins
sudo systemctl enable jenkins
1.3 Setting Up an RDS Database
Provision an RDS Instance:

Create a MySQL instance:
aws rds create-db-instance --db-instance-identifier mydbinstance --db-instance-class db.t2.micro --engine mysql --master-username admin --master-user-password password --allocated-storage 20 --vpc-security-group-ids <sg-id>
Connect the Application:

Update the application configuration with the RDS endpoint:
jdbc:mysql://<rds-endpoint>:3306/mydatabase
Verify connectivity with the MySQL client:
mysql -h <rds-endpoint> -u admin -p
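Rather than copying the endpoint from the console, it can be looked up with the CLI (a sketch; assumes the instance identifier used above):

```shell
# Fetch the RDS endpoint once the instance is in the "available" state
RDS_ENDPOINT=$(aws rds describe-db-instances \
  --db-instance-identifier mydbinstance \
  --query 'DBInstances[0].Endpoint.Address' --output text)
echo "jdbc:mysql://${RDS_ENDPOINT}:3306/mydatabase"
```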

Step 2: Installing and Configuring Jenkins
2.1 Jenkins Installation
Install Jenkins:

Already covered under EC2 provisioning. Access Jenkins at http://<ec2-public-ip>:8080.
Unlock Jenkins:

Retrieve the initial admin password:
sudo cat /var/lib/jenkins/secrets/initialAdminPassword
Complete the setup wizard.
2.2 Configuring Jenkins for GitHub Integration
Install GitHub Plugin:

Navigate to Manage Jenkins -> Manage Plugins.
Search for "GitHub" and install it.
Generate a GitHub Token:

Generate a personal access token from GitHub and add it to Jenkins:
Manage Jenkins -> Manage Credentials -> Add Credentials.
Create a New Job:

Set up a new pipeline job and link it to your GitHub repository.
2.3 Setting Up Jenkins Pipelines
Define a Jenkinsfile:

Create a Jenkinsfile in your repository:
pipeline {
  agent any
  stages {
    stage('Build') {
      steps {
        sh 'mvn clean install'
      }
    }
    stage('Test') {
      steps {
        sh 'mvn test'
      }
    }
    stage('Deploy') {
      steps {
        sh 'docker build -t myapp .'
        sh 'docker push myrepo/myapp'
      }
    }
  }
}
Trigger the Pipeline:

Commit and push the Jenkinsfile to your repository.
Jenkins will automatically trigger the build.
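If a GitHub webhook cannot reach Jenkins (for example, Jenkins sits behind a firewall), SCM polling is a simple fallback. A sketch of the `triggers` block added to the Jenkinsfile above:

```groovy
pipeline {
  agent any
  // Poll the repository roughly every five minutes as a webhook fallback
  triggers {
    pollSCM('H/5 * * * *')
  }
  stages {
    stage('Build') {
      steps {
        sh 'mvn clean install'
      }
    }
  }
}
```

With a working webhook, the `triggers` block can be dropped and builds start on push instead.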
Step 3: Containerizing the Application with Docker
3.1 Writing a Dockerfile
Create a Dockerfile:

In your application directory:
FROM openjdk:8-jdk-alpine
VOLUME /tmp
ARG JAR_FILE=target/*.jar
COPY ${JAR_FILE} app.jar
ENTRYPOINT ["java","-jar","/app.jar"]
Build the Docker Image:

Run the following commands:
docker build -t myapp:latest .
3.2 Building and Pushing Docker Images
Tag and Push Image:

Tag the image with the appropriate version:
docker tag myapp:latest myrepo/myapp:v1.0.0
Push the image to Docker Hub:
docker push myrepo/myapp:v1.0.0
3.3 Docker Compose for Local Development
Create a docker-compose.yml File:

Define your multi-container application:
version: '3'
services:
  app:
    image: myrepo/myapp:v1.0.0
    ports:
      - "8080:8080"
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: password
      MYSQL_DATABASE: mydatabase
    ports:
      - "3306:3306"
Run Docker Compose:

Start the application locally:
docker-compose up
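The app container may start before MySQL is ready to accept connections. One common remedy, sketched below (supported by the Compose specification and docker-compose's 2.1+ file format), is a healthcheck on the database plus a conditional `depends_on`:

```yaml
services:
  app:
    image: myrepo/myapp:v1.0.0
    depends_on:
      db:
        condition: service_healthy
  db:
    image: mysql:5.7
    healthcheck:
      test: ["CMD", "mysqladmin", "ping", "-h", "localhost"]
      interval: 10s
      retries: 5
```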
Step 4: Deploying to Kubernetes (Amazon EKS)
4.1 Setting Up the EKS Cluster
Install kubectl and eksctl:

Install kubectl:
curl -LO "https://dl.k8s.io/release/$(curl -L -s https://dl.k8s.io/release/stable.txt)/bin/linux/amd64/kubectl"
chmod +x kubectl
sudo mv kubectl /usr/local/bin/
Install eksctl:
curl --silent --location "https://github.com/weaveworks/eksctl/releases/download/v0.110.0/eksctl_Linux_amd64.tar.gz" | tar xz -C /tmp
sudo mv /tmp/eksctl /usr/local/bin
Create an EKS Cluster:
eksctl create cluster --name my-cluster --version 1.21 --region us-east-1 --nodegroup-name my-nodes --node-type t3.medium --nodes 3
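Cluster creation can take 15–20 minutes. Once it finishes, point kubectl at the new cluster and confirm the nodes joined:

```shell
# eksctl writes kubeconfig automatically; this refreshes it explicitly
aws eks update-kubeconfig --name my-cluster --region us-east-1
kubectl get nodes   # expect 3 nodes reporting Ready
```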
4.2 Creating Kubernetes Manifests
Write Deployment Manifests:

Create a deployment.yaml:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp-deployment
spec:
  replicas: 3
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
      - name: myapp
        image: myrepo/myapp:v1.0.0
        ports:
        - containerPort: 8080
4.3 Deploying the Application on EKS
Apply the Manifests:

Deploy the application to EKS:
kubectl apply -f deployment.yaml
Monitor the deployment:
kubectl get pods
Expose the Application:

Create a service to expose the application:
apiVersion: v1
kind: Service
metadata:
  name: myapp-service
spec:
  type: LoadBalancer
  selector:
    app: myapp
  ports:
    - protocol: TCP
      port: 80
      targetPort: 8080
Apply the service:
kubectl apply -f service.yaml
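After applying both manifests, wait for the rollout to complete and grab the load balancer's DNS name (the jsonpath expression is a sketch; AWS needs a few minutes to provision the ELB):

```shell
# Block until all 3 replicas are available
kubectl rollout status deployment/myapp-deployment
# Print the ELB hostname once AWS assigns it
kubectl get service myapp-service \
  -o jsonpath='{.status.loadBalancer.ingress[0].hostname}'
```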
Step 5: Implementing Continuous Monitoring with Prometheus and Grafana
5.1 Installing Prometheus
Deploy Prometheus:

Use Helm to install Prometheus:
helm repo add prometheus-community https://prometheus-community.github.io/helm-charts
helm repo update
helm install prometheus prometheus-community/prometheus
Configure Prometheus:

Edit the values.yaml file to scrape your application metrics:
scrape_configs:
  - job_name: 'myapp'
    static_configs:
      - targets: ['myapp-service:8080']
5.2 Configuring Grafana Dashboards
Deploy Grafana:

Install Grafana via Helm (add the Grafana chart repository first):
helm repo add grafana https://grafana.github.io/helm-charts
helm repo update
helm install grafana grafana/grafana
Access Grafana:

Retrieve the admin password:
kubectl get secret --namespace default grafana -o jsonpath="{.data.admin-password}" | base64 --decode ; echo
Forward port to access Grafana:
kubectl port-forward svc/grafana 3000:80
Add Prometheus as a Data Source:
Log in to Grafana and add Prometheus as a data source.
5.3 Setting Up Alerts
Define Alerting Rules:

Create alerting rules in Prometheus for critical metrics:
groups:
  - name: example
    rules:
      - alert: HighMemoryUsage
        expr: node_memory_Active_bytes > 1e+09
        for: 1m
        labels:
          severity: critical
        annotations:
          summary: "Instance {{ $labels.instance }} high memory usage"
Set Up Alertmanager:

Configure Alertmanager for notifications:
receivers:
  - name: 'email'
    email_configs:
      - to: '[email protected]'
Step 6: Securing the CI/CD Pipeline
6.1 Scanning for Vulnerabilities with Trivy
Install Trivy:

Install Trivy on the Jenkins server (the commands below assume a Debian/Ubuntu host; use the RPM instructions from the Trivy documentation on Amazon Linux):
sudo apt-get install wget apt-transport-https gnupg lsb-release
wget -qO - https://aquasecurity.github.io/trivy-repo/deb/public.key | sudo apt-key add -
echo deb https://aquasecurity.github.io/trivy-repo/deb $(lsb_release -sc) main | sudo tee -a /etc/apt/sources.list.d/trivy.list
sudo apt-get update
sudo apt-get install trivy
Integrate Trivy with Jenkins:

Add Trivy to the Jenkins pipeline:
stage('Security Scan') {
  steps {
    sh 'trivy image myrepo/myapp:v1.0.0'
  }
}
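As written, the scan only reports findings. Trivy's `--exit-code` and `--severity` flags can make the stage fail the build when serious vulnerabilities are present (a sketch):

```groovy
stage('Security Scan') {
  steps {
    // A non-zero exit code fails the stage when HIGH/CRITICAL findings exist
    sh 'trivy image --exit-code 1 --severity HIGH,CRITICAL myrepo/myapp:v1.0.0'
  }
}
```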
6.2 Integrating SonarQube for Code Quality
Install SonarQube:

Install SonarQube on the EC2 instance:
sudo yum install java-11-openjdk-devel -y
wget https://binaries.sonarsource.com/Distribution/sonarqube/sonarqube-8.9.6.50800.zip
unzip sonarqube-8.9.6.50800.zip
sudo mv sonarqube-8.9.6.50800 /opt/sonarqube
sudo useradd sonar   # SonarQube refuses to run as root; the chown below expects this user
sudo chown -R sonar: /opt/sonarqube

Configure SonarQube:

Modify the sonar.properties file for database integration:
 sonar.jdbc.username=sonar
 sonar.jdbc.password=sonar
 sonar.jdbc.url=jdbc:postgresql://localhost/sonarqube
Integrate SonarQube with Jenkins:

Add SonarQube analysis in Jenkins:
 stage('SonarQube Analysis') {
   steps {
     withSonarQubeEnv('My SonarQube Server') {
       sh 'mvn sonar:sonar'
     }
   }
 }
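To make the analysis actually gate the pipeline, the SonarQube Scanner plugin provides a `waitForQualityGate` step that blocks until the server reports a verdict. A sketch (assumes a webhook from SonarQube back to Jenkins is configured):

```groovy
stage('Quality Gate') {
  steps {
    // Abort if SonarQube does not report back within 10 minutes
    timeout(time: 10, unit: 'MINUTES') {
      waitForQualityGate abortPipeline: true
    }
  }
}
```

With `abortPipeline: true`, a failed quality gate stops the build before any deploy stage runs.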
Conclusion
This project guide provides an in-depth walkthrough of setting up an end-to-end DevOps pipeline with CI/CD, containerization, Kubernetes deployment, monitoring, and security. By following this guide, you’ll not only gain practical experience but also create a production-ready pipeline. Remember, the key to mastering DevOps is consistent practice and staying updated with the latest tools and methodologies.

Final Thoughts
Feel free to customize the steps and integrate more tools as per your project requirements. DevOps is a vast field, and this guide is just the beginning of your journey towards becoming a proficient DevOps engineer. Happy coding and happy deploying!

👤 Author

Join Our Telegram Community || Follow me on GitHub for more DevOps content!
Comment from H A R S H H A A — Aug 26 '24

Updated Project Integration
Repository: Vitual-Browser

Step-by-Step Integration with Vitual-Browser
1. Clone the Repository
Start by cloning the Vitual-Browser repository to your local environment:

git clone https://github.com/jaiswaladi246/Vitual-Browser.git
cd Vitual-Browser
2. Build and Test Locally
2.1 Build the Application

Since Vitual-Browser appears to be a Node.js application, you need to build it. Ensure you have Node.js and npm installed.

# Install dependencies
npm install

# Build the project (if applicable)
npm run build
2.2 Run Tests

npm test
3. Containerize the Application
3.1 Create a Dockerfile

In the root of your Vitual-Browser project, create a Dockerfile:

# Use official Node.js image as base
FROM node:14

# Create and set working directory
WORKDIR /usr/src/app

# Copy package.json and package-lock.json
COPY package*.json ./

# Install dependencies
RUN npm install

# Copy the rest of the application
COPY . .

# Build the application (if applicable)
RUN npm run build

# Expose port (adjust if necessary)
EXPOSE 3000

# Start the application
CMD ["npm", "start"]
3.2 Build and Push Docker Image

# Build the Docker image
docker build -t myrepo/vitual-browser:latest .

# Push the Docker image to Docker Hub
docker tag myrepo/vitual-browser:latest myrepo/vitual-browser:v1.0.0
docker push myrepo/vitual-browser:v1.0.0
4. Deploy to Kubernetes (Amazon EKS)
4.1 Create Kubernetes Manifests

Deployment Manifest (deployment.yaml):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: vitual-browser-deployment
spec:
  replicas: 3
  selector:
    matchLabels:
      app: vitual-browser
  template:
    metadata:
      labels:
        app: vitual-browser
    spec:
      containers:
      - name: vitual-browser
        image: myrepo/vitual-browser:v1.0.0
        ports:
        - containerPort: 3000
Service Manifest (service.yaml):

apiVersion: v1
kind: Service
metadata:
  name: vitual-browser-service
spec:
  type: LoadBalancer
  selector:
    app: vitual-browser
  ports:
    - protocol: TCP
      port: 80
      targetPort: 3000
4.2 Apply the Manifests

kubectl apply -f deployment.yaml
kubectl apply -f service.yaml
5. CI/CD Integration with Jenkins
5.1 Update Jenkinsfile

In your Jenkinsfile, add stages to build, test, and deploy the application:

pipeline {
  agent any
  stages {
    stage('Checkout') {
      steps {
        git 'https://github.com/jaiswaladi246/Vitual-Browser.git'
      }
    }
    stage('Build') {
      steps {
        sh 'npm install'
        sh 'npm run build'
      }
    }
    stage('Test') {
      steps {
        sh 'npm test'
      }
    }
    stage('Docker Build and Push') {
      steps {
        sh 'docker build -t myrepo/vitual-browser:latest .'
        sh 'docker push myrepo/vitual-browser:latest'
      }
    }
    stage('Deploy') {
      steps {
        sh 'kubectl apply -f deployment.yaml'
        sh 'kubectl apply -f service.yaml'
      }
    }
  }
}
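The `docker push` above assumes the Jenkins agent is already logged in to Docker Hub. A sketch using Jenkins stored credentials (the credentials ID `dockerhub` is an assumption):

```groovy
stage('Docker Build and Push') {
  steps {
    withCredentials([usernamePassword(credentialsId: 'dockerhub',
        usernameVariable: 'DOCKER_USER', passwordVariable: 'DOCKER_PASS')]) {
      // --password-stdin keeps the secret out of the process list
      sh 'echo "$DOCKER_PASS" | docker login -u "$DOCKER_USER" --password-stdin'
      sh 'docker build -t myrepo/vitual-browser:latest .'
      sh 'docker push myrepo/vitual-browser:latest'
    }
  }
}
```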
6. Monitoring and Security
Prometheus and Grafana: Ensure Prometheus is configured to scrape metrics from the Vitual-Browser application if it exposes any.
Trivy and SonarQube: Integrate Trivy in the Jenkins pipeline for scanning Docker images and SonarQube for code quality.
Conclusion
This integration ensures a smooth CI/CD pipeline for the Vitual-Browser application. By following these steps, you’ll have a robust, automated setup for building, testing, deploying, and monitoring your application.