Centralized Log Management on Kubernetes with the EFK Stack
In this updated guide, we’ll deploy the EFK Stack (Elasticsearch, Fluentd, Kibana) on any Kubernetes cluster (cloud-based or local) and use a Python Flask application to generate logs. This approach makes the guide flexible, suitable for any Kubernetes environment, such as Minikube, AKS, EKS, or GKE.
What is the EFK Stack?
The EFK Stack comprises:
- Elasticsearch (E): Stores and indexes log data for fast searches.
- Fluentd (F): Aggregates and forwards logs to Elasticsearch.
- Kibana (K): Visualizes logs from Elasticsearch through dashboards and graphs.
This stack provides centralized log management for distributed systems, making it easier to monitor and troubleshoot applications.
Prerequisites
A Kubernetes Cluster:
- You can use any Kubernetes cluster (e.g., Minikube, AKS, EKS, GKE).
- Ensure kubectl is installed and configured to access the cluster.
Tools Installed:
- kubectl: Kubernetes CLI.
- Docker: For containerizing the Python app.
Step 1: Deploy Elasticsearch
Elasticsearch Deployment: Save the following as elasticsearch-deployment.yml:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: elasticsearch
spec:
  replicas: 1
  selector:
    matchLabels:
      app: elasticsearch
  template:
    metadata:
      labels:
        app: elasticsearch
    spec:
      containers:
        - name: elasticsearch
          image: docker.elastic.co/elasticsearch/elasticsearch:8.10.2
          env:
            - name: discovery.type
              value: single-node
            # Elasticsearch 8.x enables TLS and authentication by default;
            # disable security for this demo so Fluentd and Kibana can
            # connect over plain HTTP. Do not do this in production.
            - name: xpack.security.enabled
              value: "false"
          ports:
            - containerPort: 9200
Elasticsearch Service: Save this as elasticsearch-service.yml:

apiVersion: v1
kind: Service
metadata:
  name: elasticsearch
spec:
  ports:
    - port: 9200
      targetPort: 9200
  selector:
    app: elasticsearch
Deploy Elasticsearch:
kubectl apply -f elasticsearch-deployment.yml
kubectl apply -f elasticsearch-service.yml
Step 2: Deploy Kibana
Kibana Deployment: Save this as kibana-deployment.yml:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: kibana
spec:
  replicas: 1
  selector:
    matchLabels:
      app: kibana
  template:
    metadata:
      labels:
        app: kibana
    spec:
      containers:
        - name: kibana
          image: docker.elastic.co/kibana/kibana:8.10.2
          env:
            - name: ELASTICSEARCH_HOSTS
              value: http://elasticsearch:9200
          ports:
            - containerPort: 5601
Kibana Service: Save this as kibana-service.yml:

apiVersion: v1
kind: Service
metadata:
  name: kibana
spec:
  ports:
    - port: 5601
      targetPort: 5601
  selector:
    app: kibana
Deploy Kibana:
kubectl apply -f kibana-deployment.yml
kubectl apply -f kibana-service.yml
Step 3: Deploy Fluentd
Fluentd ConfigMap: Save this as fluentd-configmap.yml:

apiVersion: v1
kind: ConfigMap
metadata:
  name: fluentd-config
data:
  fluent.conf: |
    <source>
      @type tail
      path /var/log/containers/*.log
      pos_file /var/log/fluentd-containers.log.pos
      tag kube.*
      # Docker's json-file driver writes one JSON object per line;
      # on containerd/CRI clusters, use a CRI/regexp parser instead of json.
      format json
    </source>
    <match kube.**>
      @type elasticsearch
      host elasticsearch
      port 9200
      logstash_format true
    </match>
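To make the `format json` and `logstash_format true` settings concrete, here is a small Python sketch (illustrative only, not part of the deployment, with a made-up log line) of what Fluentd does per line: Docker's json-file driver writes one JSON object per log line, Fluentd parses it into a structured record, and `logstash_format` routes records into daily `logstash-YYYY.MM.DD` indices — which is why you will create a `logstash-*` index pattern in Kibana later.

```python
import json
from datetime import datetime

# A raw line from /var/log/containers/*.log, as written by Docker's json-file driver.
raw_line = ('{"log":"INFO:SampleApp:Hello, world log from Python!\\n",'
            '"stream":"stderr","time":"2024-05-01T12:34:56.789Z"}')

# What `format json` does: parse the line into a structured record.
record = json.loads(raw_line)

# What `logstash_format true` does: route the record to a daily index.
ts = datetime.fromisoformat(record["time"].replace("Z", "+00:00"))
index_name = f"logstash-{ts.strftime('%Y.%m.%d')}"

print(index_name)             # logstash-2024.05.01
print(record["log"].strip())  # INFO:SampleApp:Hello, world log from Python!
```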
Fluentd DaemonSet: Save this as fluentd-daemonset.yml:

apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: fluentd
spec:
  selector:
    matchLabels:
      app: fluentd
  template:
    metadata:
      labels:
        app: fluentd
    spec:
      containers:
        - name: fluentd
          # The stock fluent/fluentd image does not bundle the Elasticsearch
          # output plugin; use an image variant that ships with it.
          image: fluent/fluentd-kubernetes-daemonset:v1.16-debian-elasticsearch8-1
          volumeMounts:
            - name: config-volume
              mountPath: /fluentd/etc
            - name: varlog
              mountPath: /var/log
            # On Docker-based nodes, /var/log/containers/*.log are symlinks
            # into this directory, so it must be mounted too.
            - name: dockercontainers
              mountPath: /var/lib/docker/containers
              readOnly: true
      volumes:
        - name: config-volume
          configMap:
            name: fluentd-config
        - name: varlog
          hostPath:
            path: /var/log
        - name: dockercontainers
          hostPath:
            path: /var/lib/docker/containers
Deploy Fluentd:
kubectl apply -f fluentd-configmap.yml
kubectl apply -f fluentd-daemonset.yml
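The `pos_file` setting in the ConfigMap is what lets Fluentd resume where it left off after a restart instead of re-shipping entire log files. A minimal Python sketch of the idea (illustrative only, not Fluentd's actual implementation): the collector persists the byte offset it has read up to, and each pass returns only the lines appended since.

```python
import os
import tempfile

def tail_new_lines(log_path: str, pos_path: str) -> list:
    """Return only lines appended since the last call, tracking a byte offset in pos_path."""
    offset = 0
    if os.path.exists(pos_path):
        with open(pos_path) as f:
            offset = int(f.read() or 0)
    with open(log_path) as f:
        f.seek(offset)          # skip everything already shipped
        lines = f.readlines()
        new_offset = f.tell()
    with open(pos_path, "w") as f:
        f.write(str(new_offset))  # persist progress for the next call
    return [line.rstrip("\n") for line in lines]

# Demo: two appends, two reads -- the second read sees only the new line.
tmp = tempfile.mkdtemp()
log, pos = os.path.join(tmp, "app.log"), os.path.join(tmp, "app.log.pos")
with open(log, "a") as f:
    f.write("first line\n")
print(tail_new_lines(log, pos))   # ['first line']
with open(log, "a") as f:
    f.write("second line\n")
print(tail_new_lines(log, pos))   # ['second line']
```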
Step 4: Deploy a Python Sample App
Create a Python Flask App
Dockerize the App:
- Save this Python app as app.py:
from flask import Flask
import logging

app = Flask(__name__)
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("SampleApp")

@app.route("/")
def hello():
    logger.info("Hello, world log from Python!")
    return "Hello, Kubernetes Logs!", 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
- Create a Dockerfile:
FROM python:3.10
WORKDIR /app
COPY app.py /app
RUN pip install flask
CMD ["python", "app.py"]
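The app above logs plain text, which arrives in Elasticsearch as a single unstructured string. If you instead emit one JSON object per line, fields like level and message become individually searchable in Kibana. A stdlib-only sketch of a JSON formatter you could adapt for app.py (the field names here are an illustrative choice, not a standard):

```python
import json
import logging
import sys

class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON object per line."""
    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

# Route logs to stdout so the container runtime captures them.
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("SampleApp")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("Hello, world log from Python!")
# -> {"level": "INFO", "logger": "SampleApp", "message": "Hello, world log from Python!"}
```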
Build and Push the Image:
docker build -t <your-docker-hub-username>/python-sample-app:latest .
docker push <your-docker-hub-username>/python-sample-app:latest
Deploy the App on Kubernetes
Create Deployment YAML: Save this as python-app-deployment.yml:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: python-sample-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: python-sample-app
  template:
    metadata:
      labels:
        app: python-sample-app
    spec:
      containers:
        - name: python-sample-app
          image: <your-docker-hub-username>/python-sample-app:latest
          ports:
            - containerPort: 5000
Deploy the App:
kubectl apply -f python-app-deployment.yml
Generate some log traffic (the app only writes a log line when its endpoint is hit):
kubectl port-forward deploy/python-sample-app 5000:5000 &
curl http://localhost:5000
Step 5: Visualize Logs in Kibana
Expose Kibana:
kubectl port-forward svc/kibana 5601:5601
Access Kibana: Open http://localhost:5601 in your browser.
Set Up a Data View:
- Go to Stack Management > Data Views (called Index Patterns in pre-8.x Kibana).
- Create a data view matching the indices Fluentd writes (e.g., logstash-*).
Explore Logs: Navigate to Discover and view the logs generated by the Python app.
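Beyond Discover, you can query Elasticsearch directly from Kibana's Dev Tools console. For example, a full-text match on the raw container log field (the exact field names depend on how Fluentd parsed each line; with the `format json` source above, the message text lands in a field named log):

```
GET logstash-*/_search
{
  "query": {
    "match": {
      "log": "Hello"
    }
  }
}
```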
Summary
In this tutorial, you deployed the EFK stack on Kubernetes and used a Python Flask app to generate logs. This setup demonstrates how logs flow from the application, are collected and forwarded by Fluentd, stored and indexed in Elasticsearch, and visualized in Kibana.
Let me know if you’d like to expand on security, scaling, or advanced features! 🚀