Install ELK Stack & Setup Spring Boot Logging with Filebeat
Source: Dev.to
Prepare Your System
sudo apt update && sudo apt upgrade -y
sudo apt install apt-transport-https wget curl gnupg -y
Add the Elastic repository
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | \
sudo gpg --dearmor -o /usr/share/keyrings/elasticsearch-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/elasticsearch-keyring.gpg] \
https://artifacts.elastic.co/packages/8.x/apt stable main" | \
sudo tee /etc/apt/sources.list.d/elastic-8.x.list
sudo apt update
Install ELK Stack
Elasticsearch
sudo apt install elasticsearch -y
sudo systemctl enable elasticsearch
sudo systemctl start elasticsearch
Check that it’s running. Elasticsearch 8.x enables TLS and authentication by default, so pass the elastic user (use the password printed during installation, or reset it later as shown below):
curl -k -u elastic https://localhost:9200
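If Elasticsearch is up, it returns a JSON document roughly like this (names and version will differ on your machine):
{
  "name" : "your-host",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "8.x.x"
  },
  "tagline" : "You Know, for Search"
}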
Kibana
sudo apt install kibana -y
sudo systemctl enable kibana
sudo systemctl start kibana
Edit the Kibana config to allow external access:
sudo nano /etc/kibana/kibana.yml
# uncomment (or add) the line:
server.host: "0.0.0.0"
Open the required port and access the UI:
sudo ufw allow 5601
Visit http://YOUR_SERVER_IP:5601 in your browser.
Kibana & Elasticsearch Configuration
- Generate an enrollment token for Kibana:
sudo /usr/share/elasticsearch/bin/elasticsearch-create-enrollment-token -s kibana
- Get the Kibana verification code:
sudo /usr/share/kibana/bin/kibana-verification-code
- Reset the elastic user password:
sudo /usr/share/elasticsearch/bin/elasticsearch-reset-password -u elastic
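When you first open Kibana in the browser, it asks for the enrollment token and then for the verification code. Alternatively, you can enroll from the command line; assuming the default .deb layout, something like this should work:
sudo /usr/share/kibana/bin/kibana-setup --enrollment-token <paste-token-here>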
Logstash
sudo apt install logstash -y
sudo systemctl enable logstash
sudo systemctl start logstash
How to Send Spring Boot Logs to Logstash
Workflow
- Spring Boot writes logs to a file (Logback).
- Filebeat reads the log file.
- Filebeat forwards logs to Logstash (or directly to Elasticsearch).
- Logstash (optional) parses the logs.
- Elasticsearch stores the logs.
- Kibana visualises the logs.
Configure Spring Boot Logback
Create logback-spring.xml in src/main/resources/. It defines two appenders: a console appender with the pattern %d{HH:mm:ss} %-5level [%thread] %logger{36} - %msg%n, and a rolling file appender that writes to ${LOG_DIR}/${APP_NAME}-app.log, rolls the file daily to ${LOG_DIR}/${APP_NAME}.%d{yyyy-MM-dd}.gz, keeps 30 days of history, and uses the pattern %d{yyyy-MM-dd HH:mm:ss} %-5level [%thread] %logger{36} - %msg%n.
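A minimal logback-spring.xml matching those settings might look like the following; the LOG_DIR and APP_NAME property values are assumptions based on the paths used later in this guide, so adjust them to your project:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <!-- Assumed values; change to match your application -->
    <property name="APP_NAME" value="taxes-backend"/>
    <property name="LOG_DIR" value="/var/log/taxes-backend"/>

    <!-- Console output for local development -->
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss} %-5level [%thread] %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <!-- Rolling file output that Filebeat will read -->
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${LOG_DIR}/${APP_NAME}-app.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <!-- the .gz suffix makes Logback compress rolled files -->
            <fileNamePattern>${LOG_DIR}/${APP_NAME}.%d{yyyy-MM-dd}.gz</fileNamePattern>
            <maxHistory>30</maxHistory>
        </rollingPolicy>
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss} %-5level [%thread] %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="FILE"/>
    </root>
</configuration>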
Create the log directory on both the development machine and the VM where the JAR runs:
sudo mkdir -p /var/log/taxes-backend
sudo chmod 777 /var/log/taxes-backend # adjust permissions as needed
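chmod 777 gets things working but leaves the directory world-writable. A tighter alternative, assuming the JAR runs as a dedicated service account (here a hypothetical user called springboot) and Filebeat runs as root (the default for the .deb package), would be:
sudo chown springboot:springboot /var/log/taxes-backend
sudo chmod 755 /var/log/taxes-backend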
Install and Configure Filebeat
sudo apt install filebeat -y
Edit the Filebeat configuration:
sudo nano /etc/filebeat/filebeat.yml
Add (or modify) the input section to point to your Spring Boot log file:
# ============================== Filebeat inputs ===============================
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/taxes-backend/taxes-backend-app.log
(Continue editing the rest of filebeat.yml as required, e.g., output to Logstash.)
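Note that on Filebeat 8.x the log input type is deprecated in favour of filestream; an equivalent filestream input looks roughly like this (the id value is an arbitrary unique name of my choosing, which filestream requires):
filebeat.inputs:
  - type: filestream
    id: taxes-backend-logs
    enabled: true
    paths:
      - /var/log/taxes-backend/taxes-backend-app.log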
Next Steps
- Configure Logstash pipelines to parse the incoming logs (e.g., using Grok).
- Set up the Filebeat output to point to Logstash (the output.logstash: section).
- Create Kibana dashboards to visualise your Spring Boot logs.
You now have a functional ELK stack with Spring Boot logs being shipped via Filebeat → Logstash → Elasticsearch → Kibana. Happy logging!
Filebeat & Logstash Configuration Guide
1. Filebeat Configuration (filebeat.yml)
filebeat.config.modules:
  # Set to true to enable config reloading
  reload.enabled: false

# ======================= Elasticsearch template setting =======================
setup.template.settings:
  index.number_of_shards: 1

# =================================== Kibana ====================================
setup.kibana:

# ------------------------------ Logstash Output --------------------------------
output.logstash:
  # The host and port where Logstash listens for Beats connections
  hosts: ["localhost:5044"]

# ================================= Processors ==================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
The localhost:5044 entry tells Filebeat where to send events (in this setup, Logstash's Beats input). Filebeat allows only one output to be enabled at a time, so comment out the default output.elasticsearch section when you use output.logstash.
Start Filebeat
sudo systemctl enable filebeat
sudo systemctl restart filebeat
sudo systemctl status filebeat
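Optionally, sanity-check the configuration file and the connection to Logstash with Filebeat's built-in test commands:
sudo filebeat test config
sudo filebeat test output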
2. Configure Logstash Pipeline
Create a Logstash pipeline configuration file under /etc/logstash/conf.d/, e.g., springboot.conf:
sudo nano /etc/logstash/conf.d/springboot.conf
Add the following pipeline to the file:
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    # \s+ tolerates the extra space that the %-5level padding adds after INFO/WARN
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:level}\s+\[%{DATA:thread}\] %{DATA:logger} - %{GREEDYDATA:msg}" }
  }
  date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss"]
    timezone => "UTC"
  }
  mutate {
    remove_field => ["timestamp"]
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    user => "elastic"
    password => "ENTER_YOUR_ELASTIC_PASSWORD"
    ssl_verification_mode => "none"
    index => "springboot-%{+YYYY.MM.dd}"
  }
}
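To make the grok pattern concrete, a made-up line written by the file appender configured earlier:
2025-01-15 10:42:07 INFO  [main] c.e.taxes.TaxesBackendApplication - Started TaxesBackendApplication
would be split into level=INFO, thread=main, logger=c.e.taxes.TaxesBackendApplication and msg=Started TaxesBackendApplication, while the parsed timestamp sets @timestamp and is then dropped by the mutate filter.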
Test the Logstash configuration syntax
sudo /usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/springboot.conf
Start Logstash
sudo systemctl restart logstash
Verify the index creation
curl -k -u elastic:ENTER_YOUR_ELASTIC_PASSWORD "https://localhost:9200/_cat/indices?v"
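If the index is there, you can also pull one sample document straight from Elasticsearch to confirm that the grok fields arrived as expected:
curl -k -u elastic:ENTER_YOUR_ELASTIC_PASSWORD "https://localhost:9200/springboot-*/_search?size=1&pretty"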
3. View Logs in Kibana
- Open Kibana.
- Navigate to Stack Management → Data Views (called Index Patterns in older Kibana versions).
- Create a data view that matches the index written by Logstash (e.g., springboot-*).
- Go to Discover to view the ingested logs (see the example filter after this list).
- Build dashboards to visualize log levels, services, errors, etc.
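For example, once documents are indexed you can filter on the fields produced by the grok filter directly in Discover using KQL:
level : "ERROR"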
Replace ENTER_YOUR_ELASTIC_PASSWORD with the password you set for the elastic user.