Real-Time Log Analysis in Spring Boot with Elasticsearch and Kibana
Monitoring applications in real-time is key to catching issues before they impact users. From troubleshooting errors to detecting anomalies, real-time log analysis keeps your Spring Boot microservices running smoothly and reliably. The Elasticsearch and Kibana duo simplifies this task, offering scalable log ingestion, powerful search capabilities, and insightful visualizations.
This blog explores how to leverage Elasticsearch and Kibana for real-time log analysis in Spring Boot applications. You’ll learn how to configure log pipelines, set up real-time log ingestion with Logstash, write effective Kibana queries, and alert on anomalies.
Table of Contents
- What is Real-Time Log Analysis, and Why Is It Important?
- Configuring Log Pipelines
- Real-Time Log Ingestion Using Logstash
- Writing Kibana Queries for Log Analysis
- Setting Up Alerting on Anomalies
- Summary
What is Real-Time Log Analysis, and Why Is It Important?
At its core, real-time log analysis involves processing application logs as soon as they’re generated, allowing teams to extract actionable insights instantly. For Spring Boot applications designed with microservices in mind, logs serve as the trail of actions, errors, and system behaviors.
Key Benefits of Real-Time Log Analysis:
- Immediate Issue Detection: Uncover errors, downtimes, or anomalies the moment they happen.
- Improved Performance Monitoring: Analyze latency, request performance, and resource utilization in real time.
- Enhanced Security: Detect unauthorized activity or unusual patterns in logs, such as repeated failed logins.
- Operational Efficiency: Provide faster recovery during system incidents with immediate insights.
Through Elasticsearch and Kibana, you can implement real-time pipelines that centralize Spring Boot logs, making them accessible for immediate analysis.
Configuring Log Pipelines
A structured log pipeline ensures smooth ingestion, processing, and storage of logs in Elasticsearch. Below are the main components of a Spring Boot log pipeline.
Step 1. Centralize Logs with Elasticsearch
Elasticsearch stores logs in a structured and scalable manner, supporting high-speed searches and queries.
Step 2. Process Logs with Logstash
Logstash acts as an intermediary, transforming and enriching logs before sending them to Elasticsearch.
Step 3. Send Logs from Spring Boot
Spring Boot uses Logback by default, which integrates seamlessly with Logstash or Elasticsearch.
Configure Spring Boot Logging
Add the Logstash encoder dependency to your pom.xml:
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>7.3</version>
</dependency>
Update logback-spring.xml to configure a Logstash appender:
<configuration>
    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:5044</destination>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
    </appender>
    <root level="INFO">
        <appender-ref ref="LOGSTASH" />
    </root>
</configuration>
Logs are now configured to flow automatically from Spring Boot to Logstash.
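Under the hood, the LogstashEncoder serializes each log event as a single JSON line, which the appender then writes to the TCP socket. The stdlib-only sketch below illustrates that serialization; the field names mirror the encoder's defaults, but the helper class itself is hypothetical and only for illustration:

```java
import java.time.Instant;

public class LogEventSketch {

    // Roughly what LogstashEncoder emits per event: one JSON object per line.
    static String toJsonLine(String level, String logger, String message) {
        // Minimal escaping for backslashes and quotes; the real encoder
        // performs full JSON escaping.
        String msg = message.replace("\\", "\\\\").replace("\"", "\\\"");
        return String.format(
            "{\"@timestamp\":\"%s\",\"level\":\"%s\",\"logger_name\":\"%s\",\"message\":\"%s\"}",
            Instant.now(), level, logger, msg);
    }

    public static void main(String[] args) {
        System.out.println(toJsonLine("INFO", "com.example.OrderService", "Order created"));
    }
}
```

Seeing the wire format this way also explains the `codec => json` setting used on the Logstash input side: Logstash parses each incoming line back into a structured event.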
Real-Time Log Ingestion Using Logstash
Logstash enables you to process, filter, and enrich logs before storing them in Elasticsearch.
Step 1. Install Logstash and Elasticsearch
Deploy Elasticsearch and Logstash locally or via Docker for simplified setup.
Example Docker Compose:
version: '3.8'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.5.0
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
  logstash:
    image: docker.elastic.co/logstash/logstash:8.5.0
    ports:
      - "5044:5044"
    volumes:
      - ./logstash/pipeline:/usr/share/logstash/pipeline
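The compose file above covers Elasticsearch and Logstash only; since Kibana is used later for queries and alerting, you may also want a Kibana service. A minimal sketch, assuming the same 8.5.0 image line and Kibana's default port:

```yaml
  kibana:
    image: docker.elastic.co/kibana/kibana:8.5.0
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
```

Once up, Kibana is reachable at http://localhost:5601.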
Step 2. Configure the Logstash Pipeline
Define input, filter, and output stages in logstash/pipeline/logstash.conf:
input {
  tcp {
    port => 5044
    codec => json
  }
}

filter {
  mutate {
    add_field => {
      "environment" => "production"
      "application" => "spring-boot-app"
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "spring-logs-%{+yyyy.MM.dd}"
  }
}
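With the mutate filter above, each document indexed into Elasticsearch carries the original Logback fields plus the two added ones. A stored document might look roughly like this (field names follow the encoder's defaults; the values are illustrative):

```json
{
  "@timestamp": "2025-06-13T12:34:56.789Z",
  "level": "ERROR",
  "logger_name": "com.example.OrderService",
  "message": "Failed to reserve inventory",
  "environment": "production",
  "application": "spring-boot-app"
}
```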
Step 3. Start Logstash
Start the stack so Logstash can begin processing logs (Elasticsearch must be up before Logstash can ship events to it):
docker-compose up -d
Logs are now ingested into Elasticsearch in near-real time.
Writing Kibana Queries for Log Analysis
Once logs are ingested, Kibana enables intuitive query building to analyze and extract meaningful insights.
Step 1. Explore Logs Using Kibana Discover
Go to Kibana’s Discover tab, select the spring-logs-* index pattern, and start querying.
Example Queries:
- Find Errors Across Services:
level:"ERROR"
- Filter Logs by Time Range:
@timestamp:["2025-06-13T12:00:00" TO "2025-06-13T14:00:00"]
- Search by Trace ID:
traceId:"123abc456xyz"
Step 2. Aggregate Metrics
Use Kibana’s advanced aggregations to analyze trends.
- Filter to the response-time documents for a service, then chart the average in a visualization (e.g., Lens):
serviceName.keyword:"order-service" AND metric:"responseTime"
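The query bar only filters documents; the averaging itself happens in a visualization or an aggregation. The equivalent request in Elasticsearch's query DSL can be run from Kibana Dev Tools (this assumes a numeric responseTime field and the serviceName/metric fields shown above, which depend on your log schema):

```json
GET spring-logs-*/_search
{
  "size": 0,
  "query": {
    "bool": {
      "filter": [
        { "term": { "serviceName.keyword": "order-service" } },
        { "term": { "metric": "responseTime" } }
      ]
    }
  },
  "aggs": {
    "avg_response_time": { "avg": { "field": "responseTime" } }
  }
}
```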
Kibana’s query bar offers excellent flexibility, enabling deep analysis in real time.
Setting Up Alerting on Anomalies
Elasticsearch and Kibana allow for configurable alerts to notify you of anomalies, such as frequent errors or high response times.
Step 1. Enable Watcher (Elasticsearch Alerting)
Create monitoring triggers using Elasticsearch’s Watcher API.
Example Watcher Alert (Frequent Errors Across Services):
PUT _watcher/watch/error-alert
{
  "trigger": {
    "schedule": { "interval": "5m" }
  },
  "input": {
    "search": {
      "request": {
        "indices": ["spring-logs-*"],
        "body": {
          "query": {
            "bool": {
              "must": [
                { "match": { "level": "ERROR" } },
                { "range": { "@timestamp": { "gte": "now-5m" } } }
              ]
            }
          }
        }
      }
    }
  },
  "condition": {
    "compare": {
      "ctx.payload.hits.total": {
        "gte": 10
      }
    }
  },
  "actions": {
    "email_admin": {
      "email": {
        "to": "[email protected]",
        "subject": "High Error Rate Detected"
      }
    }
  }
}
Note the range clause: without it, the watch would count every ERROR document ever indexed rather than just those from the last five minutes.
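The email action only fires once Elasticsearch has an email account configured. A minimal sketch in elasticsearch.yml, where the SMTP host, port, and user are placeholders for your own mail provider:

```yaml
xpack.notification.email.account:
  ops_account:
    profile: standard
    smtp:
      auth: true
      starttls.enable: true
      host: smtp.example.com
      port: 587
      user: [email protected]
```

The SMTP password belongs in the Elasticsearch keystore rather than in the YAML file.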
Step 2. Use Kibana Alerting (Rules)
Set up custom triggers in Kibana for anomalies:
- Navigate to Alerts and Insights.
- Define conditions:
  - Metric-based (e.g., responseTime > 1000ms).
  - Log-based (e.g., count(level:"ERROR") > 10).
- Configure notifications (e.g., via email or Slack).
Alerts in Action:
When triggered, alerts keep your team informed, allowing proactive resolutions.
Summary
Real-time log analysis with Elasticsearch and Kibana elevates your Spring Boot application’s observability, ensuring rapid debugging, optimized performance, and robust anomaly detection. Here’s a recap of the essential steps:
- Log Pipelines: Configure pipelines from Spring Boot to Elasticsearch via Logstash.
- Real-Time Insights: Process and enrich logs in real time for actionable monitoring.
- Analyze with Kibana: Query logs for errors, trace IDs, and trends using advanced filters and visualizations.
- Anomaly Alerts: Automate notifications to detect issues proactively.
Integrate these practices to build a resilient and observable microservice architecture. Start analyzing your Spring Boot logs today with Elasticsearch and Kibana’s real-time capabilities!