Structured Logging in Spring Boot with ELK Stack
Logging is a critical part of modern application development, especially when it comes to debugging and monitoring. Traditional unstructured logging, while useful, can make it difficult to search, filter, and analyze log data in distributed systems. This is where structured logging comes in, offering a standardized way to log key application events and metadata in machine-readable formats like JSON.
This blog post will guide you through structured logging in Spring Boot using the ELK stack (Elasticsearch, Logstash, and Kibana). We’ll cover the benefits of structured logs, how to configure Logback for JSON output, how to send Spring Boot logs to Logstash, and how to visualize those logs in Kibana.
Table of Contents
- Why Use Structured JSON Logs?
- Configuring Logback for JSON Output in Spring Boot
- Sending Logs to Logstash
- Viewing Structured Logs in Kibana
- Official Documentation Links
- Summary
Why Use Structured JSON Logs?
Structured logging refers to the practice of logging events in a well-defined, machine-readable format like JSON. Unlike plaintext logs, structured logs capture contextual information using key-value pairs, providing better clarity and searchability.
Key Benefits of Structured JSON Logs:
- Searchability: Tools like Elasticsearch can index JSON logs, making it easier to search and query specific fields.
- Rich Context: Each log entry can contain metadata like request IDs, timestamps, and user IDs to provide deeper insights into application state.
- Improved Debugging: Search criteria can include specific properties, making it easier to find and trace errors.
- Compatibility: Structured logs work seamlessly with visualization tools like Kibana, where you can filter logs by fields and visualize trends interactively.
Example of Unstructured vs. Structured Logs:
- Unstructured:
2025-06-13 12:00:00 ERROR Something went wrong while processing the request
- Structured (JSON):
{ "timestamp": "2025-06-13T12:00:00Z", "level": "ERROR", "message": "Something went wrong while processing the request", "requestId": "abc123", "userId": "user45" }
Structured logs empower teams to extract meaningful insights from vast log data, especially when working with distributed Spring Boot microservices.
Configuring Logback for JSON Output in Spring Boot
Spring Boot uses Logback as its default logging framework. With the right configuration, you can easily produce structured JSON logs.
Step 1. Add the Dependency for Logback JSON Support
To enable JSON output in Logback, add the logstash-logback-encoder dependency to your pom.xml:
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>7.3</version>
</dependency>
Step 2. Configure Logback for JSON Logs
Create or modify the logback-spring.xml file in the resources folder.
Example Configuration:
<configuration>

    <!-- Console Appender -->
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
    </appender>

    <!-- File Appender -->
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>logs/application.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>logs/application.%d{yyyy-MM-dd}.log</fileNamePattern>
            <maxHistory>10</maxHistory>
        </rollingPolicy>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
    </appender>

    <root level="INFO">
        <appender-ref ref="CONSOLE" />
        <appender-ref ref="FILE" />
    </root>

</configuration>
What This Configuration Does:
- Outputs logs to both the console and a file.
- Logs are encoded in JSON format using LogstashEncoder.
- Log files roll daily and keep up to 10 days of history.
Sample JSON Log Output:
{
    "@timestamp": "2025-06-13T15:30:00.000Z",
    "level": "INFO",
    "message": "Application started successfully",
    "thread_name": "main",
    "logger_name": "com.example.MyService"
}
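If you want every entry to also carry static metadata, such as the application name or environment, LogstashEncoder supports a customFields element. As a sketch (the field values here are illustrative), the console encoder above could be expanded to:

<encoder class="net.logstash.logback.encoder.LogstashEncoder">
    <!-- Added verbatim as top-level JSON fields on every log event -->
    <customFields>{"application":"my-spring-app","environment":"dev"}</customFields>
</encoder>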
At this point, your Spring Boot application is ready to emit structured JSON logs.
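In your application code, you can also attach per-event key-value pairs, like the requestId and userId shown earlier, using the StructuredArguments helper that ships with logstash-logback-encoder. A minimal sketch (the class and method names below are illustrative):

import net.logstash.logback.argument.StructuredArguments;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class OrderService {

    private static final Logger log = LoggerFactory.getLogger(OrderService.class);

    public void processOrder(String requestId, String userId) {
        // kv() adds "requestId" and "userId" as separate JSON fields in the log event
        // and also renders them as key=value in the human-readable message.
        log.info("Processing order {} {}",
                StructuredArguments.kv("requestId", requestId),
                StructuredArguments.kv("userId", userId));
    }
}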
Sending Logs to Logstash
Once your logs are structured, the next step is to send them to Logstash for further processing and forwarding to Elasticsearch.
Step 1. Logstash Configuration
Add an input and output configuration to your logstash.conf:
input {
    file {
        path => "/path/to/logs/application.log"
        start_position => "beginning"
        sincedb_path => "/dev/null"
        # Parse each line as JSON so the individual fields are indexed in Elasticsearch
        codec => "json"
    }
}

output {
    elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "spring-logs-%{+yyyy.MM.dd}"
    }
    stdout { codec => rubydebug }
}
Step 2. Run Logstash
Start Logstash using the provided configuration:
bin/logstash -f /path/to/logstash.conf
Logstash will read the structured logs and forward them to Elasticsearch under the defined index.
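To confirm that documents are actually arriving, you can ask Elasticsearch to list the matching indices (assuming the same host and port configured above):

curl "http://localhost:9200/_cat/indices/spring-logs-*?v"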
Optional Step. Use Beats for Streaming Logs
For streaming logs in real time, consider using Filebeat:
- Install Filebeat.
- Configure Filebeat to monitor Spring Boot log files.
- Output logs directly to Logstash.
Using Filebeat reduces the resource load on Logstash and improves scalability.
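A minimal filebeat.yml sketch for this setup might look like the following. The path and host are placeholders, and it assumes Logstash is reconfigured with a beats { port => 5044 } input in place of the file input shown earlier:

filebeat.inputs:
  - type: log
    paths:
      - /path/to/logs/application.log   # the JSON log file written by Logback
    json.keys_under_root: true          # promote parsed JSON fields to the top level
    json.add_error_key: true

output.logstash:
  hosts: ["localhost:5044"]             # Logstash Beats input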
Viewing Structured Logs in Kibana
With logs flowing into Elasticsearch, Kibana provides an intuitive way to analyze, filter, and visualize your data.
Step 1. Access Kibana
Navigate to http://localhost:5601. Ensure that the Kibana server is running and connected to Elasticsearch.
Step 2. Create an Index Pattern
- Go to Management > Index Patterns.
- Create a new index pattern for your logs. For example, use spring-logs-*.
Step 3. Explore Logs
- Navigate to Discover.
- Select your newly created index pattern.
- Filter logs based on fields like level (e.g., level=ERROR) or custom metadata (e.g., userId=user45), as in the example query below.
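For example, a KQL query in the Discover search bar that combines both fields might look like this (the field names depend on what your encoder actually emits):

level : "ERROR" and userId : "user45"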
Step 4. Build Dashboards
Use the Dashboards feature to create visual representations of your logs:
- Error Trends: Show the number of WARN or ERROR logs over time.
- User Activity: Visualize logs grouped by userId.
Interactive dashboards make it simple to monitor microservices and troubleshoot issues instantly.
Official Documentation Links
For more insights, refer to the official documentation:
- Logback Documentation: https://logback.qos.ch/documentation.html
- Logstash Documentation: https://www.elastic.co/guide/en/logstash/current/index.html
- Kibana Documentation: https://www.elastic.co/guide/en/kibana/current/index.html
These resources provide detailed instructions for advanced use cases.
Summary
Structured logging with the ELK stack unlocks the full potential of your Spring Boot microservices by streamlining log management, searchability, and analysis.
Key Takeaways:
- Why Use JSON Logs? JSON logs improve searchability, debugging, and visualization.
- Logback Configuration: Use LogstashEncoder for structured JSON logging in Spring Boot.
- Logstash Integration: Process and forward logs from Spring Boot to Elasticsearch.
- Kibana Analysis: Filter and visualize logs interactively using custom dashboards.
Start implementing structured logging today to simplify monitoring and make informed decisions about your applications!