In previous blogs of this series, we learned how to use Logstash, Syslog-ng, and Log4j to build a centralized logging setup. We used Grok filters to parse and filter our logs and send them to Elasticsearch.
Now, let us see how we can visualize those logs with Kibana to pull the most meaning out of them!
Kibana
Kibana is a great tool for real-time data analysis and visualization. You can create flexible bar charts, line and scatter plots, histograms, pie charts, and maps for the data stored in Elasticsearch.
Creating Index Patterns
In order to see data from any index, we must first create an index pattern. To create one, open the Kibana UI in your browser and go to Settings → Indices → Configure an index pattern.
In our previous blog, we were sending data to Elasticsearch with the index name logstash-%{+YYYY.MM}, so the indices get created as logstash-2016.08, logstash-2016.09, and so on.
So, in Kibana, specify the wildcard pattern logstash-* and tick the option Index contains time-based events.
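To recap, the Logstash output from the previous post that produces these monthly indices looked roughly like this (the host address here is an assumption; use your own Elasticsearch endpoint):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]        # assumed host; adjust for your setup
    index => "logstash-%{+YYYY.MM}"    # one index per month
  }
}
```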
Discover
The Discover tab shows all the data that falls within the time range selected at the top right of the Kibana UI. It displays a bar graph of event count versus time, known in Kibana as a date histogram.
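Under the hood, this graph corresponds to an Elasticsearch date_histogram aggregation. Here is a minimal sketch in Python of an equivalent query body; the function name and the example date range are illustrative, and @timestamp is the field Logstash adds by default:

```python
import json

def date_histogram_query(field="@timestamp", interval="1h",
                         start="2016-08-01", end="2016-09-01"):
    """Build a query body that mimics Kibana's Discover bar graph:
    count of events per time bucket within the selected range."""
    return {
        "size": 0,  # we only want the aggregation, not the raw hits
        "query": {"range": {field: {"gte": start, "lt": end}}},
        "aggs": {
            "events_over_time": {
                "date_histogram": {"field": field, "interval": interval}
            }
        },
    }

print(json.dumps(date_histogram_query(), indent=2))
```

You could POST this body to an index's _search endpoint to get the same counts Kibana plots.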
Visualize
The Visualize tab lets you create different visualizations of the data: area charts, data tables, line charts, vertical bar charts, pie charts, and more. You can create any number of visualizations and easily save them, then use them to build custom dashboards.
Using Vertical Bar Chart
When you click Vertical Bar Chart, you are presented with two sections on the left: Metrics and Buckets.
As usual, on the Y-axis we keep the default metric, Count of events. On the X-axis, we aggregate as a Date Histogram using the @timestamp field, with the Interval set to Auto.
Sub Aggregation via Sub Buckets
Let us further divide the bars by different terms. For example, let us split the data according to the HTTP response code returned for each request.
Click Add Sub-buckets in the Buckets section and select Split Bars. Then, under Sub Aggregation, select Terms and choose the response field from the dropdown. Finally, click the green Play button to apply the visualization.
You will see the bar chart divided into sections according to the different HTTP status codes found in an Apache access log, such as 200, 301, 403, and 404. Check out the example screenshot below:
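This split corresponds to nesting a terms aggregation inside the date histogram. A sketch of the equivalent query body (function name is illustrative; the response field comes from the Grok-parsed Apache access log):

```python
import json

def split_bars_query(time_field="@timestamp", term_field="response",
                     interval="1h"):
    """Mimic the split bar chart: within each time bucket, count events
    per HTTP response code via a terms sub-aggregation."""
    return {
        "size": 0,
        "aggs": {
            "events_over_time": {
                "date_histogram": {"field": time_field, "interval": interval},
                "aggs": {
                    "by_response": {
                        # top 10 response codes per time bucket
                        "terms": {"field": term_field, "size": 10}
                    }
                },
            }
        },
    }

print(json.dumps(split_bars_query(), indent=2))
```

Each date bucket in the response then carries its own per-status-code counts, which is exactly what the stacked bar sections show.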
Using Line Chart
Using the same Metrics, Buckets, and Sub-buckets settings we used for the bar chart above, we can also visualize the data as a line chart. See the example screenshot of a line chart split by HTTP status code below:
Using Pie Chart
Using the same metric and aggregation settings (with Split Slices in place of Split Bars), we can also visualize the data as a pie chart. See the example screenshot of a pie chart split by HTTP status code below:
End of the story
So, here we are. From the first blog of this series up to here, we learned how to use the ELK (Elasticsearch, Logstash, and Kibana) stack for centralized logging and data analysis. We also learned how to use Syslog-ng and Log4j alongside Logstash and tried some basic Logstash configurations. While learning about filters in Logstash, we built custom Grok patterns with regular expressions to parse and filter a basic Nginx error log.
In the end, once all the data had arrived in Elasticsearch, we visualized it in Kibana using bar charts, pie charts, line charts, and more.
If you found this series to be helpful, spread the knowledge! Questions are welcome!
Tags: analytics, elasticsearch, kibana, log4j, logging, logs, logstash