In Part 1, we read a brief overview of the ELK Stack, along with some other tools that are useful for transporting logs, such as Syslog, Syslog-ng and Log4J. Now, let us set up the tools so we can proceed with logging.
Setup
Here, you can read about setting up the tools we will be using for logging.
Installing ElasticSearch
You can download ElasticSearch in various forms like ZIP, TAR, DEB or RPM. On my Debian installation, I will simply download and install the DEB file as:
dpkg -i elasticsearch-X.X.X.deb
If you wish to install it via Package Manager, see the Repositories section.
Installing Logstash
You can download Logstash from here. Similar to installing ElasticSearch, you can install it as:
dpkg -i logstash-X.X.X.deb
Or if you wish to install it via Package Manager, see the Repositories section.
Installing Kibana
You can download Kibana from here. Similar to installing ElasticSearch and Logstash, you can install it as:
dpkg -i kibana_X.X.X_amd64.deb
Or if you wish to install it via Package Manager, see the Repositories section.
Installing Syslog-ng
You can either install the package for Syslog-ng OSE (Open Source Edition), or compile the source code yourself. Read more about installing and building here.
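Once installed, a syslog-ng configuration is built from sources, destinations and log paths. As a small taste of what we will use later in this series, an illustrative snippet forwarding local system logs over TCP might look like the following — the destination name and port here are assumptions, not values from any real setup:

```
# Illustrative syslog-ng snippet; the name d_logstash and port 5140 are made up
source s_local { system(); internal(); };
destination d_logstash { tcp("127.0.0.1" port(5140)); };
log { source(s_local); destination(d_logstash); };
```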
Installing Log4J
In order to use Log4J in your Java based application, you can use the JARs provided in their downloads. You can download and read more about Log4J here.
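Log4J itself is configured via a properties (or XML) file. A minimal, illustrative log4j.properties that logs to the console could look like this — the appender name "console" is arbitrary:

```
# Minimal illustrative log4j.properties; the appender name is arbitrary
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %-5p %c - %m%n
```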
Let's Start!
Now that we are done with setting up most of the tools, let us proceed with our objective.
Logstash
As already explained before, Logstash is a pipeline. It processes any log/event data.
In its most basic form, it comprises 3 general phases:
- Input – Takes input data from various popular sources like File, Syslog, HTTP, JDBC, TCP.
- Filter – Filters the Data according to various operations like Grok, Date, Mutate etc.
- Output – Outputs the data to various popular destinations like ElasticSearch, CSV, File, TCP, HTTP, Email etc.
Check out the diagram below for more clarification:
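Before writing a real configuration, it may help to see the smallest possible pipeline: one input, no filter, one output. Assuming Logstash's defaults, this reads lines you type on the keyboard and prints them back as structured events:

```
input { stdin { } }
output { stdout { codec => rubydebug } }
```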
Running your first Logstash Config
Logstash uses a configuration file to define all the Inputs, Filters and Outputs. So, you need to create a configuration file, e.g my_logstash.conf. Its basic structure is given below:
input {
  # Input plugins config here
}
filter {
  # Filter plugins config (if any used)
}
output {
  # Output plugins config here
}
Let us take a working example for our first configuration. We will make a configuration as defined below:
- Input – We will take the input from a local file, in this case an Apache access log located at /var/log/apache2/access.log.
- Filter – We will use the Grok filter to parse an Apache Access Log and read useful information from the Log.
- Output – At last, we will output this to STDOUT(console screen).
So, in order to have such a scenario, we will be using the basic Logstash configuration given below:
input {
  file {
    path => ["/var/log/apache2/access.log"]
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
So, what does this configuration do?
- We are monitoring the file located at /var/log/apache2/access.log for changes, using the File Input Plugin. So, whenever changes occur in this file, they will also be directed to the Logstash input for us.
- We are using the very popular Grok Filter Plugin to parse the input logs. It has a very nice collection of patterns/regexes that you can use to parse any data. You can find the collection here: https://github.com/elastic/logstash/blob/v1.4.2/patterns/grok-patterns. We are using the %{COMBINEDAPACHELOG} pattern, which will parse an Apache access log for us.
- At last, we are sending the output to standard output, i.e. to the shell running Logstash, using the Stdout Output Plugin and the Rubydebug Codec.
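To get a feel for which fields %{COMBINEDAPACHELOG} extracts, here is a rough sketch in plain shell — the log line is made up, and splitting on whitespace with awk is only a crude approximation of what grok's patterns actually do:

```shell
# A made-up Apache combined log line (all values are dummies)
line='127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326 "http://example.com/" "Mozilla/4.08"'

# Grok names these fields clientip, verb and response, among others
clientip=$(echo "$line" | awk '{print $1}')
verb=$(echo "$line" | awk '{print $6}' | tr -d '"')
response=$(echo "$line" | awk '{print $9}')
echo "clientip=$clientip verb=$verb response=$response"
# prints: clientip=127.0.0.1 verb=GET response=200
```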
Now, let us run our first Logstash configuration. You can start Logstash with your configuration simply by running:
bin/logstash -f my_logstash.conf
* Use sudo if required.
If your configuration is valid, then Logstash will run, printing Logstash startup completed.
Now, try adding some dummy Apache access logs to /var/log/apache2/access.log using vi, nano or any other editor. As soon as you save the file, you will see that Logstash processes your added logs and outputs them to the console.
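For reference, a parsed event printed with the rubydebug codec looks roughly like the following — the exact set of fields depends on your Logstash version, and the values here are purely illustrative:

```
{
       "message" => "127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] \"GET /index.html HTTP/1.0\" 200 2326",
      "clientip" => "127.0.0.1",
          "verb" => "GET",
       "request" => "/index.html",
      "response" => "200",
         "bytes" => "2326"
}
```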
* If you are working remotely via SSH or PuTTY, run Logstash in one session and edit the access.log file in another. Logstash must be running to see the changes.
Testing your Config before Running
Logstash allows you to test your configuration for errors by using the --configtest parameter. So, it is always best to run your command like this before the final run:
bin/logstash -f my_logstash.conf --configtest
If your config is valid, Logstash will output Configuration OK to your screen.
So, using various combinations of Input, Filter and Output Plugins provided by Logstash, we can make any configuration as per our needs.
Now we know how to read logs from a local file and send them to Logstash. In the next blog of this series, we will learn about sending logs remotely from different servers using Syslog-ng and Log4J.
Tags: elasticsearch, kibana, log4j, logging, logstash, syslog, syslog-ng