
Log Analytics using Elasticsearch, Logstash and Kibana

by Tanmay Deshpande

In this series of articles we are going to talk about how to perform log analytics using Elasticsearch, Logstash and Kibana. In the previous article we covered how to install Elasticsearch, Logstash and Kibana on Windows. In this article we are going to see how to load Apache log data into Elasticsearch using Logstash and then create reports in Kibana on top of the same index.



Load log data into Elasticsearch using Logstash

We are going to write a Logstash configuration which reads data from an Apache log file. It also allows us to integrate with the GeoLite City data. By doing so we can transform the IP address entries into exact geo locations and their attributes, such as longitude and latitude, city, state, country and other details of a given IP address.



The GeoLite City data file can be downloaded from this link (more specifically, this link).



Unzip the file and save it to a location of your choice on your disk.

The next thing we are going to do is write a Logstash config file as shown below.

input {
  file {
    type => "apache"
    path => [ "logs/file/path" ] # e.g. path => [ "E:/mylogs2.txt" ]
    start_position => "beginning" # read the file from the top, not only new lines
  }
}


filter {
  # Parse each line using the Apache combined log format
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # Use the request timestamp from the log as the event timestamp
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
  # Look up the client IP in the GeoLite City database and add geo attributes
  geoip {
    source => "clientip"
    target => "geoip"
    database => "geocity/data/file/path" # e.g. database => "E:/logstash/GeoLiteCity.dat"
    # Build a [longitude, latitude] array, the order Elasticsearch expects for geo points
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
  }
  # Convert the coordinate strings to floats
  mutate {
    convert => [ "[geoip][coordinates]", "float" ]
  }
}

output {
  # Index the events into the local Elasticsearch instance
  elasticsearch {
    host => "localhost"
  }
  # Also print each event to the console for debugging
  stdout { codec => rubydebug }
}

The above config first reads the Apache log file at the given path and parses each line into the Apache combined log format. Before adding each event to Elasticsearch, it maps the client IP against the GeoLite City data and adds the corresponding geo attributes to the event.
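For reference, a line in an Apache combined-format log looks roughly like the one below (the values here are only placeholders). The leading IP address is what grok captures as clientip and what the geoip filter then resolves to a location.

203.0.113.10 - - [10/Oct/2015:13:55:36 +0530] "GET /index.html HTTP/1.1" 200 2326 "http://www.example.com/start.html" "Mozilla/5.0 (Windows NT 6.1; WOW64)"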

Now let's execute this config using Logstash and insert records into Elasticsearch. Here we assume that you have followed the previous tutorial and Elasticsearch is up and running. Also, let's assume that we have saved the above config as logstash-apache-geo-file.conf.

> logstash -f logstash-apache-geo-file.conf
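
Optionally, depending on your Logstash version, you can validate the config syntax before running it (older 1.x/2.x releases accept a --configtest flag, newer releases use -t or --config.test_and_exit):

> logstash -f logstash-apache-geo-file.conf --configtest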

Now it will start reading from the file and inserting data into Elasticsearch. Here is a screenshot of the command prompt output.



Now go to the Elasticsearch Head GUI and browse the indexes generated by Logstash. By default, Logstash generates a new index for each day's data. You can also click on any entry to check that the geo location attributes have been added, as shown below.
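
If you prefer the command line over the Head GUI, the same checks can be done with Elasticsearch's REST API; for example, listing the daily logstash-* indices and fetching one indexed event to verify that the geoip fields are present (localhost:9200 assumes a default local Elasticsearch install):

> curl "http://localhost:9200/_cat/indices/logstash-*?v"
> curl "http://localhost:9200/logstash-*/_search?size=1&pretty"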




Now go to the Kibana dashboard and configure the index pattern and the timestamp attribute as shown in the screenshot below.
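
The index pattern to use here is logstash-* and the time field is @timestamp, which the date filter in our config populates from the request timestamp. As an optional sanity check before configuring Kibana, you can confirm the field exists in the index mapping:

> curl "http://localhost:9200/logstash-*/_mapping?pretty"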


Now simply create the visualizations that you want and build a dashboard as shown below.


And here is another example dashboard.


And that's all! You can keep repeating this process to build more great dashboards.




