Using Logstash for large-scale log analysis with MongoDB

This simple demo gives you an idea of how to use Logstash for large-scale log analysis together with MongoDB. MongoDB can store a large volume of logs, and those logs can later be pulled into a SQL database for analytics, either on demand or for a given time frame.

Step 1: Install the contrib package, which contains the MongoDB output plugin
Option 1: Using the plugin script
   vaibhav@ubuntu:~/logstash-1.4.2$ bin/plugin install contrib

As noted in https://github.com/elasticsearch/logstash/issues/1658, if this fails there is a second option for installing the contrib package.


Option 2: Download and unpack the contrib package into the same directory in which Logstash is installed.
  • wget http://download.elasticsearch.org/logstash/logstash/logstash-contrib-1.4.2.tar.gz
  • tar zxf logstash-contrib-1.4.2.tar.gz



Step 2: Modify the Logstash configuration file, supplying the MongoDB credentials in the output section. The input to Logstash is a simple project2.log file.
Here I have used MongoLab (MongoDB as a Service).

temp.conf

input
{
  file
  {
    path => "/home/vaibhav/project2.log"
    start_position => "beginning"
  }
}

output
{
  mongodb
  {
    collection => "CollectionName"
    database => "DBname"
    uri => "mongodb://username:pwd@ds053090.mongolab.com:53090/DBname"
  }
}

Step 3: Run Logstash with the configuration file
   vaibhav@ubuntu:~/logstash-1.4.2$ bin/logstash -f temp.conf


Step 4: For testing, manually append a log line to the log file

project2.log
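The append can also be done with a short Python snippet; the file path and the log line itself are only examples, so substitute your own:

```python
# Append a sample log line to the file Logstash is tailing.
# Both the path ("project2.log") and the log line are illustrative.
log_line = '127.0.0.1 - - [10/Oct/2014:13:55:36] "GET /index.html HTTP/1.0" 200\n'

with open("project2.log", "a") as f:  # e.g. /home/vaibhav/project2.log
    f.write(log_line)
```

Because the file input is tailing project2.log, each appended line becomes a new event.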

    




Step 5: As soon as we write to the log file, Logstash will, on every write event, create a parsed, structured log entry in MongoDB.
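The document stored in MongoDB follows Logstash's standard event shape. A sketch of what one entry might look like (field values are illustrative, not actual output from this setup):

```python
# Illustrative shape of one event document as written by the mongodb output.
# Field names follow Logstash's standard event format; values are examples.
event = {
    "message": '127.0.0.1 - - [10/Oct/2014:13:55:36] "GET /index.html HTTP/1.0" 200',
    "@version": "1",
    "@timestamp": "2014-10-10T13:55:36.000Z",
    "host": "ubuntu",
    "path": "/home/vaibhav/project2.log",
}

# Fields such as @timestamp make later time-frame queries straightforward.
print(sorted(event))
```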



Hence we saw a simple demo of how MongoDB can be used to store parsed logs from Logstash, which can then be used for further analytics.
