Using Logstash for large-scale log analysis along with MongoDB
This simple demo will give you an idea of how to use Logstash together with MongoDB for large-scale log analysis. MongoDB can be used to store a large volume of logs, and these logs can later be pulled into a SQL database for analytics, either on demand or for a particular time frame.
Step 1: Install the contrib package, which contains the MongoDB output plugin
Option 1: Using Script
vaibhav@ubuntu:~/logstash-1.4.2$ bin/plugin install contrib
As per https://github.com/elasticsearch/logstash/issues/1658, if this fails, use Option 2 to install the contrib package.
Option 2: Download and extract the contrib package into the same directory in which Logstash is installed.
- wget http://download.elasticsearch.org/logstash/logstash/logstash-contrib-1.4.2.tar.gz
- tar zxf logstash-contrib-1.4.2.tar.gz
Step 2: Modify the Logstash configuration file with the credentials for MongoDB as the output. The input to Logstash is a simple project log file.
Here I have used MongoLab (MongoDB as a Service).
temp.conf
input
{
  file
  {
    # Path to the log file to be read
    path => "/home/vaibhav/project2.log"
    # Read the file from the beginning instead of only new lines
    start_position => "beginning"
  }
}
output
{
  mongodb
  {
    collection => "CollectionName"
    database => "DBname"
    uri => "mongodb://username:pwd@ds053090.mongolab.com:53090/DBname"
  }
}
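Once the configuration is ready, run Logstash with this file. The command below assumes you are in the Logstash install directory and that the configuration above is saved as temp.conf:
vaibhav@ubuntu:~/logstash-1.4.2$ bin/logstash -f temp.conf
As Logstash reads the log file, each line is inserted as a document into the MongoDB collection. As a quick sanity check (a sketch only, using the placeholder host, credentials, database and collection names from the config above), you can connect with the mongo shell and inspect the stored documents:
- mongo ds053090.mongolab.com:53090/DBname -u username -p pwd
- db.CollectionName.count()
- db.CollectionName.findOne()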