Our Story So Far
So, if you are just joining us, what we are building is meant to mimic the setup described here. We have done a bit of setup in parts 1 and 2:
– OpenStack Lumberjack Part 1
– OpenStack Lumberjack Part 2
In this post, we will install Elasticsearch, Kibana, and Logstash, and configure all of the above to drink in our logs from syslog.
Getting Started
For this work, you will need to have followed all the steps from parts 1 and 2.
Once you have completed those steps, log into the controller VM, and let’s get started:
Installing Java
One of the prerequisites for this logging enterprise is to have Java 7 installed. To do that, run the following commands:
sudo apt-get install python-software-properties software-properties-common -y
sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
echo debconf shared/accepted-oracle-license-v1-1 select true | sudo debconf-set-selections
echo debconf shared/accepted-oracle-license-v1-1 seen true | sudo debconf-set-selections
sudo apt-get -q -y install oracle-java7-installer
sudo bash -c "echo JAVA_HOME=/usr/lib/jvm/java-7-oracle/ >> /etc/environment"
What these commands do in order:
– Install 3rd party apt-repo support
– Add the Java repository
– Update our repositories
– Accept the license so that we’re not prompted for it
– Install Java 7
– Set the Java environment variable
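At this point, it is worth a quick sanity check that Java is actually installed. A minimal verification, assuming the installer put Java 7 on your PATH:

java -version
# Should print something like: java version "1.7.0_xx"
grep JAVA_HOME /etc/environment
# Should show the JAVA_HOME line we appended above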
Installing Elasticsearch
Next on our prerequisites list for Logstash is Elasticsearch, which will be used for storing said logs. At the time of this writing, the current version of Logstash (1.4.1) likes to have Elasticsearch 1.1.1.
To install Elasticsearch, we run the following commands:
wget -O - http://packages.elasticsearch.org/GPG-KEY-elasticsearch | sudo apt-key add -
echo 'deb http://packages.elasticsearch.org/elasticsearch/1.1/debian stable main' | sudo tee /etc/apt/sources.list.d/elasticsearch.list
sudo apt-get update
sudo apt-get install -y elasticsearch=1.1.1
sudo service elasticsearch start
sudo update-rc.d elasticsearch defaults 95 10
These commands did the following:
– Installed the package signing key for Elasticsearch and added the Elasticsearch apt-repo
– Updated apt and installed Elasticsearch 1.1.1
– Started the Elasticsearch service
– Configured Elasticsearch to start on system startup
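Before moving on, it does not hurt to confirm Elasticsearch is actually answering. A quick check against its HTTP API, which listens on port 9200 by default:

curl -XGET http://localhost:9200/
# A healthy node responds with a small JSON blob that includes
# "number" : "1.1.1" under the "version" key. If curl cannot connect,
# give the service a few seconds to finish starting and try again.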
Installing Kibana
Now, we install the log visualizer, Kibana. To do that, we need to run the following commands:
cd ~; wget http://download.elasticsearch.org/kibana/kibana/kibana-latest.zip
unzip kibana-latest.zip
sudo mkdir -p /var/www/kibana
sudo cp -R ~/kibana-latest/* /var/www/kibana/
sudo tee /etc/apache2/conf-enabled/kibana.conf > /dev/null <<EOF
Alias /kibana /var/www/kibana
<Directory /var/www/kibana>
    Order allow,deny
    Allow from all
</Directory>
EOF
sudo service apache2 restart
There is a lot going on in there, so let’s break it down:
– The first few lines download and unpack Kibana, then copy its files into a place we can serve them from with Apache.
– The remaining lines drop in an Apache configuration & restart the Apache service.
Now you should be able to see the Kibana splash page by loading http://172.16.0.200/kibana/. That is, if you are following along exactly; otherwise it will be running at /kibana/ on whatever host your web server is configured on.
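One gotcha worth knowing about: Kibana 3 runs entirely in your browser and talks to Elasticsearch directly, on port 9200 by default. If your browser cannot reach Elasticsearch at the page’s hostname, you can point Kibana elsewhere by editing config.js. A sketch, assuming the default kibana-latest layout:

grep 'elasticsearch:' /var/www/kibana/config.js
# The default is roughly: elasticsearch: "http://"+window.location.hostname+":9200",
# Edit this line to a full URL if Elasticsearch lives on another host.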
Installing Logstash
The last piece of scaffolding we need in place for ‘the things’ to work is Logstash. Logstash is responsible for receiving, filtering, and parsing our logs. So,
echo 'deb http://packages.elasticsearch.org/logstash/1.4/debian stable main' | sudo tee /etc/apt/sources.list.d/logstash.list
sudo apt-get update
sudo apt-get install -y logstash=1.4.1-1-bd507eb
If you’ve been following along, the above follows the same pattern we’ve used so far:
– Add the apt-repo
– Update our apt info
– Install logstash
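As a quick sanity check that apt pulled the exact version we pinned:

dpkg -s logstash | grep '^Version'
# Should print: Version: 1.4.1-1-bd507eb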
Configuring Logstash
Now we have all the pieces in place. However, at this stage they’re not all that useful. That is, we have not yet told it to do anything for us. To that end, we will run the command below to tell Logstash to listen on port 9000 and filter for things that look like syslog info:
sudo tee /etc/logstash/conf.d/10-syslog.conf > /dev/null <<'EOF'
input {
  tcp {
    port => 9000
    type => syslog
  }
  udp {
    port => 9000
    type => syslog
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
EOF
sudo service logstash restart
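If Logstash refuses to start, a typo in the config is the usual suspect. Logstash 1.4 can validate a config file without running it; a sketch, assuming the package landed in /opt/logstash as the 1.4.x debs do:

/opt/logstash/bin/logstash agent --configtest -f /etc/logstash/conf.d/10-syslog.conf
# Should report that the configuration is OK; otherwise it points at the bad bit.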
The file we created, 10-syslog.conf, contains three sections:
– Input
– Filter
– Output
In the input section, we tell Logstash to listen on both tcp and udp port 9000 for incoming syslog messages. We then set up a filter to ingest syslog messages, matching on the standard syslog format and adding fields that record when each message was received and which host it came from. Finally, the output section will dump these logs into our Elasticsearch setup.
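Once Logstash has restarted, you can confirm it is listening where we told it to. A quick check (net-tools ships with stock Ubuntu 14.04):

sudo netstat -tulnp | grep 9000
# Expect one tcp and one udp socket bound to port 9000, owned by a java process.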
Puking Logs from rsyslog to Logstash
Finally, we need to send things from where we are collecting them in rsyslog into logstash. To do that, run the following:
echo "*.* @@localhost:9000" | sudo tee -a /etc/rsyslog.d/50-default.conf
sudo restart rsyslog
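The @@ prefix tells rsyslog to forward over TCP (a single @ would mean UDP), which matches the tcp input we configured above. To confirm the whole pipeline end to end, write a test message into syslog and look for a Logstash index in Elasticsearch:

logger "lumberjack test message"
curl -s 'http://localhost:9200/_cat/indices?v'
# You should see a logstash-YYYY.MM.DD index whose docs.count grows as logs arrive.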
Verify We’re Seeing Logs in Kibana
So, we’ve done quite a bit of work here (and over the last few posts). If you’ve followed along to this point, you should be able to load up the Kibana page (http://controller_ip/kibana) and see your logs flowing in.
Summary
In this post, we configured a bunch of tools to collect, filter, and make our logs searchable. While this is super powerful and super useful as it is, in our next post we will add some filters / dashboards to start adding some OpenStack-specific intelligence to things.