Sending IIS logs to Logstash and Elasticsearch

Parsing IIS logs to extract statistics can be very useful, as these logs hold many properties for each request served by your application(s). Some of these properties, such as the duration, return status, URL and others, can be used as key metrics to visualize the overall activity of a web application.

The screenshot below is just one example of a visual display in Grafana that is based on IIS logs.

There are probably other ways to accomplish this task, but I went with nxlog after an unsuccessful attempt with Filebeat.

nxlog is reliable, lightweight software with a low footprint. I have been using nxlog for a few years now and I am very pleased with it. There are two editions: the free Community Edition and the paid Enterprise Edition.

In my case I needed to send IIS logs to Elasticsearch, so nxlog had to read the IIS logs as the source and send the output as JSON that Elasticsearch can ingest at the destination. The nxlog Enterprise Edition has a built-in module named om_elasticsearch designed to do just that. The module uses Elasticsearch's Bulk API, making it robust and capable of handling large volumes of data.
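
As a rough sketch, an om_elasticsearch output block looks along the lines below. This applies to the Enterprise Edition only; the URL value is an assumed address for this example, and directive availability (Index in particular) can vary between nxlog versions, so check the Enterprise documentation before relying on it.

<Output elastic>
Module om_elasticsearch
# Bulk API endpoint of the Elasticsearch node (address assumed for this example)
URL http://10.0.0.200:9200/_bulk
# Daily index name, mirroring the Logstash setup further down (verify this directive for your nxlog version)
Index strftime($EventReceivedTime, "nxlogiis-%Y.%m.%d")
</Output>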

After testing the Enterprise Edition, which worked well and was very simple to deploy, I decided to try the Community Edition.

This edition does not have a built-in module that can send data directly to Elasticsearch, but we can send the data to Logstash first and from Logstash to Elasticsearch. To do that I used the om_tcp module: the data is received at Logstash, parsed, and then forwarded to Elasticsearch. This configuration works well and can be easily deployed.

The nxlog user guide has a dedicated chapter covering this scenario.

Below are the two configuration files I used.

The nxlog config file defines the im_file module to read the IIS logs and the om_tcp module to send the data to Logstash.

nxlog configuration file to send IIS log data to Logstash

One thing worth pointing out: if you have multiple web sites, and therefore multiple log folders, the way to configure nxlog to process all of them is to omit the last folder from the path and specify the Recursive TRUE option. This is necessary because wildcards are supported only in the file name portion of the path, not in directory names.

The example below omits the last folder in the path (W3SVC1, W3SVC2, W3SVC3 and so on), making nxlog work on all folders.

File "C:\\inetpub\\logs\\LogFiles\\u_ex*.log"
Recursive TRUE
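
Putting it together, the multi-site variant of the Input block from the full configuration below would look roughly like this:

<Input iis>
Module im_file
# No W3SVC* folder in the path; Recursive TRUE makes im_file scan all site folders
File "C:\\inetpub\\logs\\LogFiles\\u_ex*.log"
Recursive TRUE
SavePos TRUE
# Drop the W3C header lines that start with #
Exec if $raw_event =~ /^#/ drop();
</Input>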

define ROOT C:\Program Files (x86)\nxlog

Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
LogFile %ROOT%\data\nxlog.log


<Input iis>
Module im_file
# IIS logs of a single site (W3SVC1); see the note above for handling multiple sites
File "C:\\inetpub\\logs\\LogFiles\\W3SVC1\\u_ex*.log"
SavePos TRUE
# Drop the W3C header lines that start with #
Exec if $raw_event =~ /^#/ drop();
</Input>

<Output out>
Module om_tcp
# IP address and TCP port of the Logstash host
Host 10.0.0.200
Port 7200
OutputType LineBased
</Output>

<Route iis-to-out>
Path iis => out
</Route>
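
After saving the configuration, restart the nxlog service so it picks up the changes. Assuming the default Windows service name (nxlog), this can be done from an elevated command prompt:

net stop nxlog
net start nxlog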


Logstash configuration file to send IIS log data to Elasticsearch

input
{
  tcp
  {
    type => "iis"
    port => 7200
  }
}

filter
{
  if [type] == "iis"
  {
    # Parse the IIS W3C fields out of the raw log line
    grok
    {
      match => {"message" => "%{TIMESTAMP_ISO8601:timeofevent} %{IPORHOST:hostip} %{WORD:method} %{URIPATH:page} %{NOTSPACE:query} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clientip} %{NOTSPACE:useragent} %{NOTSPACE:referer} %{NUMBER:status} %{NUMBER:response} %{NUMBER:win32status} %{NUMBER:timetaken:int}"}
    }
    # Use the log timestamp as the event time
    date
    {
      match => ["timeofevent", "YYYY-MM-dd HH:mm:ss"]
      target => "timeofevent"
    }
    #mutate {convert => {"timetaken" => "integer"}}
  }
}

output
{
  elasticsearch
  {
    # Elasticsearch host and port go here
    hosts => ["http://10.0.0.200:9200"]
    index => "nxlogiis-%{+YYYY.MM.dd}"
    #user => "elastic"
    #password => "changeme"
  }
}
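
For reference, the grok pattern above expects the fields in the order IIS writes them with its default W3C field selection (date, time, s-ip, cs-method, cs-uri-stem, cs-uri-query, s-port, cs-username, c-ip, cs(User-Agent), cs(Referer), sc-status, sc-substatus, sc-win32-status, time-taken). A hypothetical log line with made-up values looks like this:

2019-05-12 08:15:32 10.0.0.10 GET /index.html - 80 - 10.0.0.55 Mozilla/5.0+(Windows+NT+10.0) - 200 0 0 123

Once data starts flowing, you can confirm that the daily index is being created with Elasticsearch's cat API, for example: curl "http://10.0.0.200:9200/_cat/indices/nxlogiis-*?v"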

