Archive for June, 2016

You know that the Logstash, Elasticsearch and Kibana stack, a.k.a. ELK, is a widely used log analysis tool set. This how-to guide explains how to publish logs of WSO2 Carbon servers to the ELK platform using Filebeat.

# Setup ELK

You can download the Logstash, Elasticsearch and Kibana binaries one by one and set up ELK yourself. But I am a Docker fan, so I use a preconfigured Docker image. Most people use the sebp/elk Docker image. By default this image does not come with a Logstash input for receiving Beats events, so I added the Logstash configuration below to receive Beats events and created my own Docker image, udaraliyanage/elk. You can either use my Docker image or add the configuration below to the default image.

input {
  beats {
    type => "beats"
    port => 7000
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
  }
  stdout { codec => rubydebug }
}

The above configuration makes Logstash listen on port 7000 (the input section) and forward the logs to Elasticsearch, which runs on port 9200 inside the Docker container.
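
If you would rather build the image yourself than pull mine, a minimal sketch looks like this. It assumes the Beats input above is saved next to the Dockerfile as 10-beats-input.conf (a file name I picked for illustration) and that sebp/elk picks up Logstash pipeline files from /etc/logstash/conf.d, which is the layout its documentation describes.

# Dockerfile (sketch): bake the Beats input into sebp/elk
FROM sebp/elk
COPY 10-beats-input.conf /etc/logstash/conf.d/10-beats-input.conf
EXPOSE 7000

Build it with `docker build -t my-elk .` and run it with the same port mappings shown below.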

Now start the Docker container:
`docker run -d -p 7000:7000 -p 5601:5601 udaraliyanage/elklog4`

port 7000 => Logstash (Beats input)
port 5601 => Kibana
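
Once the container is up, a quick sanity check is to list the running containers and poke Kibana:

`docker ps`
`curl -I http://localhost:5601`

If Kibana answers on port 5601 you can move on to the Carbon side.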

# Setup Carbon Server to publish logs to Logstash

* Download the Filebeat deb package from [1] and install it:
dpkg -i filebeat_1.2.3_amd64.deb

* Create a Filebeat configuration file /etc/carbon_beats.yml with the following content.

Make sure to provide the correct wso2carbon.log file location in the paths section. You can list multiple Carbon log files as well if you are running multiple Carbon servers on your machine.

filebeat:
  prospectors:
    -
      paths:
        - /opt/wso2as-5.3.0/repository/logs/wso2carbon.log
      input_type: log
      document_type: appserver_log
output:
  logstash:
    hosts: ["localhost:7000"]
  console:
    pretty: true
shipper:
logging:
  files:
    rotateeverybytes: 10485760 # = 10MB
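
Then start Filebeat against that configuration (a sketch; the -e flag just makes Filebeat log to stderr so you can watch what it is doing):

`sudo filebeat -e -c /etc/carbon_beats.yml`

Filebeat will tail wso2carbon.log and ship each new line to the Logstash Beats input on port 7000.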

* Now start the Carbon server: `./bin/wso2server.sh start`

# View logs from Kibana by visiting http://localhost:5601
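
Since the Logstash elasticsearch output above does not set an explicit index, events end up in the default logstash-%{+YYYY.MM.dd} indices, so the index pattern to configure in Kibana is the default logstash-*.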

[1] https://www.elastic.co/products/beats/filebeat
[2] https://hub.docker.com/r/sebp/elk/

I assume you know that the Logstash, Elasticsearch and Kibana stack, a.k.a. ELK, is a widely used log analysis tool set. This how-to guide explains how to publish logs of WSO2 Carbon servers to the ELK platform via the log4j SocketAppender.

# Setup ELK

You can download the Logstash, Elasticsearch and Kibana binaries one by one and set up ELK yourself. But I am a Docker fan, so I use a preconfigured Docker image. Most people use the sebp/elk Docker image. By default this image does not come with a Logstash input for log4j events, so I added the Logstash configuration below to receive log4j events and created my own Docker image, udaraliyanage/elk. You can either use my Docker image or add the configuration below to the default image.

input {
  log4j {
    mode => server
    host => "0.0.0.0"
    port => 6000
    type => "log4j"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
  }
  stdout { codec => rubydebug }
}

The above configuration makes Logstash listen on port 6000 (the input section) and forward the logs to Elasticsearch, which runs on port 9200 inside the Docker container.

Now start the Docker container:
`docker run -d -p 6000:6000 -p 5601:5601 udaraliyanage/elklog4j`

port 6000 => Logstash (log4j input)
port 5601 => Kibana

# Setup Carbon Server to publish logs to Logstash

* Download the Logstash json event layout dependency jar from [3] and place it in $CARBON_HOME/repository/components/lib. The SocketAppender configured below converts log events to a binary format and streams them to a remote log4j host, in our case Logstash listening on port 6000.

* Add the following log4j appender configuration to the Carbon server by editing the $CARBON_HOME/repository/conf/log4j.properties file.

log4j.rootLogger=INFO, CARBON_CONSOLE, CARBON_LOGFILE, CARBON_MEMORY,tcp

log4j.appender.tcp=org.apache.log4j.net.SocketAppender
log4j.appender.tcp.layout=org.wso2.carbon.utils.logging.TenantAwarePatternLayout
log4j.appender.tcp.layout.ConversionPattern=[%d] %P%5p {%c} - %x %m%n
log4j.appender.tcp.layout.TenantPattern=%U%@%D[%T]
log4j.appender.tcp.Port=6000
log4j.appender.tcp.RemoteHost=localhost
log4j.appender.tcp.ReconnectionDelay=10000
log4j.appender.tcp.threshold=DEBUG
log4j.appender.tcp.Application=myCarbonApp

RemoteHost => the Logstash server we want to publish events to; in our case localhost, with the appender connecting to port 6000.
Application => the name of the application publishing the logs. It is useful for whoever views the logs in Kibana, so they can tell which server a particular log line came from.
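
Two other properties are worth a note: threshold=DEBUG ships every log event at DEBUG level or above, and ReconnectionDelay=10000 makes the SocketAppender retry the connection to Logstash every 10 seconds (the value is in milliseconds), so Carbon keeps retrying if the ELK container is not up yet.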

* Now start the Carbon server: `./bin/wso2server.sh start`

# View logs from Kibana by visiting http://localhost:5601

[1] https://hub.docker.com/r/sebp/elk/
[2] https://www.elastic.co/guide/en/logstash/current/plugins-inputs-log4j.html
[3] http://mvnrepository.com/artifact/net.logstash.log4j/jsonevent-layout/1.7