
You probably know that the Logstash, Elasticsearch and Kibana stack, a.k.a. ELK, is a widely used log analysis tool set. This howto guide explains how to publish logs of WSO2 Carbon servers to the ELK platform.

# Setup ELK

You can download the Logstash, Elasticsearch and Kibana binaries one by one and set up ELK yourself. But I am a Docker fan, so I use a preconfigured Docker image. Most people use the sebp/elk Docker image. By default that image does not come with a Logstash receiver for log4j events, so I added the Logstash configuration below to receive log4j events and built my own Docker image, udaraliyanage/elk. You can either use my Docker image or add the Logstash configuration below to the default image.

input {
  log4j {
    mode => server
    host => "0.0.0.0"  # listen address; original value missing, all-interfaces assumed
    port => 6000
    type => "log4j"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
  }
  stdout { codec => rubydebug }
}

The above configuration causes Logstash to listen on port 6000 (the input section) and forward the logs to Elasticsearch, which runs on port 9200 inside the Docker container.

Now start the Docker container:
`docker run -d -p 6000:6000 -p 5601:5601 udaraliyanage/elklog4j`

* port 6000 => Logstash
* port 5601 => Kibana

# Setup Carbon Server to publish logs to Logstash

* Download the Logstash JSON event layout dependency JAR from [3] and place it in $CARBON_HOME/repository/components/lib. This library converts the log events to a binary format and streams them to a remote log4j host, in our case the Logstash server listening on port 6000.

* Add the following log4j appender configuration to the Carbon server by editing the $CARBON_HOME/repository/conf/ file.

log4j.appender.tcp.layout.ConversionPattern=[%d] %P%5p {%c} - %x %m%n

RemoteHost => the Logstash server we want to publish events to; in our case localhost:6000.
Application => the name of the application that publishes the log. This is useful for whoever views the logs in Kibana, so they can tell which server a particular log entry came from.
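The post only shows the layout line of the appender; for context, here is a hypothetical sketch of what the full appender block could look like using log4j's stock SocketAppender (which serializes events in the binary format the Logstash log4j input expects). The appender name `tcp`, the `CARBON_CONSOLE` reference, and the host, port, and application values are assumptions — adjust them to match your actual configuration file:

```properties
# Hypothetical sketch — values below are assumptions, not from the original post
log4j.rootLogger=INFO, CARBON_CONSOLE, tcp

# Stream serialized log4j events to the Logstash log4j input on port 6000
log4j.appender.tcp=org.apache.log4j.net.SocketAppender
log4j.appender.tcp.RemoteHost=localhost
log4j.appender.tcp.Port=6000
log4j.appender.tcp.ReconnectionDelay=10000
# Application tag shows up in Kibana so you can tell which server sent a log line
log4j.appender.tcp.Application=wso2-carbon-server
log4j.appender.tcp.layout=org.apache.log4j.PatternLayout
log4j.appender.tcp.layout.ConversionPattern=[%d] %P%5p {%c} - %x %m%n
```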

* Now start the Carbon server: `./bin/ start`

# View logs in Kibana by visiting http://localhost:5601



Whether you are a software developer, a sysadmin or a devops engineer, you have to deal with log files to see what's happening in your system and to analyse its behaviour. Going through plain log files (e.g. log4j output) and analysing them is a bit painful, since all the lines are printed in the same colour.

There are better log tools, such as Multitail. However, if you are logged into a remote server via ssh, you don't have the luxury of using those tools. This is where the Linux awk utility comes to the rescue. AWK is a text processing tool that works on files and streams: it matches a specified pattern and executes an action on the matching lines. If no pattern is specified, it applies the action to every line.
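The pattern/action behaviour can be seen with a one-liner (the input lines here are made up for illustration):

```shell
# pattern { action }: the action runs only on lines matching the pattern;
# NR is the current line number
printf 'ok\nERROR: disk full\nok\n' | awk '/ERROR/ {print NR ": " $0}'
# → 2: ERROR: disk full
```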

Print with colours

The script below tails the WSO2 ELB log file and prints INFO lines in green, ERROR and FATAL lines in red, and WARN lines in yellow.

tail -f /opt/wso2elb-2.0.4/repository/logs/wso2carbon.log | awk '
/INFO/ {print "\033[32m" $0 "\033[39m"}
/ERROR/ {print "\033[31m" $0 "\033[39m"}
/FATAL/ {print "\033[31m" $0 "\033[39m"}
/WARN/ {print "\033[33m" $0 "\033[39m"}'
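You can try the same colour rules without a live log file by piping fabricated sample lines straight into awk (the sample lines below are made up):

```shell
# Made-up sample lines fed through the same colour rules:
# INFO comes out green, ERROR comes out red
printf 'INFO server started\nERROR something failed\n' | awk '
/INFO/ {print "\033[32m" $0 "\033[39m"}
/ERROR/ {print "\033[31m" $0 "\033[39m"}'
```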
Log with colour

Filter with colours

The script below extends the one above: in addition to the previous rules, it matches log messages containing the word "joined" and shows them in blue.

tail -f /opt/wso2elb-2.0.4/repository/logs/wso2carbon.log | awk '
/INFO/ {print "\033[32m" $0 "\033[39m"}
/ERROR/ {print "\033[31m" $0 "\033[39m"}
/FATAL/ {print "\033[31m" $0 "\033[39m"}
/WARN/ {print "\033[33m" $0 "\033[39m"}
/joined/ {print "\033[34m" $0 "\033[39m"}'

Note: some Linux systems, such as Ubuntu, have output buffering enabled by default for awk. This may result in content at the end of the log file not showing up in the console. To avoid it, you can use gawk, the GNU AWK implementation, or turn off standard output buffering as below.

tail -f /opt/wso2elb-2.0.4/repository/logs/wso2carbon.log | stdbuf -o0 awk '<awk_parameters>'
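Another workaround (my own addition, not from the original post) is to call awk's built-in fflush() after each print, which flushes the output buffer so matched lines appear immediately even when stdout is a pipe. Shown here with a made-up sample line instead of tail so it is self-contained:

```shell
# fflush() forces awk to flush its output buffer after every print;
# add it to each action in the colour scripts above
printf 'WARN low disk space\n' | awk '/WARN/ {print "\033[33m" $0 "\033[39m"; fflush()}'
```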

To log and see the SOAP messages that come in to and go out from WSO2 ESB, add the following entries to the file


You can find some more logging options in [1].