Scalable Logging in Rails using Log4r and Graylog

Rails' default logger is quite simple, but it becomes difficult to debug in production as the app grows big. Imagine you have 5 different app servers and you have to grep all the log files. Some simple shell scripting could ease the pain, but the log files remain messy, and they still won't help you analyse which URL returned a 500, how long an API call took, and so on. It would be great to have all logs in a centralised location, easing the developers' pain. Yes! There is a gem for that! 🙂

Log4r is a powerful and flexible Ruby logging gem inspired by the popular Java library log4j. It supports multiple output destinations per log level, custom log levels and more. Check out the library.

You might also want to check out Graylog. Graylog is an open source log management solution that can be used to monitor your logs. It is built on top of Java, MongoDB and Elasticsearch. To know more about it, click here. Graylog also comes with a web interface. Check out the link below to see how to set up the Graylog web interface using nginx as a reverse proxy.

I used the gem 'log4r-gelf' along with log4r to send my logs to Graylog.

1. Add log4r and 'log4r-gelf' to your Gemfile:

gem 'log4r'
gem 'log4r-gelf'

and run

bundle install 

2. Now create a YAML file, config/log4r.yml:

log4r_config:
  loggers:
    - name: production
      level: INFO
      trace: false
      outputters:
        - production
        - gelf

    - name: development
      level: DEBUG
      trace: true
      outputters:
        - development

  outputters:
    - type: DateFileOutputter
      name: production
      filename: production.log
      dirname: "log"
      formatter:
        date_pattern: '%H:%M:%S'
        pattern: '%d %l: %m'
        type: PatternFormatter

    - type: DateFileOutputter
      name: development
      filename: development.log
      dirname: "log"
      formatter:
        date_pattern: '%H:%M:%S'
        pattern: '%d %l: %m'
        type: PatternFormatter

    - type: GelfOutputter
      name: gelf
      gelf_server: ""
      gelf_port: "12219"
      level: INFO

3. Next, in config/application.rb:

require 'log4r'
require 'log4r/yamlconfigurator'
require 'log4r/outputter/datefileoutputter'
include Log4r

module DemoApi
  class Application < Rails::Application
    log4r_config = YAML.load_file(File.join(Rails.root, 'config', 'log4r.yml'))
    YamlConfigurator.decode_yaml(log4r_config["log4r_config"])
    config.logger = Log4r::Logger[Rails.env]
  end
end
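Under the hood, GelfOutputter ships each log record to Graylog as GELF: a JSON payload, typically zlib-compressed, sent over UDP. To sanity-check that your Graylog input is reachable before wiring up Rails, you can hand-craft one such message. This is only a sketch: the host, port and the custom `_status` field are assumptions, not part of the gem's API.

```python
import json
import socket
import zlib

# A minimal GELF message: version, host and short_message are required;
# underscore-prefixed keys are custom fields.
message = {
    "version": "1.1",
    "host": "app-server-1",
    "short_message": "GET /api/tasks returned 200",
    "level": 6,       # syslog "informational"
    "_status": 200,   # hypothetical custom field
}

# GELF over UDP accepts zlib-compressed JSON payloads.
payload = zlib.compress(json.dumps(message).encode("utf-8"))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(payload, ("127.0.0.1", 12219))  # replace with your Graylog input host/port
```

If the message shows up in the Graylog search page, the input and port are correct and the Rails side should work too.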


Configuring nginx as a reverse proxy for Graylog


Graylog2 is a powerful tool for log management and analysis. One use case we had at my company was collecting the logs of a Rails application running on 5 different servers in a single location, so as to make debugging easy. It is built on top of Elasticsearch, MongoDB and Java. First you need to set up Graylog on your server. These links are likely to help you.

Once it is set up, you will want to access the web interface, which runs on port 9000. You could use a single port for both the Graylog REST API and the web interface, or two separate ports. This is my nginx configuration:

location / {
    proxy_set_header Host $http_host;
    proxy_set_header X-Forwarded-Host $host;
    proxy_set_header X-Forwarded-Server $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Graylog-Server-URL http://$server_name/api; # adjust to your setup
    proxy_pass http://127.0.0.1:9000; # Graylog web interface (web_listen_uri)
}

location /api/ {
    proxy_set_header Host $http_host;
    proxy_set_header X-Forwarded-Host $host;
    proxy_set_header X-Forwarded-Server $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_pass http://127.0.0.1:12900/; # Graylog REST API (rest_listen_uri)
}

Graylog Configuration

web_listen_uri =
rest_listen_uri =
rest_transport_uri =

Graylog has a very good community. You may post your issues there.

Script to Monitor RabbitMQ Queue Messages

Below is a script I used to monitor RabbitMQ queues. We were using a microservices architecture in which the services communicate through a RabbitMQ broker: services publish messages to the broker, and the broker forwards them to consumers. One bottleneck is this communication. Due to some memory issues, the consumers that consume these messages hung, and the message count in a particular queue grew too high. So I decided to write a script to send an SMS and alert a Slack channel whenever the message count exceeds a threshold. Note that I have used the slack-notifier gem; you could also just curl Slack's webhook.

require 'net/http'
require 'uri'
require 'json'
require 'yaml'
require 'slack-notifier'
require 'sevak_publisher'

CONFIG = YAML.load_file(File.join(__dir__, 'rabbitmq_config.yml'))

def sent_alert_slack(message)
  notifier = Slack::Notifier.new(CONFIG['slack_settings']['notification_api'],
                                 channel: '#rabbitmq-monitoring',
                                 username: 'notifier')
  notifier.ping message
end

def monitor_rabbitmq
  rabbitmqctl_url = CONFIG['rabbitmqctl']['url']
  rabbitmqctl_user = CONFIG['rabbitmqctl']['username']
  rabbitmqctl_password = CONFIG['rabbitmqctl']['password']

  # Query the RabbitMQ management API for all queues.
  uri = URI.parse("#{rabbitmqctl_url}/api/queues")
  request = Net::HTTP::Get.new(uri)
  request.basic_auth(rabbitmqctl_user, rabbitmqctl_password)
  req_options = { use_ssl: uri.scheme == 'https' }
  response = Net::HTTP.start(uri.hostname, uri.port, req_options) do |http|
    http.request(request)
  end

  queue_details = JSON.parse(response.body)
  queue_details.each do |queue|
    output = { name: queue['name'],
               messages: {
                 total: queue['messages'],
                 ready: queue['messages_ready'],
                 unacknowledged: queue['messages_unacknowledged']
               },
               node: queue['node'],
               state: queue['state'],
               consumers: queue['consumers'] }
    if output[:messages][:ready] > 100
      sent_alert_slack("RabbitMQ QUEUE High! \n #{output[:messages][:ready]} :\n #{output}")
    end
    puts output, "\n"
  end
rescue => e
  puts "Error: #{e.message}"
end

monitor_rabbitmq

Thank you

OPW has come to an end, and looking back I feel so proud and happy. I have become more confident, and I have figured out what I want to do next. I also had fun and met a lot of amazing people. All good things come to an end. This is a small thank-you note for all the wonderful people who helped in making this internship the best opportunity I have ever had: my parents and friends, for their constant support and motivation, and some people I specifically want to thank.

Rebecca Billings: My mentor. I am so privileged to have had her as my mentor. I am really inspired by the way she organizes her time, and I have learned to organize mine. My communication skills have improved over time, and that is because of her.

Bob Silverberg: I have improved a lot pair programming with him. I learned not just scripting in Python but modularizing code to make it more readable. He is an exceptionally good PR reviewer: hardly any bug goes unnoticed. He was patient even when I took more time than required.

stephend, mbrand and the mozwebqa team: I owe you a lot. Thank you for the constant support and words of motivation. I would like to explore more.

And here comes Round 10 of Outreachy

There is less than a week left; I can't believe 3 months are over. I have learned a lot about programming. Friends, the next round of the FOSS Outreach Program has been announced. What I have to say from my experience is:


Two roads diverged in a wood, and I—
I took the one less traveled by,
And that has made all the difference.

Check out the link to know more.

Some tips for future interns:

  • If you are interested, install an IRC client like ChatZilla and say hi at #opw
  • Find a participating organization that welcomes you
  • Start as early as possible
  • Complete the patch-submission requirement, be it a small patch 🙂 . I started by modifying a README
  • Don't forget to reach out for help when stuck
  • Be enthusiastic

The year 2015

This is a list of what I am going to do after the internship:

  • I am about to join a cool startup, Red Panthers: an amazing team that mainly focuses on Ruby and loves and contributes to open source
  • Learn Ruby, become better at Python
  • Continue contributing to existing projects
  • Attend an international conference
  • Find some amazing cool open source projects

Any suggestions? Feel free to comment.

February 7: Sharing experience as an OPW intern at Mozilla Quality Assurance


February 7, 2015 was a really special day in my life: I shared my experience as an OPW intern at the Mozilla Webmaker Party organised at Rajagiri School of Engineering and Technology, Kochi. Thank you Rebecca Billings for connecting me with the Mozilla Reps in my region, and thank you Vigneshwar Dhinakaran for trusting me to take the session.

It was my first ever conference in which I handled a session, and it was next to awesome. The Maker Party gave me an opportunity to network with other Mozilla people in my region. Further in this post I shall share my experience of the event.

I reached the venue at 9.00 a.m. and met Vigneshwar Dhinakaran. Vigneshwar is a Mozilla Rep and the main organiser of the event, and he proved to be a friendly host and a smart organizer. RSETians were waiting with enthusiasm for the event. A few hours later I met the other team of Mozilla Reps who co-organised with Vigneshwar, including Nidhya, Praveen and Abid. Praveen was a GSoC intern in one of the previous rounds, and his blog is quite informative. Click here to read his blog.


The session started with transforming ideas into prototypes, followed by an introduction to web technologies and Mozilla Webmaker. After that, Praveen and I shared our experiences as interns at Mozilla and SMC (Swathanthra Malayalam Computing). We spoke about opportunities like Google Summer of Code, OPW and Rails Girls Summer of Code.

We both explained our respective projects; mine was One and Done. It was interesting to learn about his project: developing and adding support for Indic language layouts for Firefox OS.


I spoke about the upcoming OPW round and gave some tips on getting started. We encouraged them to subscribe to the mailing lists, use IRC and learn GitHub, and we helped them find learning resources like OpenHatch, Try Git, etc.

We found that students rarely get a chance to contribute to open source projects. There is a popular misconception that open source projects have a place only for highly skilled developers, so we presented open source as a platform for developing skills. I gave a brief description of how they can be part of documentation, quality assurance, testing and so on.

To conclude, open source is an important part of technical education, but students are often not aware of how they can contribute. Providing proper mentorship could help bring them to the forefront, and opportunities like OPW and GSoC can be a good dive into open source. To that end, the Mozilla Reps have decided to organise a boot camp in the coming weeks. Stay tuned!

Git pre-commit hook

When I got repeated reviews about code not passing flake8 checks, Bob mentioned using hooks. It was a new term to me then, and after reading a couple of webpages I found it very interesting. I wrote a git pre-commit hook using shell scripting. My script checks two cases:

1: Check if the current branch is the master branch.
2: Check the staged Python files for syntax and style errors.

What are git hooks?
Git can trigger custom actions before you perform an important operation like commit or rebase. A pre-commit hook is triggered before a commit is recorded. Git hooks are stored in the hooks directory of .git.

Step 1

cd .git/hooks
vi pre-commit

Step 2: Copy the code below into the file.

#!/bin/sh
# Case 1: refuse direct commits on master.
branch=$(git rev-parse --abbrev-ref HEAD)
if [ "$branch" = "master" ]; then
    echo "You are about to commit on master; use a feature branch instead."
    exit 1
fi

# Case 2: run flake8 on every staged Python file.
for file in $(git diff --cached --name-only --diff-filter=ACM | grep -e '\.py$'); do
    flake8 "$file" --ignore=E501 || exit 1
done

git diff --cached finds the difference between the latest commit and the files staged for commit, and --name-only lists only the names of the files that changed.

git diff --cached --name-only --diff-filter=ACM | grep -e '\.py$'

This lists all the staged Python files (added, copied or modified).

flake8 $file --ignore=E501

This runs flake8 on each file, ignoring E501 (line too long).

Step 3: Make the file executable:

chmod +x pre-commit

And that's it. Now every time you commit your code, git will run flake8 for you. Also, you need not fear accidentally committing to your master branch.

Data Migration in Django

Changing the database schema is one side of the equation, but often a migration involves changing data as well.

Consider this case

Bug 1096431: Able to create tasks with duplicate names. Task names should be unique.

class Task(models.Model):
    name = models.CharField(max_length=255, verbose_name='title')
    start_date = models.DateTimeField(blank=True, null=True)

A schema migration, of course: add unique=True. But if you apply this migration to production it will cause an IntegrityError, because you are making a database column unique while its contents are not unique. To solve this, you first need to find all tasks with duplicate names and rename them to something unique. That is a data migration, as opposed to a schema migration. Thanks to Giorgos Logiotatidis for the guidance.

Step 1: Create a few tasks with the same name, say 'task'.

Step 2: This is the data migration part:

python manage.py datamigration tasks

Step 3: When you open up the newly created file you can see the skeleton of the forwards and backwards functions. I wrote code in forwards to rename the duplicate task names.

Step 4: Now add the unique keyword to the field and apply a schema migration.

Step 5: Finally, migrate the tasks app. Now you can see the duplicate tasks get renamed as task 2, task 3, etc.
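The renaming in Step 3 boils down to suffixing each repeated name with its occurrence count. Here is a minimal sketch of that logic; the function name and the list-based demo are mine, and in the actual South migration the same loop runs over orm.Task.objects inside forwards (it also ignores the corner case where a name like 'task 2' already exists):

```python
def dedupe(names):
    """Rename duplicates in order: the second 'task' becomes 'task 2',
    the third becomes 'task 3', and so on. First occurrences are untouched."""
    seen = {}
    renamed = []
    for name in names:
        seen[name] = seen.get(name, 0) + 1
        renamed.append(name if seen[name] == 1 else '%s %d' % (name, seen[name]))
    return renamed

print(dedupe(['task', 'task', 'task', 'report']))
# ['task', 'task 2', 'task 3', 'report']
```

Inside forwards(self, orm) you would save each renamed task back with task.save(), after which the unique=True schema migration applies cleanly.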

Gnome Internship Status Update: Week 2 and Week 3

In the past weeks I have been picking issues from Bugzilla and fixing them. I have been lucky to have the support of amazing people like Giorgos and Anastasios, who have been kind enough to review my PRs.


Four bugs have now been marked resolved in Bugzilla. I have received feedback on the other bugs, which I expect to resolve soon.

Challenges and Feedback

I have been receiving repeated feedback about indentation issues in the HTML code, which need to be fixed soon. I shall fix them and explain in the next post 🙂

I also need to mark tasks as taken, as there is a chance of multiple users picking up the same task.

What's next?

I found an awesome task management tool, Google Keep. It comes as a Chrome extension and an Android application, and I have decided to use it.

I have decided to work some extra days to make up for the two days I lost.

I am sharing an extremely useful resource on the review process. Thanks to Bob Silverberg for sharing it.

The initial two weeks have come to an end and I have learned a lot, including some advanced git commands. I know there is more to learn.