Send logs to RabbitMQ from Python/Django.

Overview

python-logging-rabbitmq

Logging handler that ships logs to RabbitMQ. Compatible with Django.

Installation

Install using pip.

pip install python_logging_rabbitmq

Versions

Version     Dependency
>= 2.x      pika == 0.13
<= 1.1.1    pika <= 0.10
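
If your project needs to stay on the older pika line, you can pin both packages when installing. The exact specifiers below are only an illustration of the table above:

pip install "python_logging_rabbitmq<=1.1.1" "pika<=0.10"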

Handlers

This package has two built-in handlers that you can import as follows:

from python_logging_rabbitmq import RabbitMQHandler

or (thanks to @wallezhang)

from python_logging_rabbitmq import RabbitMQHandlerOneWay
  • RabbitMQHandler: Basic handler for sending logs to RabbitMQ. Every record is delivered directly to RabbitMQ using the configured exchange.
  • RabbitMQHandlerOneWay: High-throughput handler. It keeps an internal queue where logs are stored temporarily, and a background thread delivers them to RabbitMQ using the configured exchange, so your app doesn't need to wait until the log is delivered. Note that if the main thread dies you might lose logs (see the sketch below).
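
Using the one-way handler only changes the class you instantiate. A minimal sketch (host and port are placeholders; since delivery happens on a background thread, records emitted right before the process exits may not be flushed):

import logging
from python_logging_rabbitmq import RabbitMQHandlerOneWay

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)

# Records are queued internally and published by the handler's worker thread.
logger.addHandler(RabbitMQHandlerOneWay(host='localhost', port=5672))
logger.debug('test debug')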

Standalone python

To use with Python, first create a logger for your app, then create an instance of the handler and add it to the logger:

import logging
from python_logging_rabbitmq import RabbitMQHandler

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)

rabbit = RabbitMQHandler(host='localhost', port=5672)
logger.addHandler(rabbit)

logger.debug('test debug')

As a result, a message similar to the following will be sent to RabbitMQ:

{
	"relativeCreated":280.61580657958984,
	"process":13105,
	"args":[],
	"module":"test",
	"funcName":"<module>",
	"host":"albertomr86-laptop",
	"exc_text":null,
	"name":"myapp",
	"thread":140032818181888,
	"created":1482290387.454017,
	"threadName":"MainThread",
	"msecs":454.01692390441895,
	"filename":"test.py",
	"levelno":10,
	"processName":"MainProcess",
	"pathname":"test.py",
	"lineno":11,
	"msg":"test debug",
	"exc_info":null,
	"levelname":"DEBUG"
}

Sending logs

By default, logs will be sent to RabbitMQ using the exchange 'log', which should be of type topic. The routing key is formed by concatenating the logger name and the log level. For example:

import logging
from python_logging_rabbitmq import RabbitMQHandler

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)
logger.addHandler(RabbitMQHandler(host='localhost', port=5672))

logger.info('test info')
logger.debug('test debug')
logger.warning('test warning')

The messages will be sent using the following routing keys:

  • myapp.INFO
  • myapp.DEBUG
  • myapp.WARNING

For an explanation of topics and routing keys, see https://www.rabbitmq.com/tutorials/tutorial-five-python.html
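
On the consuming side, any AMQP client can bind a queue to the 'log' topic exchange with a matching pattern. The following is a minimal sketch using pika directly (it assumes a recent pika release on the consumer side; the binding pattern 'myapp.*' and the server-named queue are illustrative):

import pika

# Connect and make sure the topic exchange exists (skip the declare if the
# handler or your infrastructure already declared it).
connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost', port=5672))
channel = connection.channel()
channel.exchange_declare(exchange='log', exchange_type='topic')

# Bind an exclusive, server-named queue to every level of the 'myapp' logger.
result = channel.queue_declare(queue='', exclusive=True)
queue_name = result.method.queue
channel.queue_bind(exchange='log', queue=queue_name, routing_key='myapp.*')

def on_message(channel, method, properties, body):
	print(method.routing_key, body)

channel.basic_consume(queue=queue_name, on_message_callback=on_message, auto_ack=True)
channel.start_consuming()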

When creating the handler, you can specify different parameters to connect to RabbitMQ or to configure the handler's behavior.

Overriding routing-key creation

If you wish to override the routing-key format entirely, you can pass a routing_key_formatter function which takes the LogRecord and returns the routing key. For example:

RabbitMQHandler(
	host='localhost',
	port=5672,
	routing_key_formatter=lambda r: (
		'some_exchange_prefix.{}'.format(r.levelname.lower())
	)
)

Configuration

These are the allowed configuration parameters:

Parameter Description Default
host RabbitMQ server hostname or IP address. localhost
port RabbitMQ server port. 5672
username Username for authentication. None
password Password for the given username. None
exchange Name of the exchange used to publish the logs. This exchange is considered to be of type topic. log
declare_exchange Whether or not to declare the exchange. False
routing_key_format Customize how messages are routed to the queues. {name}.{level}
routing_key_formatter Customize how the routing key is constructed. None
connection_params Extra parameters used to connect to RabbitMQ. None
formatter Custom formatter for the logs. python_logging_rabbitmq.JSONFormatter
close_after_emit Close the active connection after sending a log. A new connection is opened for the next log. False
fields Dict of fields to add to each log sent to RabbitMQ. This is useful when you want the same fields in every log without passing them each time. None
fields_under_root When True, each key in the 'fields' parameter is added as an entry in the log; otherwise they are logged under the key 'fields'. True
message_headers A dictionary of headers to be published with the message. None
record_fields A set of attributes to preserve from the record object. None
exclude_record_fields A set of attributes to ignore from the record object. None
heartbeat Lower bound for the heartbeat timeout. 60

Examples

RabbitMQ Connection

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	username='guest',
	password='guest',
	connection_params={
		'virtual_host': '/',
		'connection_attempts': 3,
		'socket_timeout': 5000
	}
)

Custom fields

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	fields={
		'source': 'MyApp',
		'env': 'production'
	},
	fields_under_root=True
)
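
Routing key format and message headers

A sketch combining the routing_key_format, message_headers and exclude_record_fields parameters from the configuration table; the 'logs.' prefix, the header values and the excluded fields are illustrative:

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	routing_key_format='logs.{name}.{level}',
	message_headers={
		'x-app': 'MyApp'
	},
	exclude_record_fields={'relativeCreated', 'msecs', 'args'}
)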

Custom formatter

By default, python_logging_rabbitmq uses a custom JSONFormatter, but if you prefer to format the message yourself you can do it as follows:

import logging
from python_logging_rabbitmq import RabbitMQHandler

FORMAT = '%(asctime)-15s %(message)s'
formatter = logging.Formatter(fmt=FORMAT)
rabbit = RabbitMQHandler(formatter=formatter)

For a custom JSON Formatter take a look at https://github.com/madzak/python-json-logger
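
For instance, with python-json-logger installed you could plug its formatter into the handler in standalone Python as well (a sketch; the field list in the format string is illustrative):

import logging
from pythonjsonlogger import jsonlogger
from python_logging_rabbitmq import RabbitMQHandler

formatter = jsonlogger.JsonFormatter('%(name)s %(levelname)s %(asctime)s %(message)s')

logger = logging.getLogger('myapp')
logger.setLevel(logging.INFO)
logger.addHandler(RabbitMQHandler(host='localhost', port=5672, formatter=formatter))
logger.info('shipped as JSON')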

Django

To use with Django, add the handler to the logging config:

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost'
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}
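
With this config in place, any code in your project can obtain the 'myapp' logger through the standard logging API and the records will be shipped by the handler. A minimal, illustrative view:

import logging

from django.http import HttpResponse

logger = logging.getLogger('myapp')

def my_view(request):
	# Delivered to RabbitMQ by the 'rabbit' handler configured in LOGGING.
	logger.info('Handling %s', request.path)
	return HttpResponse('ok')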

Configuration

As with standalone Python, you can configure the handler directly when declaring it in the config:

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost',
			'port': 5672,
			'username': 'guest',
			'password': 'guest',
			'exchange': 'log',
			'declare_exchange': False,
			'connection_params': {
				'virtual_host': '/',
				'connection_attempts': 3,
				'socket_timeout': 5000
			},
			'fields': {
				'source': 'MainAPI',
				'env': 'production'
			},
			'fields_under_root': True
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}

Custom formatter

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'formatters': {
		'standard': {
			'format': '%(levelname)-8s [%(asctime)s]: %(message)s'
		}
	},
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost',
			'formatter': 'standard'
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}

JSON formatter

pip install python-json-logger

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'formatters': {
		'json': {
			'()': 'pythonjsonlogger.jsonlogger.JsonFormatter',
			'fmt': '%(name)s %(levelname)s %(asctime)s %(message)s'
		}
	},
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost',
			'formatter': 'json'
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}

Releases

Date Version Notes
Mar 10, 2019 1.1.1 Removed direct dependency with Django. Integration with Travis CI. Configuration for tests. Using pipenv.
May 04, 2018 1.0.9 Fixed exchange_type parameter in channel.exchange_declare (Thanks to @cklos).
Mar 21, 2018 1.0.8 Allowing message headers (Thanks to @merretbuurman).
May 15, 2017 1.0.7 Adding support to customize the routing_key (Thanks to @hansyulian).
Mar 30, 2017 1.0.6 Fix compatibility with python3 in RabbitMQHandlerOneWay (by @sactre).
Mar 28, 2017 1.0.5 Explicit local imports.
Mar 16, 2017 1.0.4 Added new handler RabbitMQHandlerOneWay (by @wallezhang).
Mar 14, 2017 1.0.3 Added config parameter close_after_emit.
Dec 21, 2016 1.0.2 Minor fixes.
Dec 21, 2016 1.0.1 Minor fixes.
Dec 21, 2016 1.0.0 Initial release.

What's next?

  • Let's talk about tests.
  • Issues, pull requests, suggestions are welcome.
  • Fork and improve it. Free for all.

Similar efforts

Comments
  • TypeError: unexpected kwargs: {'heartbeat_interval': 0}

    I always get this error.

    The error was from line 101 in handler.py, but I think it is because of line 62:
    self.connection_params.update(dict(host=host, port=port, heartbeat_interval=0))

    Just go through this Pika Documentation

    connection_params does not have heartbeat_interval

    bug 
    opened by raj-kiran-p 7
  • fix: handle thread shutdown

    Introduce two events (stopping, stopped) to interlock with the worker thread and cause a graceful shutdown.

    Add a timeout to the Queue get of 10s, this means that a graceful shutdown will not be instantaneous.

    Switch to del on the Pika blocking channels.

    opened by donbowman 4
  • RabbitMQ server closes the connection because not receiving heartbeat

    Hi Albert, Similar to issue https://github.com/pika/pika/issues/1104. After digging into Pika and RabbitMQ, I find with BlockedConnection, pika will not automatically send out the heartbeat. The heartbeat event will only be handled/sent in "start_consuming" and "process_data_events". For consumer, we will use "start_consuming", there will not be such issue. But for producer, normally we won't call "process_data_events" specifically, it will only be called when we call "basic_publish". Let's say we set "heartbeat" to 20s, if we don't log any message within 3x10s, the server would close the connection. (Different version of RabbitMQ might have different behaviors, some might take 3x20s) I didn't see anyone report this issue or talk this on the internet, so I'm not sure if my understanding is correct. Look forward to your response. Thanks in advance.

    bug wip 
    opened by yuanli-cn 4
  • Standalone not working

    Hello everybody,

    I'm trying to implement your lib in my python app. We're not using Django and we have this error raised :

    Traceback (most recent call last):
      File "/home/vgaugry/darwin/sms_v2_tools/sms_v2_tools/custom_logger/test.py", line 1, in <module>
        import DarwinLogger
      File "/home/vgaugry/darwin/sms_v2_tools/sms_v2_tools/custom_logger/DarwinLogger.py", line 4, in <module>
        from python_logging_rabbitmq import RabbitMQHandlerOneWay
      File "/home/vgaugry/.virtualenvs/sms-v2_env/local/lib/python2.7/site-packages/python_logging_rabbitmq/__init__.py", line 2, in <module>
        from .formatters import JSONFormatter  # noqa: F401
      File "/home/vgaugry/.virtualenvs/sms-v2_env/local/lib/python2.7/site-packages/python_logging_rabbitmq/formatters.py", line 5, in <module>
        from django.core.serializers.json import DjangoJSONEncoder
    ImportError: No module named django.core.serializers.json

    I simply followed the "standalone" part of the readme. Is this normal ? Or Am I doing something wrong ?

    Thx !

    bug 
    opened by Travincebarker 2
  • wait for logs to be sent in RabbitMQHandlerOneWay before exiting python ?

    Hi,

    Thank you for your great package.

    Is there any way to wait for logs to be sent in RabbitMQHandlerOneWay before exiting python ? Naive method could be to wait a few seconds (time.sleep(2)) but there is maybe a better method.

    Thanks a lot.

    enhancement planning 
    opened by BenjaminSchmitt 2
  • Unconfigurable Routing Key Format

    I need to be able to change the routing key format in my system, so I would prefer that this file, python_logging_rabbitmq/handlers.py:

    line 115:

                routing_key ="{name}.{level}".format(name=record.name, level=record.levelname)
    

    to be changed to:

    line 14:

                ROUTING_KEY_FORMAT = "{name}.{level}"
    

    line 115:

                routing_key = self.ROUTING_KEY_FORMAT.format(name=record.name, level=record.levelname)
    

    so it will be configurable. Thank you.

    enhancement 
    opened by hansyulian 2
  • ImportError: No module named 'compat'

    When I use the library I see an Exception:

    File "/usr/local/lib/python3.4/dist-packages/python_logging_rabbitmq/init.py", line 2, in from .formatters import JSONFormatter # noqa: F401 File "/usr/local/lib/python3.4/dist-packages/python_logging_rabbitmq/formatters.py", line 4, in from compat import json ImportError: No module named 'compat'

    Is there something wrong in __init__?

    Regards and thank you for your library.

    bug 
    opened by sactre 2
  • Add content_type in pika.BasicProperties parameters

    https://github.com/albertomr86/python-logging-rabbitmq/blob/5d3ce4cc0b86b7303a2097d6acb46972d334e213/python_logging_rabbitmq/handlers.py#L164 The safest way to work is to add content_type = 'STRING' but could be as parameter key in class method.

    wip 
    opened by TopperBG 1
  • Fix in publish(): the body is already formatted.

    In emit(), the record is formatted and then queued. The worker gets the record to be published from the queue. In publish(), that record was formatted again (a second time).

    Try a simple app like this:

    import time
    import logging
    from python_logging_rabbitmq import RabbitMQHandlerOneWay

    logger = logging.getLogger('myapp')
    logger.setLevel(logging.DEBUG)

    rabbit = RabbitMQHandlerOneWay(host='localhost', port=5672)
    logger.addHandler(rabbit)

    logger.debug('test debug')
    time.sleep(3)

    Error:

    File "python-logging-rabbitmq/python_logging_rabbitmq/formatters.py", line 22, in format
        data = record.__dict__.copy()
    AttributeError: 'str' object has no attribute '__dict__'

    opened by ghost 1
  • Returning batch of changes to upstream

    Hi, I'm pleased to say that we've been using your library in our project and it turned out very helpful. We've made some changes to fit our needs and thought to return them to upstream, you may find them useful. In summary, we've:

    • Updated .gitignore to include a broader range of Python/Vim-related files
    • Made some stylistic tweaks; sorted imports, PEP8-ified some comments
    • Added the routing_key_formatter option, which allows passing a lambda that overrides routing-key creation
    • Added support for serialization of Django's requests (this means that Rabbit handlers can handle errors logged to django.requests)
    • Added the record_fields and exclude_record_fields options, which allow including/excluding specified LogRecord attributes (sometimes fields such as levelno are just not helpful)
    • Imported DjangoJSONEncoder in the JSON formatter in order to handle a broader range of objects (such as Decimal)
    • Updated README
    opened by IwoHerka 1
  • call of channel.exchange_declare modified

    According to the Pika source at: https://github.com/pika/pika/blob/master/pika/channel.py#L658 the channel.exchange_declare method has no argument 'type', the corresponding argument is 'exchange_type'.

    opened by cklos 1
  • fix: only mark task done when a task was dequeued

    task_done will fail if we mark a task as having finished when no task was dequeued. Since this can only happen after a task was retrieved from the queue, move the finally into an inner try so that we know task_done will work.

    Fixes #29 for the most part -- it does not address the leak regarding messages still in the queue when is_stopping is set.

    opened by klarose 0
  • Call queue.task_done() only after a successful get()

    queue.task_done() should be called only when an item was actually returned by get(). If get() raises an Empty exception, task_done() should not be called.

    Also, close the Pika connection only if it was actually opened.

    wip 
    opened by kmorwath 1
  •  self.queue.task_done() can be called when no message was get due to continue executing finally block anyway leading to ValueError exception

    The changes in version 2.2 for fix #25 in python_logging_rabbitmq/handlers_oneway.py may have introduced an issue. Before, the Queue.Empty exception was never raised because record, routing_key = self.queue.get() had no timeout. Now, when the exception is raised because no message arrives within 10s, the exception handler calls "continue", but the "finally" block is still executed anyway - so queue.task_done() could be called more times than put(), which will lead to a ValueError exception.

    queue.task_done() should be called in a inner "try..finally" block after a message has been dequeued actually, for example:

    record, routing_key = self.queue.get(block=True, timeout=10)
    try:
        ...  # actually got a message, try to send it
    finally:
        queue.task_done()

    Moreover when is_stopping is set the loop is exited before queue.task_done() is called, and messages still in the queue are not processed. If on the other side of the queue something attempts to call queue.join() it could never return.

    opened by kmorwath 0
  • `ujson` does not support `.dumps(cls=SomeEncoder)` `cls` parameter

    As per https://github.com/esnme/ultrajson/issues/124

    If you have a package that requires ujson, it is automatically picked up by the compat.py and used in JSONFormatter thereafter. Unfortunately, ujson is not fully compatible with the built-in json.dump and it does not understand the cls parameter.

    opened by EivV 1
  • SSL configuration isn't working automatically

    As a workaround I initialize the following:

    SSLOptions(ssl.SSLContext(protocol=ssl.PROTOCOL_TLSv1_2))

    and pass it as connection_params under ssl_options

    Without a workaround I get a connection reset error.

    bug wip 
    opened by Ghost93 2