A multiprocessing distributed task queue for Django


Features

  • Multiprocessing worker pool
  • Asynchronous tasks
  • Scheduled, cron and repeated tasks
  • Signed and compressed packages
  • Failure and success database or cache
  • Result hooks, groups and chains
  • Django Admin integration
  • PaaS compatible with multiple instances
  • Multi cluster monitor
  • Redis, Disque, IronMQ, SQS, MongoDB or ORM
  • Rollbar and Sentry support

Requirements

Tested with: Python 3.7, 3.8 and 3.9; Django 2.2.X and 3.2.X.

Warning

Since Python 3.7, async is a reserved keyword, so the old async function has been renamed to async_task.
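
If you are upgrading from a pre-1.0 release, rename every call to the old async function; a minimal sketch of the change:

# old API (django-q < 1.0) -- no longer valid, async is now a keyword:
#   from django_q.tasks import async
#   async('math.copysign', 2, -2)

# current API:
from django_q.tasks import async_task

async_task('math.copysign', 2, -2)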

Brokers

Supported brokers: Redis, Disque, IronMQ, Amazon SQS, MongoDB or the Django ORM.

Installation

  • Install the latest version with pip:

    $ pip install django-q
    
  • Add django_q to your INSTALLED_APPS in your project's settings.py:

    INSTALLED_APPS = (
        # other apps
        'django_q',
    )
    
  • Run Django migrations to create the database tables:

    $ python manage.py migrate
    
  • Choose a message broker, configure it and install the appropriate client library (a configuration sketch follows below).
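
If you would rather not run a separate broker service, the Django ORM itself can serve as the broker. A minimal sketch (values are illustrative; the ORM broker needs no extra client library):

# settings.py -- sketch of an ORM broker configuration
Q_CLUSTER = {
    'name': 'myproject',
    'workers': 4,
    'timeout': 60,
    'retry': 120,        # should be longer than the timeout
    'orm': 'default',    # use the 'default' database connection as the queue
}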

Read the full documentation at https://django-q.readthedocs.org

Configuration

All configuration settings are optional, e.g.:

# settings.py example
Q_CLUSTER = {
    'name': 'myproject',
    'workers': 8,
    'recycle': 500,
    'timeout': 60,
    'compress': True,
    'cpu_affinity': 1,
    'save_limit': 250,
    'queue_limit': 500,
    'label': 'Django Q',
    'redis': {
        'host': '127.0.0.1',
        'port': 6379,
        'db': 0, }
}

For full configuration options, see the configuration documentation.

Management Commands

Start a cluster with:

$ python manage.py qcluster

Monitor your clusters with:

$ python manage.py qmonitor

Check overall statistics with:

$ python manage.py qinfo

Creating Tasks

Use async_task from your code to quickly offload tasks:

from django_q.tasks import async_task, result

# create the task
async_task('math.copysign', 2, -2)

# or with a reference
from math import copysign

task_id = async_task(copysign, 2, -2)

# get the result
task_result = result(task_id)

# result returns None if the task has not been executed yet
# you can wait for it (here up to 200 milliseconds)
task_result = result(task_id, 200)

# but in most cases you will want to use a hook:

async_task('math.modf', 2.5, hook='hooks.print_result')

# hooks.py
def print_result(task):
    print(task.result)
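
Tasks can also be grouped, so related results can be collected together; a minimal sketch (the group name is arbitrary):

from django_q.tasks import async_task, result_group

# queue a few related tasks under one group name
for i in range(4):
    async_task('math.floor', i * 1.5, group='floors')

# later: fetch the results of the whole group
floors = result_group('floors', count=4)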

For more info see Tasks

Schedule

Schedules are regular Django models. You can manage them through the Admin page or directly from your code:

# Use the schedule function
from django_q.tasks import schedule

schedule('math.copysign',
         2, -2,
         hook='hooks.print_result',
         schedule_type=Schedule.DAILY)

# Or create the object directly
from django_q.models import Schedule

Schedule.objects.create(func='math.copysign',
                        hook='hooks.print_result',
                        args='2,-2',
                        schedule_type=Schedule.DAILY
                        )

# Run a task every 5 minutes, starting at 18:00 UTC today,
# for 2 hours (24 repeats)
import arrow

schedule('math.hypot',
         3, 4,
         schedule_type=Schedule.MINUTES,
         minutes=5,
         repeats=24,
         next_run=arrow.utcnow().replace(hour=18, minute=0))

# Use a cron expression
schedule('math.hypot',
         3, 4,
         schedule_type=Schedule.CRON,
         cron = '0 22 * * 1-5')
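
Because schedules are plain Django models, you can also inspect or clean them up with the regular ORM; a minimal sketch:

from django_q.models import Schedule

# list what is currently scheduled
for s in Schedule.objects.all():
    print(s.func, s.schedule_type, s.next_run)

# remove schedules you no longer need
Schedule.objects.filter(func='math.hypot').delete()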

For more info check the Schedules documentation.

Testing

To run the tests you will need the required services running, in addition to the install requirements. You can use the included Docker Compose file to start them.

The following commands can be used to run the tests:

# Create virtual environment
python -m venv venv

# Install requirements
venv/bin/pip install -r requirements.txt

# Install test dependencies
venv/bin/pip install pytest pytest-django

# Install django-q
venv/bin/python setup.py develop

# Run required services (you need to have docker-compose installed)
docker-compose -f test-services-docker-compose.yaml up -d

# Run tests
venv/bin/pytest

# Stop the services required by tests (when you no longer plan to run tests)
docker-compose -f test-services-docker-compose.yaml down

Locale

Currently available in English, German and French. Translation pull requests are always welcome.

Todo

  • Better tests and coverage
  • Fewer dependencies?

Acknowledgements

Comments
  • TypeError: can't pickle _thread.lock objects

    Django 2.2.11 python 3.7.0 django-q 1.2.1 windows 10

    Hello, when I run manage.py qcluster I get this error. Does somebody know what the source of it could be and how to resolve it?

      File "manage.py", line 21, in <module>
        main()
      File "manage.py", line 17, in main
        execute_from_command_line(sys.argv)
      File "C:\Users\Mateusz\Desktop\project\env\lib\site-packages\django\core\management\__init__.py", line 381, in execute_from_command_line
        utility.execute()
      File "C:\Users\Mateusz\Desktop\project\env\lib\site-packages\django\core\management\__init__.py", line 375, in execute
        self.fetch_command(subcommand).run_from_argv(self.argv)
      File "C:\Users\Mateusz\Desktop\project\env\lib\site-packages\django\core\management\base.py", line 323, in run_from_argv
        self.execute(*args, **cmd_options)
      File "C:\Users\Mateusz\Desktop\project\env\lib\site-packages\django\core\management\base.py", line 364, in execute
        output = self.handle(*args, **options)
      File "C:\Users\Mateusz\Desktop\project\env\lib\site-packages\django_q\management\commands\qcluster.py", line 22, in handle
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "C:\Users\Mateusz\AppData\Local\Programs\Python\Python37\lib\multiprocessing\spawn.py", line 105, in spawn_main
        exitcode = _main(fd)
      File "C:\Users\Mateusz\AppData\Local\Programs\Python\Python37\lib\multiprocessing\spawn.py", line 115, in _main
        self = reduction.pickle.load(from_parent)
    EOFError: Ran out of input
        q.start()
      File "C:\Users\Mateusz\Desktop\project\env\lib\site-packages\django_q\cluster.py", line 65, in start
        self.sentinel.start()
      File "C:\Users\Mateusz\AppData\Local\Programs\Python\Python37\lib\multiprocessing\process.py", line 112, in start
        self._popen = self._Popen(self)
      File "C:\Users\Mateusz\AppData\Local\Programs\Python\Python37\lib\multiprocessing\context.py", line 223, in _Popen
        return _default_context.get_context().Process._Popen(process_obj)
      File "C:\Users\Mateusz\AppData\Local\Programs\Python\Python37\lib\multiprocessing\context.py", line 322, in _Popen
        return Popen(process_obj)
      File "C:\Users\Mateusz\AppData\Local\Programs\Python\Python37\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
        reduction.dump(process_obj, to_child)
      File "C:\Users\Mateusz\AppData\Local\Programs\Python\Python37\lib\multiprocessing\reduction.py", line 60, in dump
        ForkingPickler(file, protocol).dump(obj)
    TypeError: can't pickle _thread.lock objects
    
    help wanted 
    opened by OnufryKlaczynski 48
  • [Error] select_for_update cannot be used outside of a transaction.

    This Django error (raised from the SQL compiler) pops up in my logs and prevents any scheduled tasks from running. The error is raised from this django_q line, which is very strange since the whole try block is within the transaction.atomic() context manager.

    Any idea on why this is happening and how to fix it? Thanks!

    Config:

    • db: Postgres 11 with psycopg2 interface
    • django-q 1.2.1
    • django 3.0
    • python 3.8

    Edit

    The error is reproduced in this basic demo app

    opened by edthrn 20
  • Import Error running qcluster command Python 3.7 Django 2.1.5

    Traceback (most recent call last):
      File "manage.py", line 22, in <module>
        execute_from_command_line(sys.argv)
      File "ENV/lib/python3.7/site-packages/django/core/management/__init__.py", line 381, in execute_from_command_line
        utility.execute()
      File "ENV/lib/python3.7/site-packages/django/core/management/__init__.py", line 375, in execute
        self.fetch_command(subcommand).run_from_argv(self.argv)
      File "ENV/lib/python3.7/site-packages/django/core/management/__init__.py", line 224, in fetch_command
        klass = load_command_class(app_name, subcommand)
      File "ENV/lib/python3.7/site-packages/django/core/management/__init__.py", line 36, in load_command_class
        module = import_module('%s.management.commands.%s' % (app_name, name))
      File "ENV/versions/3.7.0/lib/python3.7/importlib/__init__.py", line 127, in import_module
        return _bootstrap._gcd_import(name[level:], package, level)
      File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
      File "<frozen importlib._bootstrap>", line 983, in _find_and_load
      File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
      File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
      File "<frozen importlib._bootstrap_external>", line 728, in exec_module
      File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
      File "ENV/lib/python3.7/site-packages/django_q/management/commands/qcluster.py", line 4, in <module>
        from django_q.cluster import Cluster
      File "ENV/lib/python3.7/site-packages/django_q/cluster.py", line 24, in <module>
        from django_q import tasks
      File "ENV/lib/python3.7/site-packages/django_q/tasks.py", line 12, in <module>
        from django_q.cluster import worker, monitor
    ImportError: cannot import name 'worker' from 'django_q.cluster' (ENV/lib/python3.7/site-packages/django_q/cluster.py)
    Sentry is attempting to send 1 pending error messages
    Waiting up to 10 seconds
    Press Ctrl-C to quit
    
    opened by viperfx 17
  • Task stays in queue when executing a requests.post

    When I try to execute a requests.post, the task stays in the queue and the requests.post call never completes.

    Pseudo code:

    class Auth:
        def __init__(self, url, username, password):
            logging.debug('Making call for credentials.')
            r = requests.post(url, data={'username': username, 'password': password})
    
    def queue_get_auth(username, password):
        a = Auth('https://auth', 'username', 'password')
    
    
    def validate_login(username, password):
        job = async(queue_get_auth, username, password)
    

    The error I am seeing, which repeats until I delete the task from the queued tasks in the database:

    [DEBUG] | 2015-10-19 14:46:05,510 | auth:  Making call for credentials.
    14:46:05 [Q] ERROR reincarnated worker Process-1:3 after death
    14:46:05 [Q] INFO Process-1:9 ready for work at 13576
    14:46:19 [Q] INFO Process-1:4 processing [mango-two-juliet-edward]
    

    Is there a reason why the requests.post would cause it to fail? How would I debug this? If I run it with sync: True it works fine. This is running on Mac OS X 10.11 with an SQLite database.

    Q_CLUSTER = {
        'name': 'auth',
        'workers': 4,
        'recycle': 500,
        'timeout': 60,
        'compress': False,
        'save_limit': 250,
        'queue_limit': 500,
        'sync': False,
        'cpu_affinity': 1,
        'label': 'Django Q',
        'orm': 'default'
    }
    
    opened by Xorso 16
  • scheduler creating duplicate tasks in multiple cluster environment

    We have a service that uses django-q for asynchronous tasks and it's deployed as 2 instances (2 AWS EC2 servers each running the same django project and each running a django-q cluster to process tasks). We've encountered an issue where the same scheduled task -- scheduled to run once -- gets picked up by each of the clusters in the scheduler (django-q.cluster) and ends up having 2 separate tasks being created.

    Example entries in our logs:

    On server 1:

    2017-04-02 20:25:56,747 - django-q - INFO - Process-1 created a task from schedule [14789]
    2017-04-02 20:25:56,842 - django-q - DEBUG - Pushed ('hamper-india-magnesium-pip', 'f1a1141c1835400ebc4f4b3894922b82')
    

    On server 2:

    2017-04-02 20:25:56,853 - django-q - INFO - Process-1 created a task from schedule [14789]
    2017-04-02 20:25:56,990 - django-q - DEBUG - Pushed ('alpha-william-kansas-apart', '5a4fcadb47674590933415dd5a71e1cc')
    

    Is this the expected behavior or is it a bug?

    What we are looking for is to have a scheduled task create only one async task to execute the action, even in a multi-cluster setup like ours.

    Can you comment on this behavior?

    We're using:

    Django (1.10.6)
    django-q (0.7.18)
    

    Thanks

    -Kevin

    opened by kefin 14
  • Worker recycle causes: "ERROR connection already closed"

    Hello, and thanks for this great Django app! I'm using Python 3.4, Django 1.9.2, the latest django-q and the ORM broker backed by PostgreSQL.

    It appears that when my worker recycles, it loses the ability to talk to the database. Everything works fine up until that point. I've verified it is directly related to the recycle configuration parameter, and changing this changes the placement of the error accordingly.

    The output below shows the issue I'm having while running 1 single worker.

    14:43:16 [Q] INFO Processed [zulu-montana-triple-michigan]
    14:43:16 [Q] INFO recycled worker Process-1:1
    14:43:16 [Q] INFO Process-1:4 ready for work at 22360
    14:43:16 [Q] INFO Process-1:4 processing [maryland-king-queen-table]
    14:43:17 [Q] INFO Process-1:4 processing [summer-mirror-mountain-september]
    14:43:17 [Q] INFO Processed [maryland-king-queen-table]
    14:43:17 [Q] INFO Process-1:4 processing [illinois-finch-orange-sodium]
    14:43:17 [Q] ERROR server closed the connection unexpectedly
    This probably means the server terminated abnormally before or while processing the request.

    Can you provide any guidance?

    Thanks

    opened by grayb 14
  • Limit number of retries

    When a task fails (e.g. with an unhandled exception) it is put in the failed jobs list but also retried. There doesn't seem to be any limit to the number of retries, which means a job that is doomed to failure (due to my bad programming) continues to fail all day. I get a message sent to Rollbar each time, and that adds up to a lot of messages and eats my quota.

    Is there a way to set a retry limit?
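
    Newer django-q releases add a setting for this (the v1.3.3 notes below mention an attempt count for failed tasks). A hedged sketch, assuming the max_attempts option from those releases:

    # settings.py -- sketch; max_attempts limits re-attempts of failing tasks
    Q_CLUSTER = {
        'name': 'myproject',
        'orm': 'default',
        'timeout': 60,
        'retry': 120,
        'max_attempts': 3,  # give up on a task after 3 tries
    }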

    opened by tremby 13
  • SSL errors after upgrading to qcluster version 1.1.0

    Hi,

    Is anyone else having SSL errors when using the new version (1.1.0)? I tried upgrading it in production, but started to get "django.db.utils.OperationalError: SSL error: decryption failed or bad record mac" whenever Django's ORM performs a query from within a django-q task (traceback below).

    Traceback (most recent call last):
      File "/usr/local/lib/python3.7/site-packages/django_q/cluster.py", line 379, in worker
        res = f(*task['args'], **task['kwargs'])
      File "/home/docker/src/tasks.py", line 8, in wake_up_driver_app
        for company in Company.objects.all():
      File "/usr/local/lib/python3.7/site-packages/django/db/models/query.py", line 274, in __iter__
        self._fetch_all()
      File "/usr/local/lib/python3.7/site-packages/django/db/models/query.py", line 1242, in _fetch_all
        self._result_cache = list(self._iterable_class(self))
      File "/usr/local/lib/python3.7/site-packages/django/db/models/query.py", line 55, in __iter__
        results = compiler.execute_sql(chunked_fetch=self.chunked_fetch, chunk_size=self.chunk_size)
      File "/usr/local/lib/python3.7/site-packages/django/db/models/sql/compiler.py", line 1133, in execute_sql
        cursor.execute(sql, params)
      File "/usr/local/lib/python3.7/site-packages/sentry_sdk/integrations/django/__init__.py", line 446, in execute
        return real_execute(self, sql, params)
      File "/usr/local/lib/python3.7/site-packages/django/db/backends/utils.py", line 67, in execute
        return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
      File "/usr/local/lib/python3.7/site-packages/django/db/backends/utils.py", line 76, in _execute_with_wrappers
        return executor(sql, params, many, context)
      File "/usr/local/lib/python3.7/site-packages/django/db/backends/utils.py", line 84, in _execute
        return self.cursor.execute(sql, params)
      File "/usr/local/lib/python3.7/site-packages/django/db/utils.py", line 89, in __exit__
        raise dj_exc_value.with_traceback(traceback) from exc_value
      File "/usr/local/lib/python3.7/site-packages/django/db/backends/utils.py", line 84, in _execute
        return self.cursor.execute(sql, params)
    django.db.utils.OperationalError: SSL error: decryption failed or bad record mac
    

    Configuration:

    • broker: AWS SQS
    • database: AWS RDS Postgres 11.5
    • Django version: 2.2.10
    • Qcluster version: 1.1.0

    I found an old similar issue: https://github.com/Koed00/django-q/issues/79 but it seems to have been solved.

    Does anyone have a clue on how to investigate this issue? For now I'm keeping the previous version (1.0.2) that doesn't have those issues but I need some of the fixes that are part of 1.1.0 release.

    Thanks in advance!

    opened by marcelolima 12
  • Change schedule function to update or create new

    By searching for the function we can update existing schedules.

    During development I noticed that every time I changed a parameter of my scheduled function, a new entry was inserted into the database, giving me multiple schedules for a single function.

    I haven't updated any tests; if you want, I can add a test for this 'new' feature.
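
    Since Schedule is a regular Django model, one way to avoid piling up duplicate rows during development is Django's update_or_create, keyed on the schedule name; a minimal sketch (the name is arbitrary):

    from django_q.models import Schedule

    Schedule.objects.update_or_create(
        name='nightly-copysign',  # arbitrary, stable identifier for this schedule
        defaults={
            'func': 'math.copysign',
            'args': '2,-2',
            'schedule_type': Schedule.DAILY,
        },
    )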

    opened by Eagllus 12
  • Add Sentry support

    I see Rollbar is built in. Could you add support for Sentry too?

    Alternatively, you could pull out Rollbar into its own django-q-rollbar repo (and perhaps use extras so someone can add django-q[rollbar] to their requirements.txt), and expose a generic error handling interface so others can add their own.

    Really enjoying this project btw 😄

    opened by joeyespo 11
  • Django-q calls task twice or more

    My background process is called twice (or more) but I'm really sure that should not be happening. My settings for Django Q:

    Q_CLUSTER = {
        'name': 'cc',
        'recyle': 10,
        'retry': -1,
        'workers': 2,
        'save_limit': 0,
        'orm': 'default'
    }
    

    My test task function:

    def task_test_function(email, user):
        print('test')
    

    calling it from the commandline:

    > python manage.py shell
    >>> from django_q.tasks import async
    >>> async('task_test_function', 'email', 'user')
    '9a0ba6b8bcd94dc1bc129e3d6857b5ee'
    

    Starting qcluster (after that I called the async)

    > python manage.py qcluster
    13:48:08 [Q] INFO Q Cluster-33552 starting.
    ...
    13:48:08 [Q] INFO Q Cluster-33552 running.
    13:48:34 [Q] INFO Process-1:2 processing [mobile-utah-august-indigo]
    test
    13:48:34 [Q] INFO Process-1:1 processing [mobile-utah-august-indigo]
    test
    13:48:34 [Q] INFO Processed [mobile-utah-august-indigo]
    13:48:34 [Q] INFO Processed [mobile-utah-august-indigo]
    ...
    

    And the function is called twice... For most functions I wouldn't really care if they run twice (or more) but I have a task that calls send_mail and people that are invited receive 2 or more mails...

    Is this a bug in Django Q or in my logic?

    opened by Eagllus 10
Releases (latest: v1.3.9)
  • v1.3.9(Jun 10, 2021)

  • v1.3.8(Jun 8, 2021)

  • v1.3.7(Jun 3, 2021)

    Changes

    • Build improvements and localtime fixes (#569) @Koed00
    • Create codeql-analysis.yml (#564) @Koed00
    • Fix docs error (#563) @aken830806
    • Codecov_fixes. Got coverage again (#562) @Koed00
    • Feature/improves multiple databases support (#561) @abxsantos

    This release will try to fully switch to Poetry for building releases. Please create an issue if there are any problems.

  • v1.3.6(May 14, 2021)

    Changes

    • Fix for SSL errors in #422 (#556) @nittolese
    • Fixes #314 - Convert func to its import path str so that resubmitting failed task works (#554) @kennyhei
    • Add "qmemory" command (#553) @kennyhei
    • Allow tasks to be scheduled on a specific cluster (#555) @midse
    • Fixes #225 - Successful tasks grow beyond SAVE_LIMIT (#552) @kennyhei
    • Update documentation for new retry time default (#538) @amo13
    • Fixes deprecated count method (#549) @Koed00
    • Updates testing to python 3.9 and Django 3.2 (#548) @Koed00
    • Adds long polling support (#506) @Javedgouri
    • Use 'timezone.localtime()' when calculating the next run time (#520) @wy-z
  • v1.3.5(Feb 26, 2021)

    Changes

    • Add a warning for misconfiguration. (#509) @icfly2
    • Migrate to Github Action CI (#507) @Koed00
    • Add broker name in Schedule and enhanced Queued Tasks list display admin (#502) @telmobarros
    • Add example of http health check (#504) @pysean3
    • Added german translation (#499) @jonaswinkler
    • Update brokers.rst (#497) @MaximilianKindshofer
  • v1.3.4(Oct 29, 2020)

    Changes

    • Try to get SQS queue before creating it (#478) @fallenhitokiri
    • Empty dictionary as configuration value for SQS (#477) @fallenhitokiri
    • Model.__unicode__() has no effect in Python 3.X (#479) @alx-sdv
    • Fix deprecation warning RemovedInDjango40Warning (#483) @Djailla
    • Fix for #424 TypeError: can't pickle _thread.lock objects (#482) @ihuk
    • [WIP]Change Django documentation links and URLs to a supported version (v1.8 -> v2.2) (#481) @jagu2012
  • v1.3.3(Aug 16, 2020)

    Changes

    • Add attempt_count to limit the number of times a failed task will be re-attempted (#466) @timomeara
    • Updates to Django 3.1 (#464) @Koed00
  • v1.3.2(Jul 8, 2020)

  • v1.3.1(Jul 2, 2020)

  • v1.3.0(Jul 2, 2020)

    Changes

    • Support for Cron expressions (#452) @Koed00
    • Adds hint, some linting and a release drafter (#449) @Koed00
    • Use 'force_str' instead of deprecated 'force_text' (#448) @edouardtheron
    • [cleanup] Few cleanup commit for linting and migrations (#447) @Djailla

    Dependency Updates

    • Updates packages (#450) @Koed00

    Notes

    • This release includes migrations
    • For cron expression support you will have to pip install croniter.
  • v1.2.4(Jun 10, 2020)

  • v1.2.3(May 31, 2020)

  • v1.2.2(May 31, 2020)

    Closed issues:

    • Scheduled task being executed many times #426
    • schedule doesn't work #416
    • Expose list of workers and their states via API #364
    • Tasks are not encrypted, only signed #300

    Merged pull requests:

    • Poetry #442 (Koed00)
    • Fix issues when using multiple databases with a database router #440 (maerteijn)
    • Update documentation to say tasks are signed, not encrypted #429 (asedeno)
    • Fix issue when using USE_TZ=False with MySQL #428 (hhyo)
    • When sync=True, re-raise exceptions from the worker. #417 (rbranche)
  • v1.2.1(Feb 18, 2020)

  • v1.2.0(Feb 17, 2020)

  • v1.1.0(Jan 18, 2020)

  • v1.0.2(Aug 10, 2019)

    • Fixes deprecated Arrow interface issues.
    • Tests with Django 2.2.4 and 1.11.23
    • SQL server fix @wgordon17
    • Circular import fix @lamby
    • MySQL fix for no timezone configs @maerteijn
    • Loads of timeout and concurrency fixes from @jannero
  • v1.0.1(Aug 29, 2018)

    • Added French locale @tboulogne
    • Removes Python 3.4 from supported versions

    Unfortunately, the refactoring for Python 3.7 caused some circular imports on Python 3.4. I've removed support for now, but I'm open to PRs to fix this for those that are stuck with 3.4 until its EOL.

  • v1.0.0(Aug 14, 2018)

    Breaking Changes!

    • Supports Python 3.7 > async became a reserved keyword, replaced by async_task @P-EB
    • Deprecated Python 2
    • Supports Django 2+
    • Deprecated Django < 1.11
    • Deprecated Rollbar as a separate module > use the pluggable error reporters from now on
    • Adds exception traceback to failed tasks @strets123
    • Frees resources when running in sync mode @abompard

    This release no longer supports older Python and Django versions. For those that want to keep using Django-Q with Python 2.7, please use release v0.9.4.

    Please replace all occurrences of async with async_task when you upgrade.

  • v0.9.4(Mar 13, 2018)

  • v0.9.3(Mar 13, 2018)

    • Adds option for acknowledging failed tasks @Balletie
    • Changes some imports to prevent namespace clashes @Eagllus
    • Updated to the latest Django versions ( 1.8.19, 1.11.11, 2.0.3) and packages for testing
  • v0.9.2(Feb 13, 2018)

    • fixes some Python 2 regressions - @Eagllus

    Please note that we will gradually remove support for Python 2 in 2018. We will keep a stable compatible package around for those that need it.

  • v0.9.1(Feb 2, 2018)

    • fixes admin url escaping for Django 2 - @Eagllus
    • fixes error reporter plugin entry points - @danielwelch
    • allows SQS to use generic AWS environment variables - @svdgraaf
    • updates some packages for testing
  • v0.9.0(Jan 8, 2018)

  • v0.8.1(Oct 12, 2017)

    • Introduces pluggable error reporters and Sentry support. Thanks to @danielwelch
    • Removes the future dependency. Thanks to @benjaoming
    • Uses 32 bit integer for repeat field. Thanks to @gchardon-hiventy
    • Replaces some relative imports for better compatibility
    • Updates supported Django to 1.11.6
    • Tested with latest Python 2.7 and 3.6 versions
    • Updated package dependencies
  • v0.8.0(Apr 5, 2017)

  • v0.7.18(Jun 7, 2016)

  • v0.7.17(Apr 24, 2016)

  • v0.7.16(Mar 7, 2016)

    • Updates compatibility to Django 1.8.11 and 1.9.4
    • Clears database connections on worker death
    • Group names for schedules now actually work
    • Schedule names in the admin can be left blank
    • Task names can now be set with the task_name argument
  • v0.7.15(Jan 27, 2016)

    • You can now use your own custom broker classes by setting broker_class https://django-q.readthedocs.org/en/latest/brokers.html#custom-broker
    • The polling interval for database based brokers can now be set with poll https://django-q.readthedocs.org/en/latest/configure.html#poll
    • Fixes an issue with the Rollbar config