PostgreSQL-based Task Queue for Python

Overview

Procrastinate: PostgreSQL-based Task Queue for Python

Procrastinate is an open-source Python 3.7+ distributed task processing library, leveraging PostgreSQL to store task definitions, manage locks and dispatch tasks. It can be used within both sync and async code.

In other words, from your main code, you call specific functions (tasks) in a special way and instead of being run on the spot, they're scheduled to be run elsewhere, now or in the future.

Here's an example:

# mycode.py
import procrastinate

# Make an app in your code
app = procrastinate.App(connector=procrastinate.AiopgConnector())

# Then define tasks
@app.task(queue="sums")
def sum(a, b):
    with open("myfile", "w") as f:
        f.write(str(a + b))

with app.open():
    # Launch a job
    sum.defer(a=3, b=5)

    # Somewhere in your program, run a worker (actually, it's often a
    # different program than the one deferring jobs for execution)
    app.run_worker(queues=["sums"])

The worker will run the job, which will create a text file named myfile with the result of the sum 3 + 5 (that's 8).

Similarly, from the command line:

export PROCRASTINATE_APP="mycode.app"

# Launch a job
procrastinate defer mycode.sum '{"a": 3, "b": 5}'

# Run a worker
procrastinate worker -q sums

Lastly, you can use Procrastinate asynchronously too:

import asyncio

import procrastinate

# Make an app in your code
app = procrastinate.App(connector=procrastinate.AiopgConnector())

# Define tasks using coroutine functions
@app.task(queue="sums")
async def sum(a, b):
    await asyncio.sleep(a + b)

async with app.open_async():
    # Launch a job
    await sum.defer_async(a=3, b=5)

    # Somewhere in your program, run a worker (actually, it's often a
    # different program than the one deferring jobs for execution)
    await app.run_worker_async(queues=["sums"])

There are quite a few interesting features that Procrastinate adds to the mix. You can head to the Quickstart section for a general tour or to the How-To sections for specific features. The Discussion section should hopefully answer your questions. Otherwise, feel free to open an issue.

The project is still quite early-stage and will probably evolve.

Note to my future self: add a quick note here on why this project is named "Procrastinate".

Where to go from here

The complete docs are probably the best place to learn about the project.

If you encounter a bug, or want to get in touch, you're always welcome to open a ticket.

Comments
  • App.import_paths not working

    I would like to invoke a worker process(es) using the CLI invocation, a la:

    procrastinate --app=app.procrastinate.app worker
    

    However, we also would like to "hook in" to the app initialization in order to do some setup work (e.g. DB connection pool, etc.).

    That setup work is illustrated in this example procrastinate.py module:

    import asyncio
    
    from procrastinate import AiopgConnector, App
    
    from app.settings import database, DATABASE_URL
    
    app = App(
        connector=AiopgConnector(dsn=DATABASE_URL),
        import_paths=['app.reconciler'],
    )
    
    
    async def main():
        await app.open_async()
        await database.connect()
        await app.run_worker_async(
            delete_jobs='successful',
        )
    
    
    if __name__ == "__main__":
        print('SCRIPT MODE')
        print('__name__ == "__main__"')
        asyncio.run(main())
    

    This allows me to run the main() function which does my setup work before initializing the worker. However this requires me to execute this process as a Python script - python -m app.procrastinate - which is fine, except my reconcile task no longer seems to be imported/loadable by my worker:

    (⎈ minikube:cloud)➜  dev git:(mc/cloud-worker) ✗ k logs -n cloud deployments/cloud-app -c cloud-app-worker -f
    SCRIPT MODE
    __name__ == "__main__"
    Task was not found: Task at reconcile cannot be imported: reconcile is not a valid path
    Traceback (most recent call last):
      File "/usr/local/lib/python3.9/site-packages/procrastinate/tasks.py", line 14, in load_task
        task = utils.load_from_path(path, Task)
      File "/usr/local/lib/python3.9/site-packages/procrastinate/utils.py", line 27, in load_from_path
        raise exceptions.LoadFromPathError(f"{path} is not a valid path")
    procrastinate.exceptions.LoadFromPathError: reconcile is not a valid path
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/usr/local/lib/python3.9/site-packages/procrastinate/worker.py", line 175, in process_job
        await self.run_job(job=job, worker_id=worker_id)
      File "/usr/local/lib/python3.9/site-packages/procrastinate/worker.py", line 240, in run_job
        task = self.load_task(task_name=task_name, worker_id=worker_id)
      File "/usr/local/lib/python3.9/site-packages/procrastinate/worker.py", line 222, in load_task
        task = tasks.load_task(task_name)
      File "/usr/local/lib/python3.9/site-packages/procrastinate/tasks.py", line 16, in load_task
        raise exceptions.TaskNotFound(f"Task at {path} cannot be imported: {str(exc)}")
    procrastinate.exceptions.TaskNotFound: Task at reconcile cannot be imported: reconcile is not a valid path
    
    

    When I run the process via the CLI, App.import_paths is honored. When I run the process as a Python script, App.import_paths does not seem to be honored.

    Any suggestions? Thank you! πŸ˜„
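    One way to keep the CLI-style behavior while still doing custom setup is a thin async runner that imports the app from its own module, so the app module is never executed as __main__. A minimal sketch (module names follow the example above and are otherwise assumptions):

    import asyncio

    from app.procrastinate import app  # the module shown above, with import_paths set


    async def main():
        # custom setup work (DB connection pool, etc.) goes here, before the worker starts
        async with app.open_async():
            await app.run_worker_async(delete_jobs="successful")


    if __name__ == "__main__":
        asyncio.run(main())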

    Issue type: Feature ⭐️ Issue contains: Some documentation πŸ“š Issue contains: Exploration & Design decisions 🀯 Issue contains: Some Python 🐍 Issue appropriate for: People up for a challenge 🀨 
    opened by mecampbellsoup 22
  • When worker restarts, interrupted jobs are forever lost

    I have been careful to apply the db migrations, but I suspect I have missed something.

    • procrastinate version: procrastinate==0.15.2

    Steps:

    1. start a worker
    2. submit a job
    3. restart the worker while the job is processed
    4. on worker restart, it never picks up the incomplete job

    The relevant state in the db: (screenshot omitted)

    There's no queueing_lock which makes me suspicious.

    What is the expected state of a running job, and how would I determine if a job can be picked up if the worker failed?

    Thanks!
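    For reference, the "Retry stalled jobs" recipe linked from another issue below (https://procrastinate.readthedocs.io/en/stable/howto/retry_stalled_jobs.html) appears to be the documented way to re-queue jobs that were interrupted mid-run. A rough sketch of it; the job_manager method names and the 30-minute threshold are taken as assumptions:

    import procrastinate

    app = procrastinate.App(connector=procrastinate.AiopgConnector())

    # assumption: no legitimate job runs longer than 30 minutes
    RUNNING_JOBS_MAX_TIME = 30 * 60


    @app.periodic(cron="*/10 * * * *")
    @app.task(queueing_lock="retry_stalled_jobs", pass_context=True)
    async def retry_stalled_jobs(context, timestamp):
        # find jobs that have been running longer than the threshold and re-queue them
        stalled_jobs = await app.job_manager.get_stalled_jobs(nb_seconds=RUNNING_JOBS_MAX_TIME)
        for job in stalled_jobs:
            await app.job_manager.retry_job(job)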

    Issue type: Bug 🐞 Issue appropriate for: newcomers 🀩 Issue contains: Some documentation πŸ“š Issue contains: Exploration & Design decisions 🀯 Issue contains: Some Python 🐍 Issue appropriate for: Occasional contributors πŸ˜‰ 
    opened by dionjwa 17
  • Second pass on blueprints

    Closes #421 again.

    Ping @tomdottom:

    • I might split this into multiple PRs, but the individual commits should make sense
    • I'm not sure at all about the approach yet, I'm trying out things and deciding what I like or not. I haven't fixed tests yet, but I believe everything in there should be quite testable.
    • And I welcome your opinion too!

    So what is this PR doing:

    • Removing app.queues, which was unused and undocumented (well, except a bit in tests, we'll see)
    • Making App inherit from Blueprint. At this point, all the code App did with tasks was copy-pasted into Blueprint. I think it makes sense to say an App really is built from a Blueprint :D This removes the need for the protocol and the problems of having matching signatures.
    • Added the namespace thing. For now, ":" is the separator. Also, after giving it a thought, it names
    • As you mentioned, added a safeguard preventing the registration of multiple tasks with the same name. Also, the registration of multiple blueprint namespaces with the same app/blueprint.
    • Renamed register to add_tasks_from, because if we have blueprint.register(blueprint), it becomes hard to understand who registers whom.
    • Made builtin tasks a Blueprint. Drink our own champagne. Our only builtin task is builtin:procrastinate.builtin_tasks.remove_old_jobs
    • Added App.add_task_alias, which helps when a task changes namespace (e.g. it's moved into a different blueprint). For example, this lets us keep compatibility with our builtin task.
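    A rough sketch of the registration API described above (exact names and signatures are assumptions based on the bullet points):

    import procrastinate

    bp = procrastinate.Blueprint()


    @bp.task(queue="sums")
    def add(a, b):
        return a + b


    app = procrastinate.App(connector=procrastinate.AiopgConnector())
    # registers bp's tasks under the "maths" namespace, using ":" as the separator
    app.add_tasks_from(bp, namespace="maths")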

    Successful PR Checklist:

    • [x] Tests
      • [ ] (not applicable?)
    • [x] Documentation
      • [ ] (not applicable?)
    opened by ewjoachim 16
  • Best way to integrate with django migration system

    This isn't a bug or feature request but rather a discussion to understand the best way to integrate Procrastinate within a Django project, in particular at the SQL level, as Django has its own migration system.

    My first attempt was to use raw SQL and execute each sql/migrations/delta* delta file, like this:

    # Generated by Django 3.0.8 on 2020-07-18 08:39
    import os
    from django.db import connection
    from django.db import migrations
    from django.db import transaction
    
    
    def apply_sql(apps, schema_editor):
        import procrastinate
        sql_path = os.path.join(
            os.path.dirname(procrastinate.__file__),
            'sql',
            'migrations',
        )
         
        migrations = [
            "baseline-0.5.0.sql",
            "delta_0.5.0_001_drop_started_at_column.sql",
            "delta_0.5.0_002_drop_started_at_column.sql",
            "delta_0.5.0_003_drop_procrastinate_version_table.sql",
            "delta_0.6.0_001_fix_procrastinate_fetch_job.sql",
            "delta_0.7.1_001_fix_trigger_status_events_insert.sql",
            "delta_0.8.1_001_add_queueing_lock_column.sql",
            "delta_0.10.0_001_close_fetch_job_race_condition.sql",
            "delta_0.10.0_002_add_defer_job_function.sql",
            "delta_0.11.0_003_add_procrastinate_periodic_defers.sql",
            "delta_0.12.0_001_add_foreign_key_index.sql",
        ]
    
        with connection.cursor() as cursor:
            with transaction.atomic():
                for name in migrations:
                    full_path = os.path.join(sql_path, name)
                    with open(full_path, 'rb') as f:
                        print('Applying {}'.format(full_path))
                        sql = f.read().decode()
                        cursor.execute(sql)
    
    
    def rewind(*args, **kwargs):
        pass
    
    
    class Migration(migrations.Migration):
    
        dependencies = [
            ('common', '0008_auto_20200701_1317'),
        ]
    
        operations = [
            migrations.RunPython(apply_sql, rewind)
        ]
    

    However, this crashes, with the following output:

    python manage.py migrate common 0009
    
    Operations to perform:
      Target specific migration: 0009_procrastinate, from common
    Running migrations:
      Applying common.0009_procrastinate...Applying /venv/lib/python3.7/site-packages/procrastinate/sql/migrations/baseline-0.5.0.sql
    Applying /venv/lib/python3.7/site-packages/procrastinate/sql/migrations/delta_0.5.0_001_drop_started_at_column.sql
    Applying /venv/lib/python3.7/site-packages/procrastinate/sql/migrations/delta_0.5.0_002_drop_started_at_column.sql
    Applying /venv/lib/python3.7/site-packages/procrastinate/sql/migrations/delta_0.5.0_003_drop_procrastinate_version_table.sql
    Applying /venv/lib/python3.7/site-packages/procrastinate/sql/migrations/delta_0.6.0_001_fix_procrastinate_fetch_job.sql
    Applying /venv/lib/python3.7/site-packages/procrastinate/sql/migrations/delta_0.7.1_001_fix_trigger_status_events_insert.sql
    Applying /venv/lib/python3.7/site-packages/procrastinate/sql/migrations/delta_0.8.1_001_add_queueing_lock_column.sql
    Applying /venv/lib/python3.7/site-packages/procrastinate/sql/migrations/delta_0.10.0_001_close_fetch_job_race_condition.sql
    Applying /venv/lib/python3.7/site-packages/procrastinate/sql/migrations/delta_0.10.0_002_add_defer_job_function.sql
    Applying /venv/lib/python3.7/site-packages/procrastinate/sql/migrations/delta_0.11.0_003_add_procrastinate_periodic_defers.sql
    Traceback (most recent call last):
      File "/venv/lib/python3.7/site-packages/django/db/backends/utils.py", line 84, in _execute
        return self.cursor.execute(sql)
    psycopg2.errors.SyntaxError: syntax error at or near ";"
    LINE 9: DROP FUNCTION IF EXISTS procrastinate_defer_job;
                                                           ^
    
    
    The above exception was the direct cause of the following exception:
    
    Traceback (most recent call last):
      File "manage.py", line 27, in <module>
        execute_from_command_line(sys.argv)
      File "/venv/lib/python3.7/site-packages/django/core/management/__init__.py", line 401, in execute_from_command_line
        utility.execute()
      File "/venv/lib/python3.7/site-packages/django/core/management/__init__.py", line 395, in execute
        self.fetch_command(subcommand).run_from_argv(self.argv)
      File "/venv/lib/python3.7/site-packages/django/core/management/base.py", line 328, in run_from_argv
        self.execute(*args, **cmd_options)
      File "/venv/lib/python3.7/site-packages/django/core/management/base.py", line 369, in execute
        output = self.handle(*args, **options)
      File "/venv/lib/python3.7/site-packages/django/core/management/base.py", line 83, in wrapped
        res = handle_func(*args, **kwargs)
      File "/venv/lib/python3.7/site-packages/django/core/management/commands/migrate.py", line 233, in handle
        fake_initial=fake_initial,
      File "/venv/lib/python3.7/site-packages/django/db/migrations/executor.py", line 117, in migrate
        state = self._migrate_all_forwards(state, plan, full_plan, fake=fake, fake_initial=fake_initial)
      File "/venv/lib/python3.7/site-packages/django/db/migrations/executor.py", line 147, in _migrate_all_forwards
        state = self.apply_migration(state, migration, fake=fake, fake_initial=fake_initial)
      File "/venv/lib/python3.7/site-packages/django/db/migrations/executor.py", line 245, in apply_migration
        state = migration.apply(state, schema_editor)
      File "/venv/lib/python3.7/site-packages/django/db/migrations/migration.py", line 124, in apply
        operation.database_forwards(self.app_label, schema_editor, old_state, project_state)
      File "/venv/lib/python3.7/site-packages/django/db/migrations/operations/special.py", line 190, in database_forwards
        self.code(from_state.apps, schema_editor)
      File "/app/funkwhale_api/common/migrations/0009_procrastinate.py", line 37, in apply_sql
        cursor.execute(sql)
      File "/venv/lib/python3.7/site-packages/django/db/backends/utils.py", line 100, in execute
        return super().execute(sql, params)
      File "/venv/lib/python3.7/site-packages/cacheops/transaction.py", line 93, in execute
        result = self._no_monkey.execute(self, sql, params)
      File "/venv/lib/python3.7/site-packages/django/db/backends/utils.py", line 68, in execute
        return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
      File "/venv/lib/python3.7/site-packages/django/db/backends/utils.py", line 77, in _execute_with_wrappers
        return executor(sql, params, many, context)
      File "/venv/lib/python3.7/site-packages/django/db/backends/utils.py", line 86, in _execute
        return self.cursor.execute(sql, params)
      File "/venv/lib/python3.7/site-packages/django/db/utils.py", line 90, in __exit__
        raise dj_exc_value.with_traceback(traceback) from exc_value
      File "/venv/lib/python3.7/site-packages/django/db/backends/utils.py", line 84, in _execute
        return self.cursor.execute(sql)
    django.db.utils.ProgrammingError: syntax error at or near ";"
    LINE 9: DROP FUNCTION IF EXISTS procrastinate_defer_job;
    

    Execution of delta_0.11.0_003_add_procrastinate_periodic_defers.sql fails with a syntax error, which I don't understand unfortunately. If you have some ideas, I can definitely try these, and possibly document Django integration when I have a working setup :)
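    For reference, later Procrastinate versions ship a Django contrib app that generates Django migrations from these SQL files (see the 0.14.0 and 0.16.0 release notes below), which avoids hand-rolling RunPython migrations like the one above. A minimal sketch, assuming the app is exposed as procrastinate.contrib.django (the exact dotted path should be checked against the docs for your version):

    # settings.py (sketch)
    INSTALLED_APPS = [
        # ... your other apps ...
        "procrastinate.contrib.django",  # assumption: path may differ by version
    ]
    # the Procrastinate schema is then applied by the regular `manage.py migrate` run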

    Issue type: Bug 🐞 Issue contains: Some SQL 🐘 Issue type: Feature ⭐️ Issue contains: Some Python 🐍 Issue appropriate for: Occasional contributors πŸ˜‰ 
    opened by agateblue 16
  • Don't set enable_json and enable_hstore to False

    Don't set enable_json and enable_hstore to False when calling aiopg.connect.

    aiopg does register_default_json(self._conn), so the registration of the json typecaster only applies to the connection being created. And with Procrastinate creating its own connections there's no risk of overriding global application settings.

    Cf. #113

    Successful PR Checklist:

    • [ ] Tests
    • [X] Documentation (optionally: run spell checking) Not relevant
    • [X] Had a good time contributing? (if not, feel free to give some feedback)
    opened by elemoine 16
  • Add Blueprint and lazy registration pattern

    Enables collections of tasks to be defined independently of the app itself and registered at a later time.

    Closes #421

    Successful PR Checklist:

    • [x] Tests
      • [ ] (not applicable?)
    • [x] Documentation
      • [ ] (not applicable?)
    opened by tomdottom 15
  • Remove the procrastinate_jobs.started_at column

    Closes #143

    This PR removes the unneeded procrastinate_jobs.started_at column. The select_stalled_jobs query, which is the only user of that column, is rewritten in terms of a JOIN between procrastinate_jobs and procrastinate_events.

    TODO with a separate PR: document how to use Pum to do database migrations.

    Successful PR Checklist:

    • [x] Tests
    • [ ] Documentation (optionally: run spell checking) not relevant + I propose to add an "Use Pum for migrations" How-to section with a separate PR
    • [x] Had a good time contributing? (if not, feel free to give some feedback)
    opened by elemoine 14
  • New rules for migrations

    See #100 for our first iteration.

    Why

    We need to provide a way to run migrations while Procrastinate runs, which means each version of the schema needs to be compatible with at least 2 versions of the code (and likewise, each version of the code needs to be compatible with at least 2 versions of the schema).

    How

    • By setting clear rules on what version of the schema should work with what version of the code
    • By testing that it's the case
    • By documenting the upgrade strategy

    We can be quite restrictive at the start (e.g. only support upgrading versions one by one)

    Let's flesh out proposals below.

    Issue contains: Some SQL 🐘 Issue contains: Exploration & Design decisions 🀯 Issue type: Process βš™οΈ Issue appropriate for: People up for a challenge 🀨 
    opened by ewjoachim 13
  • Capture logs for individual jobs

    It should be possible to easily inspect logs of a single job that has run.

    Option 1

    Update the logger formatting to include the task name & id so that backend logs can be easily filtered.
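    A generic sketch of option 1 using stdlib logging: a formatter that appends Procrastinate's structured extras when they are present on the record. The attribute names looked up here (action, job_id, task_name) are assumptions about what the worker attaches:

    import logging


    class JobAwareFormatter(logging.Formatter):
        def format(self, record):
            base = super().format(record)
            # keep only the job-related extras, if the worker attached them
            extras = {k: v for k, v in vars(record).items() if k in ("action", "job_id", "task_name")}
            return f"{base} {extras}" if extras else base


    handler = logging.StreamHandler()
    handler.setFormatter(JobAwareFormatter("%(levelname)s %(name)s %(message)s"))
    logging.getLogger("procrastinate").addHandler(handler)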

    Option 2

    • Dynamically create new logging handlers & filters when a job is run to collect logs only for the job.
    • Save these in a new table associated with the job.
    • Provide a way to access logs from job instances
    job = app.job_manager.list_job(id=1)
    for attempt in job.attempts:
        print(attempt.logs)
    
    CREATE TABLE procrastinate_logs (
        id BIGSERIAL PRIMARY KEY,
        job_id integer NOT NULL REFERENCES procrastinate_jobs ON DELETE CASCADE,
        attempt_id integer NOT NULL,
        logs TEXT
    );
    
    opened by tomdottom 12
  • Access app.connector from a worker

    I am not sure this is an appropriate place for questions like mine, as I haven't found a good place for procrastinate related questions - I apologize if it is not & please let me know, so I will move it to an appropriate place.

    I want my workers to write results from completed jobs to a psql database - the same database used for the App connector. Is there a way for workers to access app.connector from within a task function?

    I have tried my best to search for sample code in the repository and the document and haven't found any. I also tried to pass app or an aiopg.Pool object to my task functions, but I am getting a "TypeError: Object of type Pool is not JSON serializable".

    Related to this, what is the best way to pass an object like an aiopg.Pool that is to be shared across jobs and close it before all workers exit? Besides the connection pool, I hope to share an aiohttp.ClientSession across jobs and then close it before all workers exit. Is it possible?

    At this point, I resort to passing the DSN to each job and having each worker create its own psql connection and aiohttp.ClientSession. It works but is less than optimal. I like how the arq package (an async job queue with Redis) accepts a startup & shutdown function to initiate & dispose of resources that are to be shared across jobs (https://arq-docs.helpmanual.io/#startup-shutdown-coroutines). Is this something possible with procrastinate?
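    One pattern that may help here is pass_context: the task receives the job context and, assuming the context exposes the app, the app's own connector can run queries directly. A rough sketch (the keyword-argument style of execute_query_async is also an assumption):

    import procrastinate

    app = procrastinate.App(connector=procrastinate.AiopgConnector())


    @app.task(pass_context=True)
    async def compute_and_store(context, a, b):
        result = a + b
        # reuse the app's connector instead of opening a separate connection per job
        await context.app.connector.execute_query_async(
            "INSERT INTO results (value) VALUES (%(value)s)",
            value=result,
        )

    For sharing other resources such as an aiohttp.ClientSession, the "additional context on the worker" feature mentioned in the 0.19.0 release notes below may also be worth a look.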

    Thanks for creating this powerful package!

    opened by lmwang9527 12
  • Ambiguous PSQL function when `delete_job` undefined

    Summary

    When delete_job of the current procrastinate_finish_job function is None, the current procrastinate_finish_job, defined here, becomes ambiguous with the old(est) one defined here.

    Trace

    The resulting bug looks like:

      File "aiopg/connection.py", line 106, in _ready
        state = self._conn.poll()
    psycopg2.errors.AmbiguousFunction: function procrastinate_finish_job(integer, unknown, unknown) is not unique
    LINE 1: SELECT procrastinate_finish_job(4, 'succeeded', NULL);
                   ^
    HINT:  Could not choose a best candidate function. You might need to add explicit type casts.
    
    The above exception was the direct cause of the following exception:
    
    [...]
     File "/procrastinate/worker.py", line 171, in process_job
        await self.job_manager.finish_job(
     File "/procrastinate/manager.py", line 135, in finish_job
        await self.connector.execute_query_async(
     File "/procrastinate/aiopg_connector.py", line 35, in wrapped
        raise exceptions.ConnectorException from exc
    procrastinate.exceptions.ConnectorException: 
        Database error.
    

    Sadly, I don't have a minimal example for reproduction at the moment. The error occurred when starting one worker after a job was deferred.

    A possible solution would be enforcing a boolean value of the delete_job param of finish_job.

    opened by BracketJohn 12
  • Log expected exceptions triggering retries at non-error level

    I would like to have INFO level logging switched on for Procrastinate's worker.py in order to see the start/stop points for the worker itself, along with any errors and warnings about misconfiguration, etc.

    However, worker.py currently logs all unhandled exceptions from a task at ERROR, which results in error-level logging for expected exceptions. This is not desirable when the Procrastinate task relies on the retry mechanism, e.g. to repeatedly attempt a call to a remote API that can fail intermittently. In this context, I only want the final exception to be logged as an error (since, with no retry attempts left, it's truly a problem). Prior to that, if the retry strategy is triggered, I believe INFO would be more appropriate.

    Would you be open to this change being made? (Am happy to put a PR together).

    opened by ashleyheath 1
  • Dynamically extending job liveness

    So you have "Retry stalled jobs", where you set a RUNNING_JOBS_MAX_TIME and retry jobs that are stalled: https://procrastinate.readthedocs.io/en/stable/howto/retry_stalled_jobs.html.

    Now, what happens if your jobs are dynamic in nature and can take very long?

    Think: you're converting videos on a video-uploading website. Someone uploads a 1-minute video, another one uploads a 24-hour video.

    You need a way to "extend" the lifetime of the 24-hour video job while also having the ability to retry it if it failed/stalled somehow.

    This way you'd need a thread that extends a timestamp in the database every, say, 30 seconds. A job would then only be considered stalled once that much time has passed since its timestamp was last extended.
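    A generic sketch of that heartbeat idea (plain threading, not a Procrastinate API; the extend_deadline callback is a placeholder for whatever bumps the timestamp in the database):

    import threading


    def start_heartbeat(extend_deadline, interval=30.0):
        # call extend_deadline() every `interval` seconds until the returned stop() is called
        stop = threading.Event()

        def beat():
            while not stop.wait(interval):
                extend_deadline()

        threading.Thread(target=beat, daemon=True).start()
        return stop.set

    # usage inside a long-running task (sketch):
    # stop = start_heartbeat(lambda: mark_job_alive(job_id))
    # try:
    #     convert_video()
    # finally:
    #     stop()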

    Makes sense? Or maybe there's another way?

    opened by ddorian 2
  • Ability to enqueue a task ahead of already-enqueued tasks?

    I skimmed the docs and didn't see anything, but is there any existing feature where we could "short-circuit" already-enqueued tasks, so that the next thing enqueued is taken off the queue and performed next?

    opened by mecampbellsoup 1
  • Running procrastinate in fastapi hangs with SIGINT

    Hi

    When running procrastinate in fastapi and doing a SIGINT, the application hangs.

    from fastapi import FastAPI
    import asyncio
    from procrastinate import App, AiopgConnector
    
    app = FastAPI()
    
    task_queue = App(connector=AiopgConnector(dsn="postgresql://postgres:docker@localhost:11001"))
    
    
    @app.on_event("startup")
    async def setup() -> None:
        print("STARTUP")
    
        await task_queue.open_async()
        asyncio.create_task(task_queue.run_worker_async())
    
    
    @app.on_event("shutdown")
    async def teardown() -> None:
        print("SHUTDOWN")
        await task_queue.close_async()
    
    
    @app.get("/")
    async def root():
        return {"Hello": "World"}
    

    When executing CTRL-C I see the following output:

    INFO:     Started server process [812800]
    INFO:     Waiting for application startup.
    STARTUP
    INFO:     Application startup complete.
    INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
    ^C^C^C^C^C
    

    The fastapi application is stuck.

    Any ideas how to mitigate/fix?
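    One mitigation sketch using plain asyncio (no extra Procrastinate API assumed): keep a handle on the worker task and cancel it on shutdown so uvicorn can finish exiting. Whether cancellation stops the worker cleanly in a given version is an assumption worth testing:

    import asyncio

    from fastapi import FastAPI
    from procrastinate import AiopgConnector, App

    app = FastAPI()
    task_queue = App(connector=AiopgConnector(dsn="postgresql://postgres:docker@localhost:11001"))
    worker_task = None


    @app.on_event("startup")
    async def setup() -> None:
        global worker_task
        await task_queue.open_async()
        worker_task = asyncio.create_task(task_queue.run_worker_async())


    @app.on_event("shutdown")
    async def teardown() -> None:
        # cancel the background worker first, then close the connector
        if worker_task is not None:
            worker_task.cancel()
            try:
                await worker_task
            except asyncio.CancelledError:
                pass
        await task_queue.close_async()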

    opened by sevaho 2
  • Asyncpg Connector implementation

    Still very very early implementation. Mostly taking the suggested converter and loosely implementing the connector interface.

    Needs:

    • [ ] Error handling - steal from others?
    • [ ] Tests - copy/paste from aiopg
    • [ ] a second set of eyes, maybe even more

    Closes #417

    Successful PR Checklist:

    • [ ] Tests
      • [ ] (not applicable?)
    • [ ] Documentation
      • [ ] (not applicable?)
    opened by nwillems 2
Releases(0.26.0)
  • 0.26.0(Oct 8, 2022)

  • 0.25.2(Sep 27, 2022)

  • 0.25.1(Sep 27, 2022)

    Migrations

    None

    Bugfix

    • friendly error for get_full_path failures (#682)

    Dependencies

    • Automated deps maintenance (#683, #679, #677, #673, #668, #665, #663, #661, #659, #681, #680, #678, #675, #667, #664, #662)
    • Update deps (#660)
    • Sphinx 5 (#658)

    Kudos:

    @abe-winter

  • 0.25.0(Jul 21, 2022)

    Migrations

    None

    Bugfixes

    • SQL Alchemy integrity Error (& hole in coverage) (#642)
    • Use typing_extensions only if Python < 3.8 (#656)

    Documentation

    • Document usage of Sentry with Procrastinate (#647)
    • Clarify schedule_in docstring (#595)
    • Fix link in readme (#640)

    Dependency shenanigans

    There should be much less noise in the future thanks to Renovate.

    • use setup-python v4 (#638, #639)
    • Revert changes to the deploy workflow (#578)
    • Sync pre-commit and poetry (#650)
    • Use Renovate instead of Dependabot (#648, #634, #629, #628, #627, #619, #632, #636)
    • Dependabot (#616, #617, #618, #614, #613, #610, #611, #612, #600, #601, #602, #603, #604, #596, #597, #598, #599, #594, #593, #589, #588, #590, #582, #583, #584, #585, #579, #580)
    • Pre-commit CI (#654, #643, #631, #615, #608, #607, #591, #587, #581, #575)
    • Renovate (#655, #653, #652, #635, #625, #622, #624, #621)
    • Clear CI cache by updating deps (#641)

    Kudos:

    Thank you @abe-winter for multiple contributions! πŸŽ‰ Thank you @pmav99 for a first public PR ever in this repo! πŸŽ‰

  • 0.24.1(Apr 21, 2022)

  • 0.24.0(Apr 21, 2022)

    Migrations

    This release doesn't add a new migration; however, it fixes an issue with a previous release where a migration was created but didn't have the proper name, and so ended up not being advertised in the changelog.

    This is the migration:

    • https://github.com/procrastinate-org/procrastinate/blob/0.24.0/procrastinate/sql/migrations/00.23.00_01_null_locks_excluded.sql

    If you're unsure whether you need to apply the migration or not, here's a rundown:

    • If you are currently anywhere between 0.23 and 0.24 inclusive and there is no newer version at the time you read this, you can safely apply the migration. Even if it has already been applied, it will be a no-op.
    • If you're working your way up the versions and will continue to apply migrations after 0.24, you can safely apply the migration as long as it's in order (so if there is a 0.25 migration, apply this one and then the 0.25 one)
    • If you have already applied migrations for 0.25 or above... well it depends :) Either analyze the migrations that you have applied and see if there could be a clash, or open an issue and we'll look at it together.

    Breaking changes

    • Harmonize periodic logs with other job logs (#507) If your workflow depends on exact log messages, then you may want to experiment with the new version. The messages are more consistent.
    • Renamed old improperly named migration (#576)

    Dependencies

    • Bump attrs, black, croniter, django, dunamai, importlib-metadata, mypy, psycopg2-binary, pytest-asyncio, pytest-click, pytest-mock, sphinx-github-changelog, tomlkit, types-croniter, types-psycopg2, types-python-dateutil (#508, #509, #514, #516, #515, #513, #523, #522, #526, #530, #529, #528, #527, #538, #537, #536, #541, #542, #547, #548, #549, #550, #551, #553, #552, #558, #556, #555, #562, #561, #565, #567, #566, #570, #572, #571, #573)
    • [pre-commit.ci] pre-commit autoupdate (#505, #519, #525, #545, #569)
    • update deps (#559, #517)

    Miscellaneous

    • Remove assert in real code (#535)
    • Fix a few typos in the docs by @benjamb (#563)

    Workflows, CI, etc

    • Fix publish workflow (#520, #518)
    • Switch to main branch (#531)
    • Switch to python-coverage-comment-action (#532, #533, #534)
    • Auto-merge dependabot PRs when they pass the CI (#539)
    • Change how the dependabot PRs are automerged (#543)
    • Delete dependabot-auto-merge.yml (#544)

    Kudos

    @benjamb

  • 0.23.0(Dec 19, 2021)

    Migrations

    • https://github.com/procrastinate-org/procrastinate/blob/master/procrastinate/sql/migrations/00.22.00_01_add_kwargs_to_defer_periodic_job.sql

    The following migration should have been part of the 0.23 release, but due to a naming error, it wasn't originally displayed as such. Here is a link to the migration as it was when the tag was issued:

    • https://github.com/procrastinate-org/procrastinate/blob/0.23.0/procrastinate/sql/migrations/00.19.00_02_null_locks_excluded.sql

    Breaking changes

    • Arguments for periodic tasks 436 (#471): If you configured multiple periodic schedules based on the same task, you now need to define them with explicitly different periodic_ids: https://procrastinate.readthedocs.io/en/latest/howto/cron.html#scheduling-a-job-multiple-times-with-multiple-arguments . Also, if you relied on the fact that periodic schedules configured on the same task but with different queues were seen as independent schedules (https://github.com/procrastinate-org/procrastinate/blob/0.22.0/docs/howto/cron.rst#queue-lock-queuing-lock or #289), you will probably need to update your code and define unique periodic_ids in addition to the queues.
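    A rough sketch of what the new requirement looks like, assuming the decorator stacking and periodic_id keyword shown in the linked howto:

    import procrastinate

    app = procrastinate.App(connector=procrastinate.AiopgConnector())


    # the same task scheduled twice, distinguished by explicitly different periodic_ids
    @app.periodic(cron="*/5 * * * *", periodic_id="every-five-minutes")
    @app.periodic(cron="0 4 * * *", periodic_id="nightly")
    @app.task
    def cleanup(timestamp):
        ...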

    Features

    • Arguments for periodic tasks 436 (#471)

    Bugfixes

    • Fix problem with AsyncMock (#504) (only impacted tests)

    Misc

    • Bump mypy from 0.910 to 0.920 (#503)
    • Bump pytest-django from 4.5.1 to 4.5.2 (#499)
    • Bump django from 3.2.9 to 3.2.10 (#500)
    • Bump black from 21.11b1 to 21.12b0 (#501)
    • [pre-commit.ci] pre-commit autoupdate (#498)

    Kudos:

    @aleksandr-shtaub

  • 0.22.0(Dec 5, 2021)

    Migrations

    None

    Breaking changes & news

    This release officially adds Blueprints with their supported API. See the documentation for details. Note that if you used blueprints from Procrastinate 0.21, this is a breaking change.

    Also, starting with this version, when the worker cannot function correctly (e.g. when it loses access to the database), it will crash instead of entering a half-broken state. This way, you get to define your own restart strategy, with back-off etc., the way you want.

    App.with_connector lets you create multiple synchronized apps with different connectors, in case you need to support both a sync and an async app.
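    A small sketch of what that can look like (the connector choices are just examples):

    import procrastinate

    # one app defined with an async connector...
    async_app = procrastinate.App(connector=procrastinate.AiopgConnector())

    # ...and a synchronized twin sharing the same tasks, backed by a sync connector
    sync_app = async_app.with_connector(procrastinate.Psycopg2Connector())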

    In procrastinate shell, a new list_locks command lets you find out about task locks that are currently set, to simplify hunting for long-standing locks.

    Support for python3.6 has been removed, support for python3.10 has been added.

    Documentation has been updated regarding where you should place your app. In particular, in case you put your app in the module whose __name__ is __main__, a warning is now emitted, because this could lead to issues.

    Periodic tasks used to all share a lock, so no 2 periodic tasks could run at the same time. This is now fixed.

    And finally, did you notice? The repository moved to the procrastinate-org organization, yay!

    Features

    • Second pass on blueprints (#428)
    • App.with_connector (#463)
    • Add list_locks (#401)
    • Remove support for Python 3.6, add support for 3.10 (#470)

    Bugfixes

    • Stop the whole worker process when a coroutine raises (#494)
    • Add a stack check (#442)
    • Null locks are ignored (#402)

    Misc

    • Fix longstanding Mypy issues (#490)
    • Add croniter-types (#476)
    • First wave of CODEOWNERS update (#462)
    • Update dev-env (#475)
    • Tweaks in the contribution doc (#472)
    • Remove a bit of repetition in docs/requirements.txt by using extras (#457)
    • Add empty setup.py to counter dependabot bug. (#458)
    • Switch org to procrastinate-org (#459)

    Bots

    • Bump typing-extensions from 4.0.0 to 4.0.1 (#496)
    • Bump pytest-django from 4.4.0 to 4.5.1 (#495)
    • [pre-commit.ci] pre-commit autoupdate (#492)
    • [pre-commit.ci] pre-commit autoupdate (#489)
    • [pre-commit.ci] pre-commit autoupdate (#485)
    • Bump importlib-resources from 5.2.2 to 5.4.0 (#481)
    • Bump django from 3.2.8 to 3.2.9 (#483)
    • Bump aiopg from 1.3.2 to 1.3.3 (#480)
    • Bump dunamai from 1.6.0 to 1.7.0 (#484)
    • Bump click from 8.0.1 to 8.0.3 (#482)
    • [pre-commit.ci] pre-commit autoupdate (#477)
    • Bump pytest-cov from 2.12.1 to 3.0.0 (#466)
    • Bump aiopg from 1.3.1 to 1.3.2 (#468)
    • Bump django from 3.2.7 to 3.2.8 (#467)
    • Bump sqlalchemy from 1.4.25 to 1.4.26 (#473)
    • Bump pytest-asyncio from 0.15.1 to 0.16.0 (#474)
    • [pre-commit.ci] pre-commit autoupdate (#469)
    • [pre-commit.ci] pre-commit autoupdate (#464)

    Kudos:

    @tomdottom, @elemoine, @ewjoachim

  • 0.21.0(Sep 27, 2021)

    Migrations

    None

    Breaking changes

    • Remove auto-loading of tasks (#425): before this version, if a task was not loaded as a side effect of loading the app, and its module was not specified in the app's import_paths parameter, and a worker received that task, the worker tried, as a last-resort attempt, to load the task through its name. This meant that someone controlling a task name could make the worker load arbitrary code. The feature was removed altogether. If your worker emitted warnings like Task at {task_name} was not registered, it's been loaded dynamically. (with log action load_dynamic_task), you need to change your code for it to keep working. If the warning was not emitted, you're good to go.
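    With dynamic loading gone, the worker only knows about tasks whose modules actually get imported; a small sketch of making that explicit via import_paths (the module name is a placeholder):

    import procrastinate

    app = procrastinate.App(
        connector=procrastinate.AiopgConnector(),
        # placeholder: whichever modules define your @app.task functions
        import_paths=["myproject.tasks"],
    )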

    Features

    • Add a connector for SQLAlchemy with Psycopg2 (#453)

    Bugfix

    • Misconfiguration of extra deps meant that the lib always depended on Django (#453)

    Misc

    • Remove redundant word in worker logging (#446)
    • path_hook should raise ImportError when a module isn't found (#430)
    • Add Blueprint and lazy registration pattern (#423) (though it's being refactored at the moment, it's advised to wait for the next release)

    Documentation

    • Nitpick on worker.py (#448)
    • Fix docstring and Psycopg2Connector init signature (#447)

    Process

    • Fix typo in path (#429)
    • Replace codecov with Coverage-Comment (#455, #456)
    • [pre-commit.ci] pre-commit autoupdate (#450, #443, #439 #434, #433)
    • Fix CI python version (#435)

    Kudos:

    @mecampbellsoup and @tomdottom

  • 0.20.0(Jul 30, 2021)

    Migrations

    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/00.19.00_01_add_index_on_procrastinate_jobs.sql

    Features

    • Add index to procrastinate_jobs for improved fetch_job performance (#396)

    Bug Fixes

    • Psycopg2: % in queries need to be escaped (#427)
    • Attributes on the decorated task function shouldn't leak on the task (#399)

    Miscellaneous

    • Update deps & fix docs (#426)
    • [pre-commit.ci] pre-commit autoupdate (#422, #420, #418, #416, #414, #411, #408, #407, #405, #404, #398)
    • Add Dependabot auto upgrade PR (#415)
    • Bring some improvements to dev-env script (#413)
    • Goodbye Mickaël - Remove @mgu from codeowners (#406)
    • Add support for Python 3.9 & boilerplate (#397)

    Kudos:

    @elemoine, @ioben, and Mickaël Guérin

  • 0.19.0(Apr 16, 2021)

    Migrations

    None

    Features

    • Make it possible to define additional context on the worker (#392)

    Bug Fixes

    • Small fix for dev-env script (#387)
    • Fix the async howto (#388)

    Miscellaneous

    • Small fix for dev-env script (#387)
    • Update PR template (#389)

    Kudos:

    @elemoine, @ewjoachim and @mxd4

  • 0.18.2(Feb 5, 2021)

    Migrations

    The following migration fixes a schema bug introduced in 0.18.0. Please make sure to deploy this migration at the same time as the migrations mentioned in 0.18.0.

    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/00.18.01_01_fix_finish_job_compat_issue.sql

    Fixes

    • Fix finish_job compatibility issue (#383)
    • Do not raise in the BaseConnector destructor (#385)

    Miscellaneous

    • Update CONTRIBUTING.rst: add info on rebuilding docs (#380)

    Kudos:

    @BracketJohn

  • 0.18.1(Jan 13, 2021)

    :warning: If you plan to deploy 0.18.1, consider deploying 0.18.2 instead, as its migrations contain a fix for a 0.18.0 migration that could cause some bugs.

    Migrations

    No migrations in bugfix releases.

    Fixes

    • add todo check for queueing lock, add test for todo check (#379)
    • Fix doc parts (#377)

    Kudos:

    @BracketJohn and @tmartinfr

  • 0.18.0(Jan 8, 2021)

    Migrations

    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/00.16.00_01_add_finish_job_and_retry_job_functions.sql (this migration was named 0.16 but should have been named 0.17)
    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/00.17.00_01_add_trigger_on_job_deletion.sql
    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/00.17.00_02_delete_finished_jobs.sql
    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/00.17.00_03_add_checks_to_finish_job.sql
    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/00.17.00_04_add_checks_to_retry_job.sql

    :warning: These migrations contain an incompatibility that is resolved in 0.18.2. Please run the migrations mentioned in 0.18.0 and 0.18.2 as a bundle to avoid potential downtime.

    Features

    • Add async method to apply schema (#376)
    • Add the ability to personalize log format by environment variable (#357)
    • Second pass on log format (#361)
    • Delete finished jobs (#354)

    Bug Fixes

    • Dev-env script improvement (#353)
    • Fix InMemoryConnector handling of queueing_lock (#374)
    • Fix the shell's "retry job" and "cancel job" actions (#356)

    Documentation

    • Add a "Retry stalled jobs" howto (#366)
    • Change comments in the README's code snippets (#371)
    • Fix README async example, make both sync and async example more complete (#368)
    • Fix README example (#365)
    • Remove the Sphinx bug-related addendum in the doc (#351)

    Miscellaneous

    • Dev-env script improvement (#353)
    • Fix dev-env (#362)
    • CI on both PRs and the master branch (#373)
    • CI: switch from Branch to PR (#358)
    • Fix deployment pipeline (#360)
    • Add @thomasperrot as codeowner (#359)
    • Make the shell fixture properly terminate the shell process (#352)
    • Remove Admin and move its methods to JobManager (#349)
    • Split procrastinate_finish_job into two functions (#336)

    Kudos:

    @BracketJohn, @ignaciocabeza, @mxd4

  • 0.17.0(Nov 13, 2020)

    Migrations

    This version didn't actually add new migrations (but existing migrations were renamed)

    A consequence of this, if you're using Django migrations and ran migrations pre-0.17, is that one of the Django migrations was renamed, which confuses Django. Before running the migrations for 0.17.0, you're invited to run the following code in the Django shell (manage.py shell):

    >>> from django.db.migrations.recorder import MigrationRecorder
    >>> MigrationRecorder.Migration.objects.filter(app="procrastinate", name="0001_baseline").update(name="0001_initial")
    

    Starting at this release, migrations can be run while the system runs, provided certain conditions are met. See the migrations doc for more information.

    Work on migrations:

    • Use underscores in migration script names (#345)
    • Document new rules for database migrations (#342)
    • Rename SQL migrations (#347)
    • Rename migration: wrong index number (#350)
    • Django migrations: generate on the fly via import hooks (#340)

    Breaking Changes

    • Django migrations, see above.
    • procrastinate healthchecks no longer reports the number of jobs. Use procrastinate shell for that.

    Miscellaneous

    • Adjust badges (#337)
    • Rename Job Store > Job Manager (#335)
    • Report test workflow success in Check API (#322)
    • Don't test the PRs, just test the branches (#348)
    • Ignore aiopg warnings that we can't do anything about (#338)
    • Remove HealthCheckRunner, simplify healthchecks (#339)

    Kudos:

    @elemoine, @ewjoachim and @thomasperrot

  • 0.16.0(Oct 6, 2020)

    Migrations

    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/delta_0.15.2_001_fix_procrastinate_defer_periodic_job.sql

    Features

    • Auto-generating Django migrations using Django's own machinery (#298)

    Bug Fixes

    • Fix the procrastinate_defer_periodic_job SQL function (#329)
    • Change name of short option for --listen-notify (#325)
    • Fix the AppNotOpen exception message (#327)

    Documentation

    • Document how to make remove_old_jobs periodic (#326)

    Miscellaneous

    • Use setuptools-scm to get migration from Git (#319)
    • Make test_run_log_current_job_when_stopping more robust (#317)
    • Use postgres:12 image for faster tests (#315)
    • Use migra instead of pum for the db migration tests (#308)
    • Run the main workflow for other branches than master (#311)
    • GitHub actions (#302)
    • Simplify release-drafter PR process (#324)

    Kudos:

    @elemoine, @ewjoachim, @mgu and @thomasperrot

  • 0.15.2(Aug 25, 2020)

    Once again, this release is just a way for us to test the release process. Nothing changed in the code.

    Migrations

    None

    Bugfixes

    • Fix deployment in CI (#301)

    Kudos:

    @thomasperrot

  • 0.15.1(Aug 25, 2020)

    OK, this release is just a way for us to test the release process. Nothing changed in the code.

    Migrations

    None

    Bugfixes

    • Trigger CI when publishing a new release (#300)

    Kudos:

    @thomasperrot

  • 0.15.0(Aug 25, 2020)

    Migrations

    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/delta_0.14.0_001_add_locks_to_periodic_defer.sql

    Breaking changes

    • It is now expected that connections are explicitly opened (and closed): see the documentation for the various ways of doing that. (#270)

    Features

    • You can now have a single periodic task be deferred multiple times on multiple queues (doc) (#296)
    • Tasks (including periodic tasks) now accept default locks and queueing locks in the same way as it was accepting default queues (doc) (#296)
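    A small sketch of the second point, with the parameter names (lock, queueing_lock) assumed from the linked doc:

    import procrastinate

    app = procrastinate.App(connector=procrastinate.AiopgConnector())


    # default queue, lock and queueing lock set at task definition time
    @app.task(queue="reports", lock="reports-db", queueing_lock="nightly-report")
    def nightly_report():
        ...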

    Bugfix

    • Periodic tasks deferring won't be interrupted by long synchronous tasks (#296)

    Miscellaneous

    • Add github actions (#290)
    • Badges (#299)
    • Additional doc on periodic tasks - forgotten wording fix (#294)

    Kudos:

    @ewjoachim and @thomasperrot

  • 0.14.0(Aug 20, 2020)

    Migrations

    None

    Features

    • Django contrib app (#283)

    Bug Fixes

    • Fix the "every second periodic task" bug (#293)

    Miscellaneous

    • Fix quickstart documentation (#285)
    • Additional doc on periodic tasks (#291)
    • Dev env script (#284)

    Kudos:

    Agate (ping @EliotBerriot), @t-eckert, @elemoine, @ewjoachim

  • 0.13.0(Jul 17, 2020)

    Migrations

    None

    Breaking changes

    • Very slight possible breaking change: the code parsing dates when calling procrastinate defer --at=<date> has changed. If you were using ISO8601 strings, nothing changed. If you were using more exotic formats, we think that nothing changed, but we don't have a definitive proof of that, so here's a warning. See https://procrastinate.readthedocs.io/en/stable/howto/schedule.html?highlight=pendulum#from-the-code (#279)

    Miscellaneous

    • Use python-dateutil to parse date (#279)

    Kudos:

    @EliotBerriot

  • 0.12.1(Jul 17, 2020)

    Migrations

    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/delta_0.12.0_001_add_foreign_key_index.sql

    Bug Fixes

    • Add missing index to procrastinate_periodic_defers (#277)

    Miscellaneous

    • Adapt to new isort 5 (#275)

    Kudos:

    @ewjoachim, @anayrat

  • 0.12.0(Jul 6, 2020)

    Migrations

    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/delta_0.11.0_003_add_procrastinate_periodic_defers.sql

    Features

    Miscellaneous

    • Add py.typed to package (#272)
    • Add missing documentation on argument retry_exceptions (#271)

    Kudos:

    @SBillion, @elemoine, @ewjoachim and @tmartinfr

  • 0.11.0(Jun 23, 2020)

    Migrations

    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/delta_0.10.0_001_close_fetch_job_race_condition.sql
    • https://github.com/peopledoc/procrastinate/blob/master/procrastinate/sql/migrations/delta_0.10.0_002_add_defer_job_function.sql

    Breaking changes

    • If you created an AiopgConnector with maxsize=0 or 1, it used to change it to 2. Now it won't. 0 should really be avoided. 1 will disable the listen/notify feature. See doc.

    Features

    • Synchronous programs can now define a Psycopg2Connector() and have real synchronous I/Os, likely to work better with multithreaded programs. See doc. (#237)
    • Default log message contains task result. This means you can return <something> in your task to make this <something> appear in your logs (#252)
    • Make listen/notify optional, through listen_notify=False in your worker configuration (#258)

    Bug Fixes

    • Close race condition in procrastinate_fetch_job (#231)
    • Retry on "server closed connection unexpectedly" errors (#259)
    • Synchronous closing for AiopgConnector (#263)
    • Add task.defer as an explicit sync method (#257)

    Miscellaneous

    • Documentation "Quickstart" section was full of imprecisions (#254)
    • Display test results & simplify Tox setup (#253)
    • Add function procrastinate_defer_job (#232)
    • CONTRIBUTING.rst: more precise wording on release automated steps (#248)
    • Update release-drafter.yml (#245)

    Kudos:

    @elemoine and @ewjoachim

  • 0.10.0(Jun 12, 2020)

    A bug (#236), present since 0.7.1, was preventing the listen/notify feature from working correctly.

    Migrations

    None

    Bug Fixes

    • Create "set pool" lock lazily (#236)

    Miscellaneous

    • Improve release-drafter settings (#243)
    • Add a github action to get automated changelogs (#239)
    • Change editions to edits in CONTRIBUTING (#244)
    • Make setup.py work against lightweight tags (#234)
    • Improve job defer logs (#233)

    Kudos:

    @elemoine and @ewjoachim

  • 0.9.0(Jun 5, 2020)

    Breaking changes

    • Rename PostgresConnector into AiopgConnector (#225):

      The class PostgresConnector became AiopgConnector. The only change is the name.

    Migrations

    • https://github.com/peopledoc/procrastinate/blob/0.9.0/procrastinate/sql/migrations/delta_0.8.1_001_add_queueing_lock_column.sql

    Features

    • Add notion of "queueing lock" (#219):

      Queueing locks can ensure periodic jobs do not accumulate in the queue: at any given time, only a single job with a given queueing lock can be waiting.

      See https://procrastinate.readthedocs.io/en/latest/howto/cron.html#launch-a-task-periodically

    Documentation

    • Fix PostgreSQL docker example in quickstart doc (#230) (Thanks @tmartinfr!)

    Miscellaneous

    • Code improvements around Procrastinate Admin module (#224)
  • 0.8.1(May 29, 2020)

  • 0.8.0(May 29, 2020)

    Breaking changes

    • None \o/ (that we know of)

    Migrations

    • https://github.com/peopledoc/procrastinate/blob/0.8.0/procrastinate/sql/migrations/delta_0.7.1_001_fix_trigger_status_events_insert.sql

    Features

    • Refactor builtin tasks to use pass_context #205
    • Real concurrent asynchronous tasks #206
    • Implement an administration prompt #204

    Fixes

    • Worker should instantiate Event lazily, and should use wait_for, not wait #202
    • Insert event after defer job #184

    Tests

    • Activate test_lock #210
    • Add a schema migration test #209
    • Instrument test_lock in case it fails again #214
    • Simplify migration tests #215

    Docs

    • Fix connector doc & worker doc (& probably other docs too) #208
    • Add migration tests to contributing doc #211
    • Fix typo in locks howto #216

    Misc

    • Remove extraneous spaces in license file #218
  • 0.7.1(Apr 29, 2020)

  • 0.7.0(Apr 27, 2020)

    Migrations

    • https://github.com/peopledoc/procrastinate/blob/0.7.0/procrastinate/sql/migrations/delta_0.6.0_001_fix_procrastinate_fetch_job.sql

    Features

    Postgres Connector

    Breaking compatibility :warning:

    Now, to set up your Postgres connector you can do:

    connector = procrastinate.PostgresConnector(
        dsn="postgres://user:password@host:port/dbname"
    )
    # or if you already have an aiopg pool:
    connector = procrastinate.PostgresConnector()
    connector.set_pool(my_pool)
    

    Associated PRs:

    • Remove PostgresJobStore (compatibility layer) #188
    • Create aiopg pool lazily #185
    • Use a pool instead of a single connection, and reorganize the worker around that #173

    Docker

    • Adding a first version of dockerized development environment. #144
    • add POSTGRES_PASSWORD to docker-compose.yml and in contributing documentation #177

    Logs

    • add log formatting when deferring a task #117
    • Small second pass on logs #159

    Migrations

    • Renaming Migrator as SchemaManager #161
    • Derive schema version from migration scripts #162
    • Improve our migration story #167

    Documentation

    • Reword migration section of contributing guide #168
    • Add a "Set database schema" how-to #169
    • Add a "Use Pum for migrations" how-to #170

    Misc., including process changes

    • Fix setup.py #174
    • Fix builds 164 #165
    • Fix mypy errors #172
    • Rewrite travis.yml file #179
    • Closes instead of Cf. in PR template #180
    • Remove the procrastinate_jobs.started_at column #145
    • Add official support for Python 3.8 #178