Pyramid configuration with Celery integration. Allows you to use Pyramid .ini files to configure Celery and to access your Pyramid configuration inside Celery tasks.

Overview

Getting Started

Include pyramid_celery either by setting your includes in your .ini, or by calling config.include('pyramid_celery'):

pyramid.includes = pyramid_celery

Then you just need to tell pyramid_celery what ini file your [celery] section is in:

config.configure_celery('development.ini')

Then you are free to use Celery, for example with a class-based task:

from pyramid_celery import celery_app as app

class AddTask(app.Task):
    def run(self, x, y):
        print(x + y)

or with a decorator-based task:

from pyramid_celery import celery_app as app

@app.task
def add(x, y):
    print(x + y)
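
Either style is dispatched like any other Celery task. A minimal sketch of queueing the decorator-based task from application code (the import path is hypothetical):

from myapp.tasks import add  # hypothetical module holding the task above

# .delay() is Celery's shortcut for apply_async(); it puts the task on the
# broker and immediately returns an AsyncResult.
add.delay(16, 16)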

To get Pyramid settings inside a task, access the registry via app.conf['PYRAMID_REGISTRY'].
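
For example, a minimal sketch of reading a setting inside a task (the sqlalchemy.url key is purely illustrative; use whatever keys your .ini defines):

from pyramid_celery import celery_app as app

@app.task
def show_db_url():
    # The Pyramid registry is stored on the Celery app's configuration;
    # registry.settings is the usual Pyramid settings dictionary.
    registry = app.conf['PYRAMID_REGISTRY']
    print(registry.settings.get('sqlalchemy.url'))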

Configuration

By default pyramid_celery assumes you want to configure Celery via .ini settings. You can do this by calling config.configure_celery('development.ini'), but if you are already in the main function of your application and want to reuse the ini file that configured the app, you can do the following:

config.configure_celery(global_config['__file__'])
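
Put together, a minimal sketch of a Pyramid main function that wires this up:

from pyramid.config import Configurator

def main(global_config, **settings):
    config = Configurator(settings=settings)
    config.include('pyramid_celery')
    # Reuse the same .ini file that is configuring the app itself.
    config.configure_celery(global_config['__file__'])
    return config.make_wsgi_app()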

If you want to use a standard celeryconfig Python file instead, you can set use_celeryconfig = True like this:

[celery]
use_celeryconfig = True

You can find more information about celeryconfig.py here:

http://celery.readthedocs.io/en/latest/userguide/configuration.html
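
For reference, a minimal celeryconfig.py sketch, used when use_celeryconfig = True (the setting names are standard Celery configuration keys; the values are illustrative):

# celeryconfig.py
broker_url = 'redis://localhost:6379/0'
imports = ('app1.tasks', 'app2.tasks')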

An example ini configuration looks like this:

[celery]
broker_url = redis://localhost:1337/0
imports = app1.tasks
          app2.tasks

[celery:broker_transport_options]
visibility_timeout = 18000
max_retries = 5

[celerybeat:task1]
task = app1.tasks.Task1
type = crontab
schedule = {"minute": 0}

You'll notice that configuration options that are dictionaries or have multiple values are split into their own sections.
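
For instance, the [celery:broker_transport_options] section above ends up on the Celery app roughly as the following dictionary-valued setting (a sketch of the result, not the loader's exact code):

broker_transport_options = {'visibility_timeout': 18000, 'max_retries': 5}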

Scheduled/Periodic Tasks

To use celerybeat (periodic tasks) you need to declare one celerybeat config section per task. The options are:

  • task - The Python task to be executed.
  • type - The type of schedule, one of crontab, timedelta, or integer.
  • schedule - The actual schedule for your chosen type.
  • args - Additional positional arguments.
  • kwargs - Additional keyword arguments.

Example configuration for this:

[celerybeat:task1]
task = app1.tasks.Task1
type = crontab
schedule = {"minute": 0}

[celerybeat:task2]
task = app1.tasks.Task2
type = timedelta
schedule = {"seconds": 30}
args = [16, 16]

[celerybeat:task3]
task = app2.tasks.Task1
type = crontab
schedule = {"hour": 0, "minute": 0}
kwargs = {"boom": "shaka"}

[celerybeat:task4]
task = myapp.tasks.Task4
type = integer
schedule = 30
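
For reference, the integer type is interpreted as a number of seconds, so task4 runs every 30 seconds; the crontab schedules follow Celery's crontab semantics, so {"minute": 0} fires at the top of every hour and {"hour": 0, "minute": 0} fires once a day at midnight.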

A gotcha to watch out for: the date/time in scheduled tasks is UTC by default. If you want to schedule for an exact date/time in your local timezone you need to set the timezone option. Documentation for that can be found here:

http://celery.readthedocs.org/en/latest/userguide/periodic-tasks.html#time-zones
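
For example, assuming the ini loader passes Celery's timezone setting straight through, a sketch:

[celery]
timezone = US/Eastern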

If you need to find out what timezones are available you can do the following:

from pprint import pprint
from pytz import all_timezones
pprint(all_timezones)

Worker Execution

The celerybeat process reads your configuration and schedules tasks on the queue to be executed at the defined times. This means if you are using celerybeat you will end up running two processes:

$ celery -A pyramid_celery.celery_app worker --ini development.ini
$ celery -A pyramid_celery.celery_app beat --ini development.ini

The first command is the standard worker command that will read messages off of the queue and run the task. The second command will read the celerybeat configuration and periodically schedule tasks on the queue.

Routing

If you would like to route a task to a specific queue you can define a route per task by declaring its queue and/or routing_key in a celeryroute section.

An example configuration for this:

[celeryroute:otherapp.tasks.Task3]
queue = slow_tasks
routing_key = turtle

[celeryroute:myapp.tasks.Task1]
queue = fast_tasks
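
A worker can then be dedicated to one of these queues with Celery's standard -Q option (a sketch using the queue name from the example above):

celery worker -A pyramid_celery.celery_app --ini development.ini -Q slow_tasks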

Running the worker

To run the worker we just use the standard celery command with an additional argument:

celery worker -A pyramid_celery.celery_app --ini development.ini

If you've defined variables in your .ini like %(database_username)s you can use the --ini-var argument, which is a comma-separated list of key=value pairs:

celery worker -A pyramid_celery.celery_app --ini development.ini --ini-var=database_username=sontek,database_password=OhYeah!

The values passed via --ini-var cannot contain spaces; spaces break Celery's argument parsing.
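
For reference, an .ini value that consumes these variables might look like this (the connection URL itself is illustrative):

sqlalchemy.url = postgresql://%(database_username)s:%(database_password)s@localhost/mydb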

The reason this is a comma-separated list rather than passing --ini-var multiple times is a bug in Celery itself. When it is fixed we will rework the API. The ticket is here:

https://github.com/celery/celery/pull/2435

If you use the celerybeat scheduler you need to run with the --beat flag to run beat and the worker at the same time:

celery worker --beat -A pyramid_celery.celery_app --ini development.ini

Or you can launch it separately like this:

celery beat -A pyramid_celery.celery_app --ini development.ini

Logging

If you use the .ini configuration (i.e. you don't use celeryconfig.py) then the logging configuration will be loaded from the .ini and will not use the default Celery loggers.

You most likely want to add a logging section to your ini for celery as well:

[logger_celery]
level = INFO
handlers =
qualname = celery

and then update your [loggers] section to include it.
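
For instance, a sketch of the [loggers] section (keep whatever logger keys your .ini already declares and append celery):

[loggers]
keys = root, celery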

If you want to use the default Celery loggers then you can set CELERYD_HIJACK_ROOT_LOGGER=True in the [celery] section of your .ini.

Celery worker processes do not propagate exceptions raised inside tasks, but swallow them silently by default. This is related to the behavior of reading asynchronous task results back. To see whether your tasks fail you might need to configure the celery.worker.job logger to propagate exceptions:

# Make sure Celery worker doesn't silently swallow exceptions
# See http://stackoverflow.com/a/20719461/315168
# https://github.com/celery/celery/issues/2437
[logger_celery_worker_job]
level = ERROR
handlers =
qualname = celery.worker.job
propagate = 1


Demo

To see it all in action check out examples/long_running_with_tm, run redis-server and then do:

$ python setup.py develop
$ populate_long_running_with_tm development.ini
$ pserve ./development.ini
$ celery worker -A pyramid_celery.celery_app --ini development.ini

Comments
  • Can't define a result backend

    Maybe this is just a PEBKAC, however I have found that pyramid_celery doesn't allow defining a result backend.

    I'm not sure what makes this happen, as other configuration options are handled correctly except the backend ones. I use the exact same options as in celeryconfig.py (minus quotes) and they just don't reach celery. If I call get() on a task, I get an error that no backend is defined.

    I had to uninstall pyramid_celery completely and remove all references from the code to make it work again.

    Am I doing something wrong? It seems odd that I'm the first with that problem…unfortunately I can't make much sense out of the Pyramid-specific code so I can't even guess what's going on.

    opened by hynek 9
  • Planning a new release

    The current release on PyPI is not compatible with current versions of Celery, but the code in Git works. Would it be possible to tag a new version and push it to PyPI?

    opened by deuxpi 5
  • pyramid_celery is not compatible with pyramid 2

    In pyramid 2 there is no pyramid.compat module anymore. The ConfigParser should be loaded directly from the python3 configparser module.

    from pyramid.compat import configparser
    

    Should be

    import configparser
    
    opened by tonthon 3
  • Make other Pyramid bootstrap values available (in addition to registry)

    These additional values (app, root, closer and request) can be quite useful. Since they are returned anyway by pyramid.paster.bootstrap() it's a shame to waste them.

    They are available under:

    celery_app.conf['PYRAMID_APP']
    celery_app.conf['PYRAMID_ROOT']
    celery_app.conf['PYRAMID_CLOSER']
    celery_app.conf['PYRAMID_REQUEST']
    

    Probably some more work needs to be done but this is the basic idea. It seems to work alright. What do you think?

    opened by omarkohl 3
  • Added tuple-list config parsing for Celery ADMINS config, with test.

    This adds list-of-tuple parsing to the INILoader so that configuration settings like ADMINS can be correctly read and used by pyramid_celery.

    This PR solves #55 by adding the code suggested there and includes an additional test for it.

    opened by edelooff 3
  • Can't set CELERY_ACCEPT_CONTENT parameter through the ini file (will be blocking with celery 3.2)

    In pyramid_celery 2.0.0-rc3, while configuring through the main ini file, it's not possible to configure CELERY_ACCEPT_CONTENT.

    The ini file content

    CELERY_ACCEPT_CONTENT =
         json
    

    The traceback

    Traceback (most recent call last):
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/billiard/process.py", line 292, in _bootstrap
        self.run()
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/billiard/pool.py", line 286, in run
        self.after_fork()
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/billiard/pool.py", line 389, in after_fork
        self.initializer(*self.initargs)
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/celery/concurrency/prefork.py", line 81, in process_initializer
        app=app)
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/celery/app/trace.py", line 161, in build_tracer
        backend = task.backend
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/celery/five.py", line 284, in __get__
        return self.__get.__get__(obj, type)()
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/celery/task/base.py", line 75, in backend
        return cls.app.backend
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/kombu/utils/__init__.py", line 322, in __get__
        value = obj.__dict__[self.__name__] = self.__get(obj)
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/celery/app/base.py", line 625, in backend
        return self._get_backend()
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/celery/app/base.py", line 444, in _get_backend
        return backend(app=self, url=url)
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/celery/backends/base.py", line 107, in __init__
        conf.CELERY_ACCEPT_CONTENT if accept is None else accept,
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/kombu/serialization.py", line 454, in prepare_accept_content
        return set(n if '/' in n else name_to_type[n] for n in l)
      File "/home/gas/.virtualenvs/autonomie/lib/python2.7/site-packages/kombu/serialization.py", line 454, in <genexpr>
        return set(n if '/' in n else name_to_type[n] for n in l)
    KeyError: '\n'
    

    Except for CELERY_IMPORTS (which gets specific treatment in the pyramid_celery/__init__.py file), we can't customize parameters expecting a list as a value.

    opened by tonthon 3
  • pceleryctl has an import error

    with celery==3.0.19

    vagrant@orion:~$ pceleryctl 
    Traceback (most recent call last):
      File "/usr/local/bin/pceleryctl", line 9, in <module>
        load_entry_point('pyramid-celery==1.3', 'console_scripts', 'pceleryctl')()
      File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line 337, in load_entry_point
        return get_distribution(dist).load_entry_point(group, name)
      File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line 2279, in load_entry_point
        return ep.load()
      File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line 1989, in load
        entry = __import__(self.module_name, globals(),globals(), ['__name__'])
      File "/usr/local/lib/python2.7/dist-packages/pyramid_celery/commands/celeryctl.py", line 3, in <module>
        from celery.bin.celeryctl import help as BaseHelp
    ImportError: cannot import name help
    
    opened by BDuelz 3
  • Passing variables to .ini config file

    My config file makes use of variables like so:

    sqlalchemy.url = postgresql+psycopg2://%(db_user)s:%(db_pass)s@localhost/%(db_name)s
    

    And I pass them along like so:

    pserve production.ini db_user=user db_pass=pass db_name=name
    

    How can I pass these variables along using pceleryd?

    opened by BDuelz 3
  • celerybeat configuration impossible using .ini

    If pyramid_celery is included, celeryconfig.py is ignored. That's bad, because in order to configure celerybeat, one has to import names like timedelta or crontab, otherwise the eval() used while parsing fails.

    I haven’t found a way to achieve that so I claim that pcelerybeat is mostly useless – or did I miss something?

    I don’t really have an off-hand solution to this problem – maybe add a configuration option with dependencies that are needed in the eval() context?

    opened by hynek 3
  • pyramid_celery not installable from pypi

    It looks like README.md is not included in the sdist, probably due to the lack of an appropriate MANIFEST.in:

    Getting distribution for 'pyramid-celery'.
    error: /var/folders/+P/+PcLj-6aHLaQAin55j1OYE+++TI/-Tmp-/easy_install-2iPIa1/pyramid_celery-0.1/README.md: No such file or directory
    
    opened by wichert 3
  • pyramid_celery fails silently when the inifile is not well configured

    In the pyramid_celery/__init__.py file, inside the on_preload_parsed function, the pyramid app is bootstrapped using

    env = bootstrap(ini_location, options=options)
    

    or

    env = bootstrap(ini_location)
    

    When that call fails, the celery_app starts with only default configuration, without having been customized.

    For example running the long_running_task_with_tm example results by default in the following error.

    [2017-09-28 17:36:35,569: WARNING/MainProcess] consumer: Cannot connect to %s: %s.
    %s
    [2017-09-28 17:36:35,569: WARNING/MainProcess] amqp://guest:**@127.0.0.1:5672//
    [2017-09-28 17:36:35,569: WARNING/MainProcess] [Errno 111] Connection refused
    [2017-09-28 17:36:35,570: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
    Trying again in 4.00 seconds...
    

    Adding a traceback print shows:

    InterpolationMissingOptionError: Error in file /home/mysuser/pool/pyramid_celery/examples/long_running_with_tm/development.ini: Bad value substitution:
    	section: [app:main]
    	option : sqlalchemy.url
    	key    : database
    	rawval : postgresql://localhost/%(database)s
    
    opened by tonthon 2
  • TypeError: flower() got an unexpected keyword argument 'ini_var'

    With pyramid-celery==4.0.0, celery==5.2.3 and flower==1.0.0 I'm getting the error after invoking flower via

    celery -A pyramid_celery.celery_app flower
    

    It seems like pyramid-celery is trying to pass an ini_var parameter which the newer version of flower no longer accepts.

    Stack trace is

    flower_1      | Traceback (most recent call last):
    flower_1      |   File "/usr/local/bin/celery", line 8, in <module>
    flower_1      |     sys.exit(main())
    flower_1      |   File "/usr/local/lib/python3.8/dist-packages/celery/__main__.py", line 15, in main
    flower_1      |     sys.exit(_main())
    flower_1      |   File "/usr/local/lib/python3.8/dist-packages/celery/bin/celery.py", line 213, in main
    flower_1      |     return celery(auto_envvar_prefix="CELERY")
    flower_1      |   File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 1128, in __call__
    flower_1      |     return self.main(*args, **kwargs)
    flower_1      |   File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 1053, in main
    flower_1      |     rv = self.invoke(ctx)
    flower_1      |   File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 1659, in invoke
    flower_1      |     return _process_result(sub_ctx.command.invoke(sub_ctx))
    flower_1      |   File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 1395, in invoke
    flower_1      |     return ctx.invoke(self.callback, **ctx.params)
    flower_1      |   File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 754, in invoke
    flower_1      |     return __callback(*args, **kwargs)
    flower_1      |   File "/usr/local/lib/python3.8/dist-packages/click/decorators.py", line 26, in new_func
    flower_1      |     return f(get_current_context(), *args, **kwargs)
    flower_1      | TypeError: flower() got an unexpected keyword argument 'ini_var'
    
    opened by dpdoughe 1
  • augur-168 Make sure startup errors are reported

    Fixes # Refactors config value parsing: extracted new methods for each settings type and wrapped them so that when fail_silently passed to read_configuration is false, an exception will be raised. Otherwise exceptions will be logged and the setting's default value will be used.

    opened by laptus 0
  • Current implementation does not support SQS configuration broker transport options

    Currently, configuration for broker_transport_options is limited by the following map:

    BROKER_TRANSPORT_OPTIONS_MAP = {
        'visibility_timeout': int,
        'max_retries': int,
    }
    

    However, for SQS, there are other required options such as region. Because these are not mapped, they are ignored and we get an error further down the line because this information is missing.

    It would not be very sustainable to support every option required by every (supported) provider, but what is the potential risk of removing this mapping in its entirety? Or is there another way to configure this that I missed?

    opened by JurgenFiGO 0
  • Added 'master_name' to 'BROKER_TRANSPORT_OPTIONS_MAP'

    This option is necessary to use Redis replication based on Sentinel:

    [celery]
    BROKER_URL = sentinel://:mypassword@sentinel1:26379/1;sentinel://:mypassword@sentinel2:26379/1
    
    [celery:broker_transport_options]
    master_name = mymaster
    

    More info in https://github.com/celery/kombu/blob/master/kombu/transport/redis.py (see SentinelChannel).

    Thanks

    opened by System25 0
  • Worker Concurrency configuration on .ini does not work.

    Versions used:

      • Python: 3.9.4
      • Celery: 5.0.5
      • Pyramid: 2.0
      • Pyramid-Celery: 4.0.0

    Before opening this issue I dug a lot, and I think I kind of understand why and how it happens, but I am not sure how to fix it, or even whether it is possible to fix without some adjustments in the celery project itself.

    I will put my findings here, and hopefully we may find a way to address it.

    How to reproduce:

    Files

    The files can also be found here: https://github.com/debonzi/pyramid-celery-issue-sample

    setup.py

    from setuptools import setup, find_packages
    
    requires = [
        "celery==5.0.5",
        "pyramid==2.0",
        "pyramid-celery==4.0.0",
    ]
    setup(
        name="demo",
        version="0.0.1",
        install_requires=requires,
        packages=find_packages(exclude=['tests']),
        entry_points={
            "paste.app_factory": [
                "main = demo:main",
            ],
        },
    )
    
    

    demo.py

    from pyramid.config import Configurator
    
    
    def main(global_config, **settings):
        """This function returns a Pyramid WSGI application."""
        with Configurator(settings=settings) as config:
            config.include("pyramid_celery")
            config.configure_celery(global_config["__file__"])
    
        return config.make_wsgi_app()
    
    

    config.ini

    [app:main]
    use = egg:demo
    
    [server:main]
    use = egg:waitress#main
    listen = localhost:6543
    
    [celery]
    broker_url = redis://localhost:6379/1
    worker_concurrency = 4
    

    Install and Run

    $ pip install -e .
    $ celery -A pyramid_celery.celery_app worker --ini config.ini
    

    Output:

     -------------- celery@Doomhammer v5.0.5 (singularity)
    --- ***** -----
    -- ******* ---- Linux-5.4.72-microsoft-standard-WSL2-x86_64-with-glibc2.31 2021-05-02 16:15:11
    - *** --- * ---
    - ** ---------- [config]
    - ** ---------- .> app:         __main__:0x7f3e25b9e670
    - ** ---------- .> transport:   redis://localhost:6379/1
    - ** ---------- .> results:     disabled://
    - *** --- * --- .> concurrency: 8 (prefork)
    -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
    --- ***** -----
     -------------- [queues]
                    .> celery           exchange=celery(direct) key=celery
    

    Concurrency is 8 (the number of CPU cores on my machine) and not 4 as expected.

    As far as I could understand, celery 5 uses click, and pyramid-celery uses the user_preload_options signal to configure celery using the config_from_object method.

    At this point it seems that celery has already loaded the click parameters into internal variables, and since we didn't use --concurrency it behaves as if we had used --concurrency=0 (which later translates to the number of CPU cores). Since command line parameters take precedence over config file parameters (config_from_object), celery ignores the value from the .ini file and uses the default value.

    If you have an idea on how to fix it (or work around it), or need any more information that I might have, I will be glad to provide it.

    Cheers

    opened by debonzi 0
Owner

John Anderson