task management & automation tool

Overview

README


doit - automation tool

doit comes from the idea of bringing the power of build-tools to execute any kind of task.

Sample Code

Define functions that return a Python dict with the task's metadata.

Snippet from tutorial:

def task_imports():
    """find imports from a python module"""
    for name, module in PKG_MODULES.by_name.items():
        yield {
            'name': name,
            'file_dep': [module.path],
            'actions': [(get_imports, (PKG_MODULES, module.path))],
        }

def task_dot():
    """generate a graphviz's dot graph from module imports"""
    return {
        'targets': ['requests.dot'],
        'actions': [module_to_dot],
        'getargs': {'imports': ('imports', 'modules')},
        'clean': True,
    }

def task_draw():
    """generate image from a dot file"""
    return {
        'file_dep': ['requests.dot'],
        'targets': ['requests.png'],
        'actions': ['dot -Tpng %(dependencies)s -o %(targets)s'],
        'clean': True,
    }

Run from terminal:

$ doit list
dot       generate a graphviz's dot graph from module imports
draw      generate image from a dot file
imports   find imports from a python module
$ doit
.  imports:requests.models
.  imports:requests.__init__
.  imports:requests.help
(...)
.  dot
.  draw

Project Details

license

The MIT License Copyright (c) 2008-2018 Eduardo Naufel Schettino

see LICENSE file

developers / contributors

see AUTHORS file

install

doit is tested on Python 3.5 to 3.8.

The last version supporting Python 2 is version 0.29.

$ pip install doit

dependencies

  • cloudpickle
  • pyinotify (linux)
  • macfsevents (mac)

Tools required for development:

  • git (VCS)
  • py.test (unit tests)
  • coverage (code coverage)
  • sphinx (doc tool)
  • pyflakes (syntax checker)
  • doit-py (helper to run dev tasks)

development setup

The best way to set up an environment to develop doit itself is to create a virtualenv...

doit$ virtualenv dev
doit$ source dev/bin/activate

install doit as "editable", and add development dependencies from dev_requirements.txt:

(dev) doit$ pip install --editable .
(dev) doit$ pip install --requirement dev_requirements.txt

tests

Use py.test - http://pytest.org

$ py.test

documentation

The doc folder contains reST documentation based on Sphinx.

doc$ make html

These docs are the base for creating the website. The only difference is that the website includes analytics tracking. To create it (after installing doit):

$ doit website

spell checking

All documentation is spell checked using the task spell:

$ doit spell

It is a bit annoying that code snippets and names always fail the spell check; these words must be added to the file doc/dictionary.txt.

The spell checker currently uses hunspell. To install it on Debian-based systems, install the hunspell package: apt-get install hunspell.

profiling

python -m cProfile -o output.pstats `which doit` list

gprof2dot -f pstats output.pstats | dot -Tpng -o output.png

releases

Update version number at:

  • doit/version.py
  • setup.py
  • doc/conf.py

python setup.py sdist
python setup.py bdist_wheel
twine upload dist/doit-X.Y.Z.tar.gz
twine upload dist/doit-X.Y.Z-py3-none-any.whl

contributing

On github create pull requests using a named feature branch.

Comments
  • Allow to customize how file_dep are checked.

    Allow to customize how file_dep are checked.

    Following #22, I tried to do something, but I don't understand clearly how the config / params are handled. Comments welcome ;)

    Add a modified_checkers option which allows choosing how file_dep are checked to determine whether they have been modified. modified_checkers is a dict with a boolean value for each checker:

    DOIT_CONFIG = {
        'modified_checkers' : {
            'timestamp': True,
            'size': True,
            'md5': True
        }
    }
    
    opened by saimn 24
  • clean not able to access doit data?

    clean not able to access doit data?

    Actions are able to access doit's database, e.g. the return values of other tasks via getargs. Since such configuration affects the action and what is done and produced, I frequently see the need to also provide this to clean. But it does not seem to be possible. Is it?

    Thx!
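For context, a getargs declaration looks like the following sketch (task and key names here are hypothetical, not from the issue):

```python
def write_report(data):
    # hypothetical action: `data` would be injected by doit via getargs
    return bool(data)

def task_report():
    return {
        'actions': [write_report],
        # pull the value saved under key 'result' by task 'compute'
        'getargs': {'data': ('compute', 'result')},
    }
```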

    enhancement docs 
    opened by moltob 23
  • Executing tasks in parallel fails on Windows

    Executing tasks in parallel fails on Windows

    First of all, thank you for this tool, which seems to be very useful for automating scientific workflows. I am using doit for automating a training-evaluation workflow, which contains many embarrassingly parallel tasks, so the -n <NUM_JOB> option is exactly what I want. Unfortunately there are some pickling issues when doing so on Windows.

    My software stack:

    configparser              3.3.0.post2
    doit                      0.28.0
    pip                       6.1.1
    python                    2.7.9
    setuptools                15.1
    six                       1.9.0
    

    dodo.py (instead of the echo commands, some python scripts are started in my application, but the parallel execution behavior is the same):

    # -*- coding: utf-8 -*-
    import os
    import os.path as osp
    
    from doit import tools
    
    FEATURES = ["lbp_small", "lbp_medium", "lbp_72angles", "hog_normalised",
                "hog_default", "daisy_default", "hog_single_cell"]
    OUT = "out"
    
    paths = {}
    paths["OUT_FEATURES"] = osp.join(OUT, "features")
    paths["OUT_EVALUATION"] = osp.join(OUT, "evaluation")
    paths["OUT_FIGURES"] = osp.join(OUT, "figures")
    
    
    def task_feature_extraction():       
        for feat in FEATURES:
            feat_spec = "feat_{}.json".format(feat)
            feat_file = osp.join(paths["OUT_FEATURES"], "feat_{}.hdf5".format(feat))
            yield {"name": feat_file,
                   "actions": ["echo extract %s > %s" % (feat_spec, feat_file)],
                   "targets": [feat_file],
                   "clean": True,
                   # force doit to always mark the task
                   # as up-to-date (unless target removed)
                   'uptodate': [True]}
    

    The command doit -n 4 causes this traceback:

    Traceback (most recent call last):
      File "C:\Anaconda\envs\surface-classification\lib\site-packages\doit\doit_cmd.py", line 165, in run
        return command.parse_execute(args)
      File "C:\Anaconda\envs\surface-classification\lib\site-packages\doit\cmd_base.py", line 122, in parse_execute
        return self.execute(params, args)
      File "C:\Anaconda\envs\surface-classification\lib\site-packages\doit\cmd_base.py", line 405, in execute
        return self._execute(**exec_params)
      File "C:\Anaconda\envs\surface-classification\lib\site-packages\doit\cmd_run.py", line 239, in _execute
        return runner.run_all(self.control.task_dispatcher())
      File "C:\Anaconda\envs\surface-classification\lib\site-packages\doit\runner.py", line 238, in run_all
        self.run_tasks(task_dispatcher)
      File "C:\Anaconda\envs\surface-classification\lib\site-packages\doit\runner.py", line 417, in run_tasks
        proc_list = self._run_start_processes(job_q, result_q)
      File "C:\Anaconda\envs\surface-classification\lib\site-packages\doit\runner.py", line 390, in _run_start_processes
        process.start()
      File "C:\Anaconda\envs\surface-classification\lib\multiprocessing\process.py", line 130, in start
        self._popen = Popen(self)
      File "C:\Anaconda\envs\surface-classification\lib\multiprocessing\forking.py", line 277, in __init__
        dump(process_obj, to_child, HIGHEST_PROTOCOL)
      File "C:\Anaconda\envs\surface-classification\lib\multiprocessing\forking.py", line 199, in dump
        ForkingPickler(file, protocol).dump(obj)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 224, in dump
        self.save(obj)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 331, in save
        self.save_reduce(obj=obj, *rv)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 419, in save_reduce
        save(state)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 649, in save_dict
        self._batch_setitems(obj.iteritems())
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 681, in _batch_setitems
        save(v)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Anaconda\envs\surface-classification\lib\multiprocessing\forking.py", line 67, in dispatcher
        self.save_reduce(obj=obj, *rv)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 401, in save_reduce
        save(args)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 548, in save_tuple
        save(element)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 331, in save
        self.save_reduce(obj=obj, *rv)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 419, in save_reduce
        save(state)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 649, in save_dict
        self._batch_setitems(obj.iteritems())
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 681, in _batch_setitems
        save(v)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 331, in save
        self.save_reduce(obj=obj, *rv)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 419, in save_reduce
        save(state)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 649, in save_dict
        self._batch_setitems(obj.iteritems())
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 681, in _batch_setitems
        save(v)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Anaconda\envs\surface-classification\lib\multiprocessing\forking.py", line 67, in dispatcher
        self.save_reduce(obj=obj, *rv)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 401, in save_reduce
        save(args)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 548, in save_tuple
        save(element)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 331, in save
        self.save_reduce(obj=obj, *rv)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 419, in save_reduce
        save(state)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 649, in save_dict
        self._batch_setitems(obj.iteritems())
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 681, in _batch_setitems
        save(v)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 331, in save
        self.save_reduce(obj=obj, *rv)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 419, in save_reduce
        save(state)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 649, in save_dict
        self._batch_setitems(obj.iteritems())
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 681, in _batch_setitems
        save(v)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 331, in save
        self.save_reduce(obj=obj, *rv)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 396, in save_reduce
        save(cls)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 286, in save
        f(self, obj) # Call unbound method with explicit self
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 748, in save_global
        (obj, module, name))
    PicklingError: Can't pickle <type 'DB'>: it's not found as __builtin__.DB
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "C:\Anaconda\envs\surface-classification\lib\multiprocessing\forking.py", line 381, in main
        self = load(from_parent)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 1378, in load
        return Unpickler(file).load()
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 858, in load
        dispatch[key](self)
      File "C:\Anaconda\envs\surface-classification\lib\pickle.py", line 880, in load_eof
        raise EOFError
    EOFError
    Exception AttributeError: "'_DBWithCursor' object has no attribute 'dbc'" in  ignored
    

    If you need further details, please contact me.

    Windows X-platform WIP 
    opened by joschkazj 23
  • delay creation of tasks until after another task is executed

    delay creation of tasks until after another task is executed

    This is required when, at the time dodo.py is loaded, you don't have enough information to create some of your tasks' metadata. See: https://github.com/getnikola/nikola/issues/1562

    WIP branch: https://github.com/pydoit/doit/tree/delayed-task-creation

    Sample testing code: https://gist.github.com/schettino72/9868c27526a6c5ea554c

    TODO:

    • [x] support delayed task creation for run command
    • [x] make sure commands list, clean, etc create all tasks
    • [x] fix multi-processing
    • [x] make sure implicit dependencies (file_dep -> another_task.target) are respected
    • [x] tests
    opened by schettino72 23
  • plugin system

    plugin system

    Add a plugin system, it should support:

    • [x] commands
    • [x] reporters
    • [x] db-backend
    • [ ] runner
    • [ ] task configuration?

    These plugins could be distributed as separate Python packages on PyPI, or just included in the root folder of a project together with dodo.py.

    enhancement 
    opened by schettino72 22
  • Feature dry run

    Feature dry run

    Hello,


    This patch adds an option --dry-run to the run command.

    In this mode, changes to the database are prevented by a thin layer around the database object, and task action and teardown execution are skipped.

    I did not test it thoroughly; it's my first contribution :-). I did not add tests for the new code. Do you think this can be pulled? Or what do you require for it to be pulled?

    Thanks for your efforts, Oliver

    stage-development 
    opened by OliverTED 19
  • Add support for Path from pathlib.

    Add support for Path from pathlib.

    Adding support for Path in file_dep is quite easy. But clearly this is only part of the problem: if someone (like me) uses Path, then they want to use it all over the place. It would be quite strange to pass Path objects to file_dep as-is but convert them to str before passing them in targets, for example. I think at least targets should support them too, and perhaps it would not be a bad idea to support them in CmdAction. @schettino72 what do you think about it?
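A minimal sketch of what the requested support could look like (hypothetical task; the task dict itself is plain Python, so it runs without doit installed):

```python
from pathlib import Path

def task_render():
    # hypothetical task using pathlib.Path directly in file_dep and targets
    src = Path("doc") / "index.rst"
    out = Path("build") / "index.html"
    return {
        'file_dep': [src],
        'targets': [out],
        'actions': [f"rst2html {src} {out}"],
    }
```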

    opened by Hinidu 18
  • Crashes on Python 3.3 due to missing pathlib

    Crashes on Python 3.3 due to missing pathlib

    I tried switching Nikola’s doit requirement to doit==0.30.0 for Python 3, but it fails on Python 3.3 because pathlib is needed. This happens because pip does not notice the extra dependency. The correct way to do this is with environment markers. We use them in Nikola for doit (in requirements.txt):

    doit>=0.28.0,<=0.29.0 ; python_version <= '3.3'
    doit>=0.28.0 ; python_version >= '3.4'
    

    That can also be used for platform checks (pyinotify/macfsevents).

    opened by Kwpolska 17
  • use Watchdog in auto command

    use Watchdog in auto command

    Problems

    all

    • [x] detect event file move (also check behavior for create and delete)

    windows

    • [ ] check it really works (command line) on python3
    • [ ] Ctrl+C does not work on Windows (confirmed on python2.7 - py3?)
    • [ ] test_cmd_auto test_non_existent_filedep HANGS (test currently skipped)
    • [ ] test_cmd_auto.TestAuto.test_run_wait (py2) https://ci.appveyor.com/project/schettino72/doit/build/1.0.80/job/qxd2507a06hym2pw#L218
    • [ ] test_cmd_auto (py3) several pickle issues
    • [ ] test_filewatch.TestFileWatcher.test_loop (py3) - file events being counted twice https://ci.appveyor.com/project/schettino72/doit/build/1.0.80/job/2reaoab0826imtix#L330

    mac

    • [ ] check it really works (command line) on python?
    • [ ] test_cmd_auto.TestAuto.test_run_wait (all py versions) - error not terminated
    • [ ] test_filewatch testLoop and testExit HANGS (test currently skipped)

    linux

    • [ ] test_filewatch testExit (py2.7) HANGS (test currently skipped)
    opened by schettino72 17
  • `@task_param()` decorator to pass command line parameters to task generators and `@create_after` for class defined tasks

    `@task_param()` decorator to pass command line parameters to task generators and `@create_after` for class defined tasks

    As described in #311 this decorator can be used to pass command line arguments to the task generator. The same parameter definitions are also passed to the generated tasks.

    This branch also includes pull request #307 which implements @create_after() semantics for tasks defined in a class. As discussed on #307 @schettino72 asked to see code for #311 as well. If this PR is merged then #307 will be closed.

    Quick implementation to see if this is what was intended by the issue and to open discussion.

    Work to be done:

    • [x] actually do the parsing!
    • [x] auto compatibility
    • [x] docs since this is a new user-facing feature
    • [x] tests
      • Ensure that a task parameter can be passed to the task generator.
      • Ensure that a task generator parameter can be set from the command line.
      • Ensure that a task parameter can be passed to the task generator defined as a class method.

    Demo:

    $ cat dodo.py
    from doit import create_after, task_param
    
    
    def task_early():
        return {"actions": ["echo early"], "verbosity": 2}
    
    
    @create_after(executed="early")
    @task_param([{"name": "foo", "default": "bar", "long": "foo"}])
    def task_use_param_create_after(foo):
        print(f'Task parameter foo={foo} available at task definition time.')
    
        def runit():
            print(f"param foo={foo}")
    
        return {"actions": [runit], "verbosity": 2}
    
    
    @task_param([{"name": "foo", "default": "bar", "long": "foo"}])
    def task_use_param(foo):
        print(f'Task parameter foo={foo} available at task definition time.')
    
        def runit():
            print(f"param foo={foo}")
    
        return {"actions": [runit], "verbosity": 2}
    
    
    @task_param([{"name": "howmany", "default": 3, "type": int, "long": "howmany"}])
    def task_subtasks(howmany):
        for i in range(howmany):
            yield {"name": i, "actions": [f"echo I can count to {howmany}: {i}"]}
    
    def do_work(foo):
        print(f'Argument foo={foo}')
    
    @task_param([{"name": "foo", "default": "bar", "long": "foo"}])
    def task_use_in_action(foo):
        print(f'When the task action runs it will print {foo}')
    
        return {
            'actions': [do_work],
            'verbosity': 2
        }
    
    @task_param([{'name': 'num_tasks', 'default': 1, 'type': int, 'long': 'num_tasks'}])
    def task_subtask(num_tasks):
        print(f'Generating {num_tasks} subtasks')
    
        def work(task_num, num_tasks):
            print(f'Task {task_num+1} of {num_tasks}')
    
        for i in range(0, num_tasks):
            yield {
                'name': f'task{i}',
                'actions': [(work, (), {'task_num': i})],
                'verbosity': 2
            }
    
    
    $ cat doit.cfg
    [task:use_param_create_after]
        foo = from_doit_cfg
    
    $ doit
    Task parameter foo=bar available at task definition time.
    When the task action runs it will print bar
    Generating 1 subtasks
    .  early
    early
    Task parameter foo=bar available at task definition time.
    .  use_param_create_after
    param foo=bar
    .  use_param
    param foo=bar
    .  subtasks:0
    .  subtasks:1
    .  subtasks:2
    .  use_in_action
    Argument foo=bar
    .  subtask:task0
    Task 1 of 1
    
    
    $ doit info use_param
    Task parameter foo=bar available at task definition time.
    Task parameter foo=bar available at task definition time.
    When the task action runs it will print bar
    Generating 1 subtasks
    
    use_param
    
    status     : run
     * The task has no dependencies.
    
    params     : 
     - {'name': 'foo', 'default': 'bar', 'long': 'foo'}
    
    verbosity  : 2
    
    
    $ doit info use_param_create_after
    Task parameter foo=bar available at task definition time.
    Task parameter foo=bar available at task definition time.
    When the task action runs it will print bar
    Generating 1 subtasks
    
    use_param_create_after
    
    status     : run
     * The task has no dependencies.
    
    params     : 
     - {'name': 'foo', 'default': 'bar', 'long': 'foo'}
    
    verbosity  : 2
    
    
    opened by rbdixon 16
  • Add TOML config loading, autodetect pyproject.toml

    Add TOML config loading, autodetect pyproject.toml

    • fixes #373

    This adds TOML support, with detection of pyproject.toml.

    While I like the suggestion on the parent issue of not using SHOUTY keys, I've tried to make it look as much like the existing config format as is reasonable, and we can iterate from there, but figured being able to comment on (maybe broken) code is better than more issue comment traffic.

    TODO:

    • [x] ~~there are still some ordering issues... right now, values from doit.cfg would be overwritten by pyproject.toml: need to look into the expected behavior from ConfigParser.~~
      • UPDATE: it is indeed FIFO... the merging is still quite naive, e.g. only of the top-level sections, but matches existing behavior
    • [x] tests
      • will it toml?
      • will it pyproject.toml?
    • [x] docs
    opened by bollwyvl 15
  • -s param doesn't accept multiple arguments since 0.34.0

    -s param doesn't accept multiple arguments since 0.34.0

    After the version bump I found out that the -s param doesn't accept multiple tasks as arguments. I'm not sure whether this was intentional, because it is not included in the changelog, so just to make sure I am creating this bug report.

    My main use case: when I run a group task (a task with no actions), I substitute, in task_loader, all tasks that are direct dependencies of the group task and run them with the -s switch (so it actually runs all direct dependencies instead of doing nothing, since the group task has no actions of its own). Right now this use case is broken, because only the first task is run.

    Please let me know if you need more information or if I just misunderstood the --single param.
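For reference, a group task of the kind described is just a task with task_dep and no actions; a sketch with hypothetical subtask names:

```python
def task_build_all():
    """group task: no actions of its own, it just runs its direct dependencies"""
    return {
        'actions': None,
        'task_dep': ['build_a', 'build_b'],  # hypothetical subtask names
    }
```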

    opened by wassupxP 0
  • tasks are uptodate even though their task_dependency is not

    tasks are uptodate even though their task_dependency is not

    Issue: After the version bump I found out that tasks are up-to-date even though some of their task_deps are not. Only the subtask is rebuilt afterwards, not the whole tree up to the parent:

          task1:up-to-date
          /          \
      task2:run      task3:up-to-date
    

    Not sure whether this was intentional or really a bugfix.

    Workaround: In task_loader, inject True or False into "uptodate" for each task in task_deps, depending on the status of the dependent task.

    # Get current status of dependent task
    result = doit_dependency_manager.get_status(task_object, task_dict)
    # Add result to list as a boolean
    uptodate.append(True if result.status == "up-to-date" else False)
    

    Environment

    1. OS: win
    2. python version: 3.9
    3. doit version: 0.35.0

    Let me know if additional information or anything else is needed.

    opened by wassupxP 0
  • Cleaning all doesn't seem to work

    Cleaning all doesn't seem to work

    Bug description

    When default_tasks is specified in the dodo.py file, doit clean --clean-all doesn't clean all targets.

    from typing import Dict
    
    DOIT_CONFIG = {'default_tasks': ['echo']}
    
    def task_echo() -> Dict:
        return {
            'actions': ['echo xxx']
        }
    
    def task_dir() -> Dict:
        return {
            'actions': ["mkdir xxx"],
            'clean': ["rm -rfv xxx"]
        }
    

    The only way to have the dir task be cleaned is to explicitly specify its name on the command-line, i.e. doit clean dir.

    Environment

    1. OS: Linux Debian Bullseye 11.5
    2. python version: 3.9.2
    3. doit version: 0.36.0

    Proposed fix

    The following change seems to fix the issue for me

    diff --git a/doit/cmd_clean.py b/doit/cmd_clean.py
    index 7cdc90a..899b88d 100644
    --- a/doit/cmd_clean.py
    +++ b/doit/cmd_clean.py
    @@ -99,7 +99,7 @@ class Clean(DoitCmdBase):
             else:
                 # if not cleaning specific task enable clean_dep automatically
                 cleandep = True
    -            if self.sel_tasks is not None:
    +            if self.sel_tasks is not None and not cleanall:
                     clean_list = self._expand(self.sel_tasks)  # default tasks from config
                 else:
                     clean_list = [t.name for t in self.task_list]
    
    opened by tehe 0
  • Allow custom `TaskLoader2.load_tasks` to consume arguments

    Allow custom `TaskLoader2.load_tasks` to consume arguments

    From here. The remaining arguments not parsed by the Command are passed to the TaskLoader. This would be a nice opportunity for the TaskLoader to consume/modify arguments before they are parsed by the main program.

    Proposal

    Allow the result of TaskLoader2.load_tasks to be either List[Task] or Tuple[List[Task], List[str]], where in the tuple case the second element would be the remaining arguments.

    Alternatives

    Right now it's possible to mutate the passed-in argument list to get the desired effect. That is obviously not intended.
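A sketch of the proposed protocol, written as a plain function for clarity (hypothetical; this is not the current doit API, and the "profile=" prefix is an invented loader-specific argument):

```python
def load_tasks(pos_args):
    """Hypothetical shape of the proposal.

    Returning (tasks, remaining) instead of a bare task list would let the
    loader consume its own arguments and hand the rest back for parsing.
    """
    consumed = [a for a in pos_args if a.startswith("profile=")]
    remaining = [a for a in pos_args if not a.startswith("profile=")]
    tasks = []  # a real loader would build Task objects here, using `consumed`
    return tasks, remaining
```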

    need-sample 
    opened by ktbarrett 1
  • normalize_callable breaking action property for positional arguments

    normalize_callable breaking action property for positional arguments

    Defining a CmdAction with a positional argument makes the task untestable. For instance, if we use the following minimal task

    from doit.action import CmdAction
    
    
    def task_pos_args():
    
        def show_params(pos):
            return f"echo {pos}"
    
        return {
            'actions':[CmdAction(show_params)],
            'pos_arg': 'pos',
            'verbosity': 2,
        }
    

    and write a test that calls the action with parametrized arguments

    import pytest
    
    
    @pytest.mark.parametrize("pos", ["arg", ["arg1", "arg2"]])
    def test_pos_args(pos):
        cmd_action = task_pos_args()["actions"][0]
        cmd = cmd_action.action().format(pos=pos)
        assert cmd == f"echo {pos}"
    

    will fail, because calling the action property on a CmdAction with a positional argument throws the error "missing 1 positional argument": https://github.com/pydoit/doit/blob/83309d81a7eb6dda30b9d04a05dd99a2de44192b/doit/action.py#L148-L156 Looking at the above, show_params is called as ref with an empty args field, because normalize_callable does not return the correct signature: https://github.com/pydoit/doit/blob/83309d81a7eb6dda30b9d04a05dd99a2de44192b/doit/action.py#L17-L23

    Environment

    1. OS: Linux
    2. python version: 3.9.15
    3. doit version: 0.36.0

    Edit: added imports to minimal examples

    opened by mkoenig-dev 1
  • support async Python actions

    support async Python actions

    Motivations

    Two motivations for this (sizeable) request:

    • it may provide a simpler route to parallel action support (which is currently unsupported on Windows and has limitations elsewhere)
    • allow for integration against async libraries (e.g. aiodocker)

    Usage

    1. Declaring a Python async function as a task's action would result in that task's action being pushed onto the event loop. Additional plumbing would be added to the loop as well, in order to chain dependent tasks downstream
    2. Activating a global configuration option would cause doit to automatically run all shell command blocks using an async shell-exec adapter, such as Python's async subprocess feature
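Today, a coroutine can only be used by wrapping it in a regular python-action yourself; a sketch of that workaround (task and coroutine names are hypothetical), which also hints at what native support would replace:

```python
import asyncio

async def pull_images():
    # stand-in for real async work (e.g. aiodocker calls)
    await asyncio.sleep(0)
    return True

def task_pull():
    # current workaround: drive the event loop from inside a plain python-action
    return {'actions': [lambda: asyncio.run(pull_images())]}
```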

    The work

    I have not looked at doit code yet, but would consider volunteering for some / all of this effort if more-experienced contributors believe it is worth pursuing.

    I did some quick searches through PRs and existing issues and did not see any discussion of Python's async faculties... Sorry if I missed something or if this has already been asked and answered.

    opened by wahuneke 2