Delegator.py — Subprocesses for Humans 2.0

Overview

Delegator.py is a simple library for dealing with subprocesses, inspired by both envoy and pexpect (in fact, it depends on pexpect!).

This module features two main functions, delegator.run() and delegator.chain(): the first runs a single command, blocking or non-blocking, and the second runs a chain of commands separated by the standard Unix pipe operator: |.

Basic Usage

Basic run functionality:

>>> c = delegator.run('ls')
>>> print(c.out)
README.rst   delegator.py

>>> c = delegator.run('long-running-process', block=False)
>>> c.pid
35199
>>> c.block()
>>> c.return_code
0

Commands can be passed in as lists as well (e.g. ['ls', '-lrt']), for parameterization.
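
For example, a parameterized run using the list form might look like this. This is only a small sketch of the form described above; note that some of the issue reports below suggest the list form does not always behave as expected, so verify it against your installed version.

>>> c = delegator.run(['ls', '-lrt'])
>>> print(c.out)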

Basic chain functionality:

# Can also be called as delegator.chain([['fortune'], ['cowsay']]),
# or as delegator.run('fortune').pipe('cowsay').

>>> c = delegator.chain('fortune | cowsay')
>>> print(c.out)
  _______________________________________
 / Our swords shall play the orators for \
 | us.                                   |
 |                                       |
 \ -- Christopher Marlowe                /
  ---------------------------------------
         \   ^__^
          \  (oo)\_______
             (__)\       )\/\
                 ||----w |
                 ||     ||

Expect functionality is built-in too, on non-blocking commands:

>>> c.expect('Password:')
>>> c.send('PASSWORD')
>>> c.block()
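
For reference, a complete non-blocking session might look like the following hypothetical sketch; 'my-interactive-tool' stands in for any program that writes its prompt to standard output and reads the reply from standard input.

>>> c = delegator.run('my-interactive-tool', block=False)
>>> c.expect('Password:')
>>> c.send('PASSWORD')
>>> c.block()
>>> c.return_code
0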

Other functions:

>>> c.kill()
>>> c.send('SIGTERM', signal=True)

# Only available when block=True; otherwise, use c.out.
>>> c.err
''

# Direct access to pipes.
>>> c.std_err
<open file '<fdopen>', mode 'rU' at 0x10a5351e0>

# Adjust environment variables for the command (existing values will be overwritten).
>>> c = delegator.chain('env | grep NEWENV', env={'NEWENV': 'FOO_BAR'})
>>> c.out
NEWENV=FOO_BAR
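
The same env keyword should also work with delegator.run(), assuming run() accepts the same env parameter as chain() (NEWENV is just an example name):

>>> c = delegator.run('env | grep NEWENV', env={'NEWENV': 'FOO_BAR'})
>>> c.out
NEWENV=FOO_BAR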

Installation

$ pip install delegator.py

🍰

Comments
  • how to pipe string to rest of commands

    Hi! I would like to simplify my usage of subprocess, and so I found your work. It seems to do what it advertises (just trying to say that it is an amazing project), but the documentation is a bit lacking ... I cannot find a way to pipe a string into a custom chain (something like echo stdout_from_other_process | chain_of_shell_commands_that_will_run_in_my_pseudo_terminal). Could someone give me a hint about it? :) (One possible approach is sketched after this list.)
    Thank you!!

    question 
    opened by adriansev 6
  • pexpect.exceptions.TIMEOUT

    I'm executing a process that runs for a few minutes before printing any output. After a few seconds I got:

      File "/usr/lib/python2.7/site-packages/delegator.py", line 240, in chain
        data = c.out
      File "/usr/lib/python2.7/site-packages/delegator.py", line 91, in out
        self.__out = self._pexpect_out
      File "/usr/lib/python2.7/site-packages/delegator.py", line 79, in _pexpect_out
        result += self.subprocess.read()
      File "/usr/lib/python2.7/site-packages/pexpect/spawnbase.py", line 413, in read
        self.expect(self.delimiter)
      File "/usr/lib/python2.7/site-packages/pexpect/spawnbase.py", line 321, in expect
        timeout, searchwindowsize, async)
      File "/usr/lib/python2.7/site-packages/pexpect/spawnbase.py", line 345, in expect_list
        return exp.expect_loop(timeout)
      File "/usr/lib/python2.7/site-packages/pexpect/expect.py", line 107, in expect_loop
        return self.timeout(e)
      File "/usr/lib/python2.7/site-packages/pexpect/expect.py", line 70, in timeout
        raise TIMEOUT(msg)
    pexpect.exceptions.TIMEOUT: <pexpect.popen_spawn.PopenSpawn object at 0x1bdb510>
    searcher: searcher_re:
        0: EOF
    <pexpect.popen_spawn.PopenSpawn object at 0x1bdb510>
    searcher: searcher_re:
        0: EOF
    

    There doesn't seem to be a way to control the timeout, is there?

    opened by gjedeer 6
  • module 'delegator' has no attribute 'chain'

    import delegator

    c = delegator.chain('ls | grep a')
    c.out


    AttributeError                            Traceback (most recent call last)
          1 import delegator
    ----> 2 c = delegator.chain('ls | grep a')

    AttributeError: module 'delegator' has no attribute 'chain'

    opened by babvin 5
  • Conflict between delegator and delegator.py

    If I have both delegator and delegator.py installed, Python will pick the first one and I'll get an error message saying AttributeError: module 'delegator' has no attribute 'run'. Is there a way to ensure the right module is being used when both of these packages are installed?

    opened by luisdavim 5
  • Monitoring progress of long-running commands

    The README seems to imply that it is possible to use c.std_out to monitor the progress of long-running commands, but there is no documentation or other guidance on how one might actually do that. Adding even the briefest of examples to the README would be extraordinarily helpful.

    Does anyone have any suggestions on how to monitor the progress of Delegator-spawned long-running commands? (One rough idea is sketched after this list.)

    opened by justinmayer 5
  • reading std_out of non blocking commands

    When I try:

    c = delegator.run(['sleep', ' 10'], block=False)
    c.out
    

    The call blocks for 10 seconds and then produces "" as expected. However, if I try:

    c = delegator.run(['sleep', ' 10'], block=False)
    c.std_out.read()
    

    I get UnsupportedOperation: not readable

    Is this expected? I had expected to be able to iterate over c.std_out to monitor progress.

    opened by puterleat 5
  • Add 'binary' run flag to allow running commands with binary output

    Use case:

    >>> import delegator
    >>> len(delegator.run("head -c16KiB /dev/urandom").out)
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/podvody/Repos/delegator.py/delegator.py", line 77, in out
        self.__out = self.std_out.read()
      File "/usr/lib64/python3.4/codecs.py", line 319, in decode
        (result, consumed) = self._buffer_decode(data, self.errors, final)
    UnicodeDecodeError: 'utf-8' codec can't decode byte 0xdf in position 0: invalid continuation byte
    >>> len(delegator.run("head -c16KiB /dev/urandom", binary_stdio=True).out)                                                                                                                     
    16384
    
    opened by shaded-enmity 5
  • expect functionality only works when subprocess is flushing

    This annoys the hell out of me, because Python doesn't flush stdout when printing by default.

    Currently, interactive Python scripts must be run with -u to work correctly.

    opened by kennethreitz 5
  • Version 0.0.6 not in master branch & setup.py still says version='0.0.5'

    On the "releases" page there is version 0.0.6, but it's missing from the master branch; it only exists as a tag. And the setup.py of the 0.0.6 tarball still says version='0.0.5'.

    opened by hexchain 4
  • Only first argument is used

    I just did pip3 install delegator.py and I cannot get multiple arguments to work. As an example:

    import delegator
    
    # as a single argument works fine
    delegator.run(['echo hello']).out
    # outputs: 'hello\n'
    
    # as multiple arguments seems to only use the first one
    delegator.run(['echo', 'hello']).out
    # outputs: '\n', expected output: 'hello\n'
    
    opened by jonmcclung 3
  • run: add command_cls arg to allow for custom Command class

    I want to use this library, but would like to make some modifications to Command that may not be for everyone. This seems like a fair way to allow for that without having to monkey-patch.

    need more information 
    opened by azban 3
  • set timeout for block type commands

    set timeout for block-type commands Signed-off-by: Harshad Reddy Nalla [email protected]

    This will allow a timeout to function on the block-type command runs.

    opened by harshad16 0
  • /usr/bin/find /non/existent/path | /usr/bin/wc -l => c.out = 2 (should be 0)

    Running /usr/bin/find /non/existent/path | /usr/bin/wc -l on the Linux command line returns 0 (correct), but through delegator c.out is 2 (incorrect).

    As per the above, if I run this on the Linux command line, I get 0 (correct). But when I pass it through delegator, I get 2.

    path = '/non/existent/path'
    the_command = f'/usr/bin/find {path} | /usr/bin/wc -l'
    c = delegator.chain(the_command)
    output = (c.out).rstrip()
    print(output)
    # prints: 2

    And for a path that exists and returns 1900 on the Linux command line, delegator returns 1901.

    opened by ghost 0
  • Documentation incorrectly suggests that you can pass a list of args for something you want to run

    Because of the "shell": True argument below, passing in a list will not work as expected, which I had to find out the hard way:

    @property
    def _default_popen_kwargs(self):
        return {
            "env": os.environ.copy(),
            "stdin": subprocess.PIPE,
            "stdout": subprocess.PIPE,
            "stderr": subprocess.PIPE,
            "shell": True,
            "universal_newlines": True,
            "bufsize": 0,
        }
    
    opened by flyingsymbols 0
  • Memory leaking with delegator.chain()?

    I get a memory leak when using certain commands with the chain function. I boiled it down to this simple test case:

    import delegator
    
    def run_shell_chain(cmd):
    	return delegator.chain(cmd).out.strip("\n")
    
    for i in range(50000):
    	output = run_shell_chain("iostat -m -y 1 1")
    	print(output)
    
    

    If I watch ps, I can see the memory footprint increase indefinitely.

    I cannot reproduce it with a simple piped hello world command. (echo 'hello world' | sed 's/hello/goodbye/')

    System info

    >>> pkg_resources.get_distribution("delegator.py").version
    '0.1.1'
    
    $ python3 -V
    Python 3.6.8
    
     ~$ uname -a
    4.18.0-25-generic #26~18.04.1-Ubuntu SMP Thu Jun 27 07:28:31 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
    
    


    opened by psilocyber 0
  • hang on chain if pipe nothing to awk

    delegator.chain("printf '' | awk '{print $1}'") would hang, while delegator.run("printf '' | awk '{print $1}'") wouldn't. It seems that the command blocks in pexpect:

    ^C---------------------------------------------------------------------------
    KeyboardInterrupt                         Traceback (most recent call last)
    <ipython-input-3-fc2320a6bab7> in <module>
    ----> 1 delegator.chain("printf '' | awk '{print $1}'")
    
    ~/.pyenv/versions/3.7.0/lib/python3.7/site-packages/delegator.py in chain(command, timeout, cwd, env)
        308             c.subprocess.sendeof()
        309
    --> 310         data = c.out
        311
        312     return c
    
    ~/.pyenv/versions/3.7.0/lib/python3.7/site-packages/delegator.py in out(self)
        126             self.__out = self.std_out.read()
        127         else:
    --> 128             self.__out = self._pexpect_out
        129
        130         return self.__out
    
    ~/.pyenv/versions/3.7.0/lib/python3.7/site-packages/delegator.py in _pexpect_out(self)
        114             result += self.subprocess.after
        115
    --> 116         result += self.subprocess.read()
        117         return result
        118
    
    ~/.pyenv/versions/3.7.0/lib/python3.7/site-packages/pexpect/spawnbase.py in read(self, size)
        439         if size < 0:
        440             # delimiter default is EOF
    --> 441             self.expect(self.delimiter)
        442             return self.before
        443
    
    ~/.pyenv/versions/3.7.0/lib/python3.7/site-packages/pexpect/spawnbase.py in expect(self, pattern, timeout, searchwindowsize, async_, **kw)
        339         compiled_pattern_list = self.compile_pattern_list(pattern)
        340         return self.expect_list(compiled_pattern_list,
    --> 341                 timeout, searchwindowsize, async_)
        342
        343     def expect_list(self, pattern_list, timeout=-1, searchwindowsize=-1,
    
    ~/.pyenv/versions/3.7.0/lib/python3.7/site-packages/pexpect/spawnbase.py in expect_list(self, pattern_list, timeout, searchwindowsize, async_, **kw)
        367             return expect_async(exp, timeout)
        368         else:
    --> 369             return exp.expect_loop(timeout)
        370
        371     def expect_exact(self, pattern_list, timeout=-1, searchwindowsize=-1,
    
    ~/.pyenv/versions/3.7.0/lib/python3.7/site-packages/pexpect/expect.py in expect_loop(self, timeout)
        111                 incoming = spawn.read_nonblocking(spawn.maxread, timeout)
        112                 if self.spawn.delayafterread is not None:
    --> 113                     time.sleep(self.spawn.delayafterread)
        114                 if timeout is not None:
        115                     timeout = end_time - time.time()
    
    KeyboardInterrupt:
    

    My delegator version is 0.1.1

    opened by RPing 0
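
Regarding the "how to pipe string to rest of commands" question above, one possible approach is sketched below. It is only a sketch built from the run(), send(), pipe(), and block() calls shown in the README, plus pexpect's sendeof() on the underlying subprocess; 'tr a-z A-Z' merely stands in for the real chain of commands.

import delegator

# 'cat' copies stdin to stdout, standing in for "a string produced in Python".
c = delegator.run('cat', block=False)
c.send('stdout_from_other_process')   # write the string to the child's stdin
c.subprocess.sendeof()                # close stdin so 'cat' can finish (pexpect PopenSpawn API)
c.block()

# Feed the captured output into the rest of the chain.
result = c.pipe('tr a-z A-Z')
print(result.out)                     # STDOUT_FROM_OTHER_PROCESS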
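
On the "Monitoring progress of long-running commands" and "reading std_out of non blocking commands" questions, one rough idea follows. It is only a sketch, not an official recipe: with block=False the child is driven by pexpect, so output can be consumed incrementally through c.subprocess (a pexpect spawn object) rather than c.std_out. 'ping -c 5 127.0.0.1' is just a stand-in for a long-running command, and pexpect's default timeout still applies if the child stays silent for too long.

import delegator
import pexpect

c = delegator.run('ping -c 5 127.0.0.1', block=False)
while True:
    # Wait for either the end of output or the next full line.
    index = c.subprocess.expect([pexpect.EOF, '\r\n', '\n'])
    if index == 0:
        break
    print(c.subprocess.before)   # text emitted before the matched newline
c.block()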