pip-run - dynamic dependency loader for Python

Overview

pip-run provides on-demand temporary package installation for a single interpreter run.

It replaces this series of commands (or their Windows equivalent):

$ virtualenv --python pythonX.X --system-site-packages $temp/env
$ $temp/env/bin/pip install pkg1 pkg2 -r reqs.txt
$ $temp/env/bin/python ...
$ rm -rf $temp/env

With this single-line command:

$ pythonX.X -m pip-run pkg1 pkg2 -r reqs.txt -- ...

Features include

  • Downloads missing dependencies and makes their packages available for import.
  • Installs packages to a special staging location such that they're not installed after the process exits.
  • Relies on pip to cache downloads of such packages for reuse.
  • Leaves no trace of its invocation (except files in pip's cache).
  • Supersedes installed packages when required.
  • Relies on packages already satisfied [1].
  • Re-uses the pip toolchain for package installation.

pip-run is not intended to solve production dependency management, but it does aim to address the one-off scenarios around dependency management:

  • trials and experiments
  • build setup
  • test runners
  • just in time script running
  • interactive development
  • bug triage

pip-run is a complement to pip, virtualenv, and setuptools, intended to more readily address on-demand needs.

[1] Except when a requirements file is used.

Installation

pip-run is meant to be installed in the system site packages alongside pip, though it can also be installed in a virtualenv.

Usage

  • as script launcher
  • as runtime dependency context manager
  • as interactive interpreter in dependency context
  • as module launcher (akin to python -m)

Invoke pip-run from the command line using the console entry script (simply pip-run) or using the module executable (python -m pip-run). The latter usage is particularly convenient for testing a command across various Python versions.

Parameters following pip-run are passed directly to pip install, so pip-run numpy will install numpy (reporting any work done during the install) and pip-run -q -r requirements.txt will quietly install all the requirements listed in a file called requirements.txt. Any environment variables honored by pip are also honored.

Following the parameters to pip install, one may optionally include a -- after which any parameters will be passed to a Python interpreter in the context.

Examples

The examples folder in this project includes some examples demonstrating the power and usefulness of the project. Read the docs on those examples for instructions.

In many of these examples, the option -q is passed to pip-run to suppress the output from pip.

Module Script Runner

Perhaps the most powerful usage of pip-run is its ability to invoke executable modules and packages via runpy (aka python -m):

$ pip-run -q pycowsay -- -m pycowsay "moove over, pip-run"

  -------------------
< moove over, pip-run >
  -------------------
   \   ^__^
    \  (oo)\_______
       (__)\       )\/\
           ||----w |
           ||     ||

cowsay example animation

Interactive Interpreter

pip-run also offers a painless way to run a Python interactive interpreter in the context of certain dependencies:

$ /clean-install/python -m pip-run -q boto
>>> import boto
>>>

Command Runner

Note that everything after the -- is passed to the python invocation, so it's possible to have a one-liner that runs under a dependency context:

$ python -m pip-run -q requests -- -c "import requests; print(requests.get('https://pypi.org/project/pip-run').status_code)"
200

As long as pip-run is installed in each of the Python environments on the system, this command can be readily repeated in the other environments by specifying the relevant interpreter:

$ python2.7 -m pip-run ...

or on Windows:

$ py -2.7 -m pip-run ...

Experiments and Testing

Because pip-run provides a single-command invocation, it is great for experiments and rapid testing of various package specifications.

Consider a scenario in which one wishes to create an environment where two different versions of the same package are installed, such as to replicate a broken real-world environment. Stack two invocations of pip-run to get two different versions installed:

$ pip-run -q keyring==21.8.0 -- -m pip-run -q keyring==22.0.0 -- -c "import importlib.metadata, pprint; pprint.pprint([dist._path for dist in importlib.metadata.distributions() if dist.metadata['name'] == 'keyring'])"
[PosixPath('/var/folders/03/7l0ffypn50b83bp0bt07xcch00n8zm/T/pip-run-a3xvd267/keyring-22.0.0.dist-info'),
PosixPath('/var/folders/03/7l0ffypn50b83bp0bt07xcch00n8zm/T/pip-run-1fdjsgfs/keyring-21.8.0.dist-info')]

Script Runner

Let's say you have a script that has a one-off purpose. It's either not part of a library, where dependencies are normally declared, or it is normally executed outside the context of that library. Still, that script probably has dependencies, say on requests. Here's how you can use pip-run to declare the dependencies and launch the script in a context where those dependencies have been resolved.

First, add a __requires__ directive at the head of the script:

#!/usr/bin/env python

__requires__ = ['requests']

import requests

req = requests.get('https://pypi.org/project/pip-run')
print(req.status_code)

Then, simply invoke that script with pip-run:

$ python -m pip-run -q -- myscript.py
200

The format for requirements must follow PEP 508.

pip-run also recognizes a global __index_url__ attribute. If present, its value is passed to pip as --index-url, allowing a script to specify a custom package index:

#!/usr/bin/env python

__requires__ = ['my_private_package']
__index_url__ = 'https://my.private.index/'

import my_private_package
...

Supplying parameters to Pip

If you've been using pip-run, you may have defined some requirements in the __requires__ of a script, but now you wish to install those to a more permanent environment. pip-run provides a routine to facilitate this case:

$ python -m pip_run.read-deps script.py
my_dependency

If you're on Unix, you may pipe this result directly to pip:

$ pip install $(python -m pip_run.read-deps script.py)

And since pipenv uses the same syntax, the same technique works for pipenv:

$ pipenv install $(python -m pip_run.read-deps script.py)
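The idea behind read-deps, recovering __requires__ from a script without executing it, can be sketched with the stdlib ast module. This is an illustration only, not pip-run's actual implementation:

```python
# Illustrative sketch (not pip-run's actual implementation): parse a script's
# __requires__ declaration without executing the script, using stdlib ast.
import ast

def read_requires(source):
    """Return the list assigned to __requires__, or [] if absent."""
    for node in ast.parse(source).body:
        if isinstance(node, ast.Assign):
            for target in node.targets:
                if isinstance(target, ast.Name) and target.id == '__requires__':
                    return ast.literal_eval(node.value)
    return []

script = "__requires__ = ['requests']\nimport requests\n"
print(read_requires(script))  # → ['requests']
```

Parsing rather than importing matters because, as noted above, the script's imports can only succeed once its requirements are installed.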

How Does It Work

pip-run effectively does the following:

  • pip install -t $TMPDIR
  • PYTHONPATH=$TMPDIR python
  • cleanup

For specifics, see pip_run.run().
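The three steps above can be sketched in a few lines of Python. This is a simplification for illustration; pip-run's real logic lives in pip_run.run():

```python
# A simplified sketch of the three steps above; pip-run's actual
# implementation (pip_run.run) handles many more details.
import os
import shutil
import subprocess
import sys
import tempfile

def run_with_deps(pip_args, python_args):
    target = tempfile.mkdtemp(prefix='pip-run-')
    try:
        if pip_args:
            # 1. stage the requirements into a throwaway directory
            subprocess.check_call(
                [sys.executable, '-m', 'pip', 'install', '-q',
                 '-t', target, *pip_args])
        # 2. run Python with the staging directory on PYTHONPATH
        env = dict(os.environ)
        env['PYTHONPATH'] = os.pathsep.join(
            filter(None, [target, env.get('PYTHONPATH')]))
        return subprocess.call([sys.executable, *python_args], env=env)
    finally:
        # 3. cleanup: leave no trace of the staging directory
        shutil.rmtree(target)
```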

Limitations

  • Due to limitations with pip, pip-run cannot run with "editable" (-e) requirements.
  • pip-run uses a sitecustomize module to ensure that .pth files in the requirements are installed. As a result, any environment that has a sitecustomize module will find that module masked when running under pip-run.
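The .pth behavior the sitecustomize workaround exists for can be demonstrated with the stdlib site.addsitedir, which both appends a directory to sys.path and executes any .pth files it contains (an illustration of the mechanism, not pip-run's actual sitecustomize):

```python
# Plain PYTHONPATH entries land on sys.path, but their .pth files are
# ignored unless site machinery such as addsitedir processes the directory.
import os
import site
import sys
import tempfile

staged = tempfile.mkdtemp(prefix='pip-run-')
extra = os.path.join(staged, 'extra')
os.mkdir(extra)
with open(os.path.join(staged, 'demo.pth'), 'w') as f:
    f.write(extra + '\n')  # a .pth file naming another directory

site.addsitedir(staged)   # processes demo.pth as well
print(extra in sys.path)  # → True
```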

Comparison with pipx

The pipx project is another mature project with similar goals. Both projects expose a project and its dependencies in ephemeral environments. The main difference is pipx primarily exposes Python binaries (console scripts) from those environments whereas pip-run exposes a Python context (including runpy scripts).

Feature pip-run pipx
user-mode operation
invoke console scripts  
invoke runpy modules  
run standalone scripts  
interactive interpreter with deps  
re-use existing environment  
ephemeral environments
persistent environments  
PEP 582 support  
Specify optional dependencies  
Python 2 support  

Comparison with virtualenvwrapper mktmpenv

The virtualenvwrapper project attempts to address some of the use-cases that pip-run solves, especially with its mktmpenv command, which destroys the virtualenv after deactivation. The main difference is that pip-run is transient only for the invocation of a single command, while mktmpenv lasts for a session.

Feature pip-run mktmpenv
create temporary package environment
re-usable across python invocations  
portable  
one-line invocation  
multiple interpreters in session  
run standalone scripts  
interactive interpreter with deps
re-use existing environment  
ephemeral environments
persistent environments  

Integration

The author created this package with the intention of demonstrating the capability before integrating it directly with pip in a command such as pip run. After the change was proposed, the idea was largely rejected in pypa/pip#3971.

If you would like to see this functionality made available in pip, please upvote or comment in that ticket.

Versioning

pip-run uses semver, so you can use this library with confidence about the stability of the interface, even during periods of great flux.

Testing

Invoke tests with tox.

Issues
  • Git URLs no longer supported in __requires__?

    Somewhere between versions 2.15.1 and 3.0, rwt stopped supporting (specifically, it appears to flat-out ignore) package specifiers of the form git+ssh://[email protected]/repo.git in __requires__ lists, though it continues to support them in requirements.txt files. I see no mention of this change in CHANGES.rst; was it intentional? If not, could we have the old behavior back?

    opened by jwodder 8
  • Automatic dependency management

    Hey @jaraco. I am reading about the project and for some reason I want to believe that rwt is able to figure out and fetch dependencies automatically, but so far I haven't found proof of that - we still need to create a custom "import" section just for rwt. In this respect rwt is still manual dependency management like http://docs.groovy-lang.org/latest/html/documentation/grape.html

    enhancement 
    opened by techtonik 6
  • Change `-m pip-run` in README to `-m pip_run`

    Recent versions of Python (not sure when it started; 3.9.0? 3.9.1?) no longer support using the -m option with an invalid module name. This PR thus updates the documentation to invoke pip-run with -m in a way that is guaranteed to work with all Python versions.

    opened by jwodder 5
  • Add a console entry point

    I gave this a try, and my immediate thought was to run it as "rwt". I know the docs say to use "python -m rwt", but maybe add a console entry point as a convenience?

    opened by pfmoore 5
  • setup.py containing __requires__ fails when run without rwt

    hi @jaraco,

    Here's a setup.py which defines __requires__ for use with rwt tool. It's taken straight from your README.rst for the sake of example.

    #!/usr/bin/env python
    
    __requires__ = ['setuptools', 'setuptools_scm']
    
    from setuptools import setup
    
    setup(
        name='test_rwt',
        use_scm_version=True,
        setup_requires=__requires__,
    )
    

    Now, if I try to run this without also invoking python -m rwt -- setup.py ..., like the usual:

    $ python3 setup.py --help
    

    I get this error from pkg_resources:

    Traceback (most recent call last):
      File "setup_rwt.py", line 5, in <module>
        from setuptools import setup
      File "/usr/local/var/pyenv/versions/3.6.0/Python.framework/Versions/3.6/lib/python3.6/site-packages/setuptools/__init__.py", line 12, in <module>
        import setuptools.version
      File "/usr/local/var/pyenv/versions/3.6.0/Python.framework/Versions/3.6/lib/python3.6/site-packages/setuptools/version.py", line 1, in <module>
        import pkg_resources
      File "/usr/local/var/pyenv/versions/3.6.0/Python.framework/Versions/3.6/lib/python3.6/site-packages/pkg_resources/__init__.py", line 3018, in <module>
        @_call_aside
      File "/usr/local/var/pyenv/versions/3.6.0/Python.framework/Versions/3.6/lib/python3.6/site-packages/pkg_resources/__init__.py", line 3002, in _call_aside
        f(*args, **kwargs)
      File "/usr/local/var/pyenv/versions/3.6.0/Python.framework/Versions/3.6/lib/python3.6/site-packages/pkg_resources/__init__.py", line 3031, in _initialize_master_working_set
        working_set = WorkingSet._build_master()
      File "/usr/local/var/pyenv/versions/3.6.0/Python.framework/Versions/3.6/lib/python3.6/site-packages/pkg_resources/__init__.py", line 654, in _build_master
        ws.require(__requires__)
      File "/usr/local/var/pyenv/versions/3.6.0/Python.framework/Versions/3.6/lib/python3.6/site-packages/pkg_resources/__init__.py", line 962, in require
        needed = self.resolve(parse_requirements(requirements))
      File "/usr/local/var/pyenv/versions/3.6.0/Python.framework/Versions/3.6/lib/python3.6/site-packages/pkg_resources/__init__.py", line 848, in resolve
        raise DistributionNotFound(req, requirers)
    pkg_resources.DistributionNotFound: The 'setuptools_scm' distribution was not found and is required by the application
    

    The error happens when a requirement listed in __requires__ (like setuptools_scm here) is not already installed in the current environment.

It seems that pkg_resources (imported by setuptools) also attempts to use the __requires__ variable, in the WorkingSet._build_master method:

https://github.com/pypa/setuptools/blob/089cdeb489a0fa94d11b7307b54210ef9aa40511/pkg_resources/__init__.py#L640-L658

    This behavior is documented in the pkg_resources docs: http://setuptools.readthedocs.io/en/latest/pkg_resources.html#workingset-objects

All distributions available directly on sys.path will be activated automatically when pkg_resources is imported. This behaviour can cause version conflicts for applications which require non-default versions of those distributions. To handle this situation, pkg_resources checks for a __requires__ attribute in the main module when initializing the default working set, and uses this to ensure a suitable version of each affected distribution is activated. For example:

    __requires__ = ["CherryPy < 3"] # Must be set before pkg_resources import
    import pkg_resources
    

In your README.rst, you say that this technique:

    When invoked with rwt, the dependencies will be assured before the script is run, or if run with setuptools, the dependencies will be loaded using the older technique, so the script is backward compatible.

    However, it appears to me that one cannot have such a backward-compatible setup.py which also takes advantage of rwt via the __requires__ variable, because once one uses that, it fails as one tries to run it in the old-fashioned way.

    Am I missing something? Thank you for your help.

    In the meantime I think I'll just use python3 -m rwt -r requirements/setup.txt -- setup.py ... instead of __requires__.

    opened by anthrotype 5
  • Add PEP 518 build-requires support

    Although the PEP 518 support added by pip 10 goes a long way toward supporting the use cases of installing dependencies for the "install" and "wheel [build]" steps, it is of much less help for many of the other use-cases that setuptools setup_requires facilitates. A few prominent ones:

    • setup.py test
    • setup.py pytest (and other custom commands)
    • setup.py check_docs

    But more generally, anything that invokes setup.py for its distutils-based behaviors.

@pganssle has proposed that Setuptools should present a CLI that also supports PEP 518 for this use-case, and I'm convinced something like that is needed; try as I might, I'm finding it difficult to convert projects away from the tried-and-true paradigms to whole new ones without real issues arising.

    But it also occurs to me that there is a project that has support for installing packages on demand and which might be a suitable bridge for setup.py, and that's this project (RWT).

RWT already builds on pip and builds throwaway environments for the duration of a command (not dissimilar to what setup-requires does). And RWT was written with the aim of supplanting many setup-requires use-cases.

    What RWT doesn't have is support for PEP 518 build requirements, but since it's a thin wrapper around pip, if pip were to supply an argument to pip install, namely --build-reqs, RWT could build on that also, and it would remove the need for multiple packages to have build-requirement installer support.

    Here's how it would work. If invoked on a directory containing a pyproject.toml (or referencing such a file), pip would install the build requirements:

    (myenv) $ pip install --build-reqs .
    ... pip installs build requirements defined in pyproject.toml (permanently)
    

    But in the case where a user does not wish to install the build requirements into the environment, but does wish to invoke a setup.py command with the build requirements present, they could use rwt as the tool to do so:

    $ rwt --build-reqs . -- setup.py test
    ... rwt invokes `install --build-reqs` into a temporary directory, adds to PYTHONPATH, then invokes setup.py test.
    

    I was originally thinking that pip should expose a new command install-build-reqs, but then I realized that rwt only wraps pip install, so a parameter to pip install makes a lot more sense.

    Alternatively, rwt could implement this support more directly (without any modifications to pip) by adding a parameter that (a) bypasses default behavior of passing parameters to pip and (b) reads the pyproject.toml and passes the parsed deps to pip install instead.

    Advantages of this proposed approach:

    • separation of concerns (temporary, isolated install of build deps from command execution)
    • substantial code re-use without needing to extract a library for PEP-518
    • applies to many more use-cases outside of setuptools (works with other build tools)
    • developer does not need setuptools as a prerequisite (only pip and rwt... and only pip if rwt were to become pip run as proposed in pypa/pip#3971)
    • setuptools won't need to build yet another tool for installing transient requirements

    Disadvantages:

    • Developer would need yet another tool (rwt) to gain the functionality (barring pypa/pip#3971).
    • rwt design doesn't keep the build environment around the way setup-requires does, so the startup cost is paid at each invocation (a cost which is substantially mitigated in some cases by pip caches).

@pganssle - if you haven't already, would you give rwt a trial run (I suggest having it installed in your global environment; it's lightweight and often immensely useful) and afterward let me know what you think about the proposal.

    @benoit-pierre I'd be interested in your opinion also.

    opened by jaraco 4
  • Give pip_run.read-deps an option for newline-terminated output

    Feature request: Give python -m pip_run.read-deps a --newline (or whatever you want to call it) option that would cause the individual requirements to be output on separate lines. This would make it easy to create requirements.txt files for __requires__-using scripts, especially when dealing with requirements with markers like importlib-metadata>=3.6; python_version < "3.10", which are currently output with spaces embedded, making it difficult to separate them from adjacent requirements.

    opened by jwodder 4
  • Namespace packages fail when namespace is installed

    Distinct from #1, where the namespace package isn't recognized at all, there's another issue that affects all versions of Python.

    $ python -m rwt backports.functools_lru_cache -- -c "import backports.functools_lru_cache"
    Loading requirements using backports.functools_lru_cache
    $ python -c "import backports"                                                          
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
    ImportError: No module named 'backports'
    $ python -m pip install backports.unittest_mock
    Collecting backports.unittest_mock
      Downloading backports.unittest_mock-1.1.1-py2.py3-none-any.whl
    Installing collected packages: backports.unittest-mock
    Successfully installed backports.unittest-mock-1.1.1
    $ python -c "import backports"
    $ python -m rwt backports.functools_lru_cache -- -c "import backports.functools_lru_cache"
    Loading requirements using backports.functools_lru_cache
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
    ImportError: No module named 'backports.functools_lru_cache'
    

    The problem is that the -nspkg.pth handler is interfering with the PEP-420 loader. Remove it and both packages import nicely.

    $ rm /python/lib/python3.5/site-packages/backports.unittest_mock-1.1.1-py3.5-nspkg.pth
    $ python -m rwt backports.functools_lru_cache -- -c "import backports.functools_lru_cache; import backports.unittest_mock"
    Loading requirements using backports.functools_lru_cache
    
    opened by jaraco 4
  • check for existing packages

    opened by hagarwa3 3
  • Install fails on Xenial

    On Ubuntu Xenial with sudo aptitude install -y python3-pip, and python3 -m pip install --user rwt:

    [email protected]:~$ python3 -m rwt pymongo
    Loading requirements using pymongo
    Exception:
    Traceback (most recent call last):
      File "/usr/lib/python3/dist-packages/pip/basecommand.py", line 209, in main
        status = self.run(options, args)
      File "/usr/lib/python3/dist-packages/pip/commands/install.py", line 335, in run
        prefix=options.prefix_path,
      File "/usr/lib/python3/dist-packages/pip/req/req_set.py", line 732, in install
        **kwargs
      File "/usr/lib/python3/dist-packages/pip/req/req_install.py", line 837, in install
        self.move_wheel_files(self.source_dir, root=root, prefix=prefix)
      File "/usr/lib/python3/dist-packages/pip/req/req_install.py", line 1039, in move_wheel_files
        isolated=self.isolated,
      File "/usr/lib/python3/dist-packages/pip/wheel.py", line 247, in move_wheel_files
        prefix=prefix,
      File "/usr/lib/python3/dist-packages/pip/locations.py", line 153, in distutils_scheme
        i.finalize_options()
      File "/usr/lib/python3.5/distutils/command/install.py", line 273, in finalize_options
        raise DistutilsOptionError("can't combine user with prefix, "
    distutils.errors.DistutilsOptionError: can't combine user with prefix, exec_prefix/home, or install_(plat)base
    Traceback (most recent call last):
      File "/usr/lib/python3.5/runpy.py", line 184, in _run_module_as_main
        "__main__", mod_spec)
      File "/usr/lib/python3.5/runpy.py", line 85, in _run_code
        exec(code, run_globals)
      File "/home/jaraco/.local/lib/python3.5/site-packages/rwt/__main__.py", line 12, in <module>
        with deps.load(*pip_args) as home:
      File "/usr/lib/python3.5/contextlib.py", line 59, in __enter__
        return next(self.gen)
      File "/home/jaraco/.local/lib/python3.5/site-packages/rwt/deps.py", line 42, in load
        subprocess.check_call(cmd)
      File "/usr/lib/python3.5/subprocess.py", line 581, in check_call
        raise CalledProcessError(retcode, cmd)
    subprocess.CalledProcessError: Command '('/usr/bin/python3', '-m', 'pip', 'install', '-q', '-t', '/tmp/rwt-sp95m3xg', 'pymongo')' returned non-zero exit status 2
    
    wontfix 
    opened by jaraco 2
  • Environment caching

I have a number of scripts for my personal use, and I've found pip-run very useful for managing their dependencies. However, it still takes several seconds for pip to install the packages, even though it already uses pip's download cache.

    I propose adding an option (e.g. --cache) where the directory containing the packages is not deleted after the program exits, and it can be reused as long as the dependencies list does not change. Similar tools (e.g. kotlin-main-kts) also have caching that makes rerunning a script practically free.

    There are several things that need to be considered

    • The cached environments need to be managed, since they take up disk space (e.g. numpy takes up around 70MiB). A simple solution could be to keep using /tmp and let the system manage it.
    • Identification of a cached environment. I think using the hash of the dependency list should work.

    I think it is best to keep the functionality simple, and in the worst case just destroy and recreate the environment. But I do expect it to be able to avoid recreation of the environment if I run the same script twice.
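The identification idea above can be sketched with a stdlib hash (illustrative only; this is a proposal, not an implemented pip-run feature):

```python
# Illustrative sketch of the proposed cache key: hash the normalized,
# sorted requirement list so identical dependency sets reuse one directory.
import hashlib

def cache_key(requirements):
    normalized = '\n'.join(sorted(r.strip().lower() for r in requirements))
    return hashlib.sha256(normalized.encode()).hexdigest()[:16]

# Order and surrounding whitespace don't change the key:
print(cache_key(['numpy', ' requests>=2']) ==
      cache_key(['requests>=2', 'numpy']))  # → True
```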

    opened by knthmn 3
  • Use --requirements behavior when not using requirements file

    I have a scenario where if I make the following requirements.txt file:

    cryptography<=3.2.1,>2.9.2
    azure-storage-blob==12.8.1
    

    and pass it with -r, everything is fine.

If I add this to the script I'm running, and remove the -r flag:

    __requires__ = [
        'cryptography<=3.2.1,>2.9.2',
        'azure-storage-blob>12',
    ]
    

    I get an error because my site-packages has jwt 1.1.0 and cryptography 3.2.1 installed:

    ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
    jwt 1.1.0 requires cryptography<=3.2.1,>2.9.2, but you have cryptography 3.4.7 which is incompatible.
    

    What i'd like:

    • a flag that will give me the -r behavior without having to make another separate file

    What I'd love:

    • a flag that will make it not matter what I have installed in site-packages at all. I just want a one line command that will run my script regardless of what might have been installed with pip on the machine in the past.
    opened by fcrick 4
  • Shared redesign of embedded requirements feature

    Background

    In pypa/pip#3971, key members of the PyPA have leveled critiques of the embedded requirements feature, where a user can supply requirements and other common installer directives to signal to the tool how (best) to run the script. This feature emerged from the use-case where a script author would like to distribute a single file and have that file be runnable by any number of end users with minimal guidance, allowing the file to be published to a gist or alongside other scripts in a directory without needing additional context for executability.

    Goals

    • the tool should be able to infer install requirements and index URL.
    • the tool must be able to parse these directives without executing the script (as executing the script should be able to rely on the presence of those items).
    • the syntax should be as intuitive as possible. As a corollary, the syntax should aim to re-use syntax familiar to the user (primarily the author, but also the end user).
    • (optional) the tool should be able to infer additional installer directives such as --extra-index-url or --quiet.
    opened by jaraco 4
Releases: v8.6.0
Owner: Jason R. Coombs