A set of tools to keep your pinned Python dependencies fresh.

Overview


pip-tools = pip-compile + pip-sync

A set of command line tools to help you keep your pip-based packages fresh, even when you've pinned them. You do pin them, right? (In building your Python application and its dependencies for production, you want to make sure that your builds are predictable and deterministic.)

(Diagram: pip-tools overview for phase II.)

Installation

Similar to pip, pip-tools must be installed in each of your project's virtual environments:

$ source /path/to/venv/bin/activate
(venv)$ python -m pip install pip-tools

Note: all of the remaining example commands assume you've activated your project's virtual environment.

Example usage for pip-compile

The pip-compile command lets you compile a requirements.txt file from your dependencies, specified in either setup.py or requirements.in.

Run it with pip-compile or python -m piptools compile. If you use multiple Python versions, you can run pip-compile as py -X.Y -m piptools compile on Windows and pythonX.Y -m piptools compile on other systems.

pip-compile should be run from the same virtual environment as your project so conditional dependencies that require a specific Python version, or other environment markers, resolve relative to your project's environment.

Note: if you compile setup.py or requirements.in from scratch, make sure you don't already have a requirements.txt, as it can interfere with the result.

Requirements from setup.py

Suppose you have a Django project, and want to pin it for production. If you have a setup.py with install_requires=['django'], then run pip-compile without any arguments:

$ pip-compile
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile
#
asgiref==3.2.3
    # via django
django==3.0.3
    # via my_django_project (setup.py)
pytz==2019.3
    # via django
sqlparse==0.3.0
    # via django

pip-compile will produce your requirements.txt, with all the Django dependencies (and all underlying dependencies) pinned.

Without setup.py

If you don't use setup.py (it's easy to write one), you can create a requirements.in file to declare the Django dependency:

# requirements.in
django

Now, run pip-compile requirements.in:

$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile requirements.in
#
asgiref==3.2.3
    # via django
django==3.0.3
    # via -r requirements.in
pytz==2019.3
    # via django
sqlparse==0.3.0
    # via django

And it will produce your requirements.txt, with all the Django dependencies (and all underlying dependencies) pinned.

Using hashes

If you would like to use the hash-checking mode available in pip since version 8.0, pip-compile offers the --generate-hashes flag:

$ pip-compile --generate-hashes requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile --generate-hashes requirements.in
#
asgiref==3.2.3 \
    --hash=sha256:7e06d934a7718bf3975acbf87780ba678957b87c7adc056f13b6215d610695a0 \
    --hash=sha256:ea448f92fc35a0ef4b1508f53a04c4670255a3f33d22a81c8fc9c872036adbe5 \
    # via django
django==3.0.3 \
    --hash=sha256:2f1ba1db8648484dd5c238fb62504777b7ad090c81c5f1fd8d5eb5ec21b5f283 \
    --hash=sha256:c91c91a7ad6ef67a874a4f76f58ba534f9208412692a840e1d125eb5c279cb0a \
    # via -r requirements.in
pytz==2019.3 \
    --hash=sha256:1c557d7d0e871de1f5ccd5833f60fb2550652da6be2693c1e02300743d21500d \
    --hash=sha256:b02c06db6cf09c12dd25137e563b31700d3b80fcc4ad23abb7a315f2789819be \
    # via django
sqlparse==0.3.0 \
    --hash=sha256:40afe6b8d4b1117e7dff5504d7a8ce07d9a1b15aeeade8a2d10f130a834f8177 \
    --hash=sha256:7c3dca29c022744e95b547e867cee89f4fce4373f3549ccd8797d8eb52cdb873 \
    # via django
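With hashes pinned, pip refuses to install any artifact whose digest doesn't match one of the pinned values. The check itself is just a SHA-256 comparison; here is a minimal sketch of the idea (the helper name and commented file name are made up for illustration, this is not pip's code):

```python
import hashlib

def matches_pinned_hashes(data: bytes, pinned: list) -> bool:
    """Return True if the artifact's sha256 digest appears among the
    pinned values (written here in 'sha256:<hex>' form)."""
    digest = "sha256:" + hashlib.sha256(data).hexdigest()
    return digest in pinned

# Example: verify a downloaded wheel's bytes against its pins.
# wheel_bytes = open("django-3.0.3-py3-none-any.whl", "rb").read()
# matches_pinned_hashes(wheel_bytes, pins_from_requirements_txt)
```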

Updating requirements

To update all packages, periodically re-run pip-compile --upgrade.

To update a specific package to the latest version, or to a particular version, use the --upgrade-package or -P flag:

# only update the django package
$ pip-compile --upgrade-package django

# update both the django and requests packages
$ pip-compile --upgrade-package django --upgrade-package requests

# update the django package to the latest, and requests to v2.0.0
$ pip-compile --upgrade-package django --upgrade-package requests==2.0.0

You can combine --upgrade and --upgrade-package in one command to constrain the allowed upgrades. For example, to upgrade all packages while keeping requests below 3.0:

$ pip-compile --upgrade --upgrade-package 'requests<3.0'

Output File

To output the pinned requirements to a file other than requirements.txt, use --output-file. This can be useful for compiling multiple files, for example with different constraints on django, to test a library against both versions using tox:

$ pip-compile --upgrade-package 'django<1.0' --output-file requirements-django0x.txt
$ pip-compile --upgrade-package 'django<2.0' --output-file requirements-django1x.txt

Or to output to standard output, use --output-file=-:

$ pip-compile --output-file=- > requirements.txt
$ pip-compile - --output-file=- < requirements.in > requirements.txt

Forwarding options to pip

Any valid pip flags or arguments may be passed on with pip-compile's --pip-args option, e.g.

$ pip-compile requirements.in --pip-args '--retries 10 --timeout 30'

Configuration

You might be wrapping the pip-compile command in another script. To avoid confusing consumers of your custom script, you can override the update command generated at the top of requirements files by setting the CUSTOM_COMPILE_COMMAND environment variable.

$ CUSTOM_COMPILE_COMMAND="./pipcompilewrapper" pip-compile requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    ./pipcompilewrapper
#
asgiref==3.2.3
    # via django
django==3.0.3
    # via -r requirements.in
pytz==2019.3
    # via django
sqlparse==0.3.0
    # via django

Workflow for layered requirements

If you have different environments that you need to install different but compatible packages for, then you can create layered requirements files and use one layer to constrain the other.

For example, if you have a Django project where you want the newest 2.1 release in production and when developing you want to use the Django debug toolbar, then you can create two *.in files, one for each layer:

# requirements.in
django<2.2

At the top of the development requirements dev-requirements.in you use -c requirements.txt to constrain the dev requirements to packages already selected for production in requirements.txt.

# dev-requirements.in
-c requirements.txt
django-debug-toolbar

First, compile requirements.txt as usual:

$ pip-compile
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile
#
django==2.1.15
    # via -r requirements.in
pytz==2019.3
    # via django

Now compile the dev requirements; the requirements.txt file is used as a constraint:

$ pip-compile dev-requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile dev-requirements.in
#
django-debug-toolbar==2.2
    # via -r dev-requirements.in
django==2.1.15
    # via
    #   -c requirements.txt
    #   django-debug-toolbar
pytz==2019.3
    # via
    #   -c requirements.txt
    #   django
sqlparse==0.3.0
    # via django-debug-toolbar

As you can see above, even though a 2.2 release of Django is available, the dev requirements only include a 2.1 version of Django because they were constrained. Now both compiled requirements files can be installed safely in the dev environment.
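The -c mechanism can be thought of as a compatibility rule: any package pinned in both layers must carry the same version. A simplified sketch of that rule (not pip's actual resolver logic; the function name is made up):

```python
def respects_constraints(pins, constraints):
    """pins and constraints both map package name -> pinned version.
    A compiled layer is compatible only if every package it pins that
    also appears in the constraint file has the same version."""
    return all(constraints.get(name, version) == version
               for name, version in pins.items())

# The dev layer above pins django==2.1.15, matching production:
respects_constraints(
    {"django": "2.1.15", "django-debug-toolbar": "2.2"},
    {"django": "2.1.15", "pytz": "2019.3"},
)
```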

To install requirements in a production environment, use:

$ pip-sync

To install requirements in a development environment, use:

$ pip-sync requirements.txt dev-requirements.txt

Version control integration

You might use pip-compile as a pre-commit hook. See the pre-commit docs for instructions. Sample .pre-commit-config.yaml:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 5.0.0
    hooks:
      - id: pip-compile

You might want to customize pip-compile args by configuring args and/or files, for example:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 5.0.0
    hooks:
      - id: pip-compile
        files: ^requirements/production\.(in|txt)$
        args: [--index-url=https://example.com, requirements/production.in]

Example usage for pip-sync

Now that you have a requirements.txt, you can use pip-sync to update your virtual environment to reflect exactly what's in there. This will install/upgrade/uninstall everything necessary to match the requirements.txt contents.
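Conceptually, syncing is a set difference between what is installed and what requirements.txt pins. A simplified sketch of the idea (not pip-sync's actual implementation):

```python
def sync_plan(installed, required):
    """installed and required both map package name -> version.
    Returns (to_install, to_uninstall) needed to make the environment
    match the requirements exactly."""
    to_uninstall = sorted(name for name in installed if name not in required)
    to_install = {name: version for name, version in required.items()
                  if installed.get(name) != version}
    return to_install, to_uninstall

# Mirrors the sample session below: flake8 is removed, click upgraded.
sync_plan({"click": "4.0", "flake8": "2.4.1"}, {"click": "4.1"})
```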

Run it with pip-sync or python -m piptools sync. If you use multiple Python versions, you can also run py -X.Y -m piptools sync on Windows and pythonX.Y -m piptools sync on other systems.

pip-sync must be installed into and run from the same virtual environment as your project to identify which packages to install or upgrade.

Be careful: pip-sync is meant to be used only with a requirements.txt generated by pip-compile.

$ pip-sync
Uninstalling flake8-2.4.1:
  Successfully uninstalled flake8-2.4.1
Collecting click==4.1
  Downloading click-4.1-py2.py3-none-any.whl (62kB)
    100% |................................| 65kB 1.8MB/s
  Found existing installation: click 4.0
    Uninstalling click-4.0:
      Successfully uninstalled click-4.0
Successfully installed click-4.1

To sync multiple *.txt dependency lists, just pass them in via command line arguments, e.g.

$ pip-sync dev-requirements.txt requirements.txt

If no files are passed, pip-sync defaults to requirements.txt.

Any valid pip install flags or arguments may be passed with pip-sync's --pip-args option, e.g.

$ pip-sync requirements.txt --pip-args '--no-cache-dir --no-deps'


Note: pip-sync will not upgrade or uninstall packaging tools like setuptools, pip, or pip-tools itself. Use python -m pip install --upgrade to upgrade those packages.

Should I commit requirements.in and requirements.txt to source control?

Generally, yes. If you want a reproducible environment installation available from your source control, commit both requirements.in and requirements.txt.

Note that if you are deploying on multiple Python environments (see the section below), you must commit a separate output file for each Python environment. We suggest using the {env}-requirements.txt naming scheme (e.g. win32-py3.7-requirements.txt, macos-py3.6-requirements.txt, etc.).

Cross-environment usage of requirements.in/requirements.txt and pip-compile

The dependencies of a package can change depending on the Python environment in which it is installed. Here, we define a Python environment as the combination of Operating System, Python version (3.6, 3.7, etc.), and Python implementation (CPython, PyPy, etc.). For an exact definition, refer to the possible combinations of PEP 508 environment markers.

As the resulting requirements.txt can differ for each environment, users must execute pip-compile on each Python environment separately to generate a requirements.txt valid for each said environment. The same requirements.in can be used as the source file for all environments, using PEP 508 environment markers as needed, the same way it would be done for regular pip cross-environment usage.

If the generated requirements.txt is identical for all Python environments, it can be used across them safely. But be careful: any package update can introduce environment-dependent dependencies, making any newly generated requirements.txt environment-dependent too. As a general rule, users should still run pip-compile on each targeted Python environment to avoid issues.
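The marker values that drive these per-environment differences come from the interpreter itself. A stdlib sketch of a few of the PEP 508 marker values (names follow the PEP; this is an illustration, not how pip-compile reads them):

```python
import os
import platform
import sys

# A handful of PEP 508 environment marker values, as seen from the
# current interpreter.
environment = {
    "os_name": os.name,
    "sys_platform": sys.platform,
    "python_version": ".".join(platform.python_version_tuple()[:2]),
    "platform_python_implementation": platform.python_implementation(),
}

# A requirement like `pywin32; sys_platform == "win32"` would only be
# included when this evaluates True:
needs_pywin32 = environment["sys_platform"] == "win32"
```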

Other useful tools

Deprecations

This section lists pip-tools features that are currently deprecated.

  • The --index/--no-index command-line options are deprecated; use --emit-index-url/--no-emit-index-url instead (since 5.2.0).
  • In future versions, the --allow-unsafe behavior will be enabled by default. Use --no-allow-unsafe to keep the old behavior. It is recommended to pass --allow-unsafe now to adapt to the upcoming change.

Versions and compatibility

The table below summarizes the latest pip-tools versions with the required pip and Python versions. Generally, pip-tools supports the same Python versions as the required pip versions.

pip-tools     | pip            | Python
--------------|----------------|---------------
4.5.*         | 8.1.3 - 20.0.2 | 2.7, 3.5 - 3.8
5.0.0 - 5.3.0 | 20.0 - 20.1.1  | 2.7, 3.5 - 3.8
5.4.0         | 20.1 - 20.3.*  | 2.7, 3.5 - 3.8
5.5.0         | 20.1 - 20.3.*  | 2.7, 3.5 - 3.9
6.0.0         | 20.1 - 20.3.*  | 3.6 - 3.9
Comments
  • Fixup relative and absolute path handling

    Fixup relative and absolute path handling

    Initial Summary (Outdated)
    
    Rewrite input file paths as relative to the output file, or as absolutes if using stdout:
    
    - Change click arg output_file from click.File to click.Path,
      so we can get its absolute path
    - Change to output file's parent folder before opening it,
      helping upstream pip code to use correct relative paths
    - Return to the starting dir after compilation,
      for test or other calls from code which don't expect dir changes
    - Rewrite src_files as absolute paths as soon as we have them,
      to resolve relative paths properly (CLI args are relative to CWD)
    - When deriving the output path from a single input path, stay safer and more predictable,
      particularly if the basename has no dot, and the path does (see example in #1067)
    - Rewrite src_files as relative to the output path once that's known,
      unless output is stdout
    - Don't overwrite an input file ending in '.txt' when deriving the output file path
    - Add tests:
        - test_annotation_relative_paths
        - test_input_file_with_txt_extension
        - test_input_file_without_extension_and_dotted_path
    
    Fixes #1107
    
    Contributes to #1067
    
    Minor tag-alongs:
      - fix some comment typos
      - git-ignore sublime text project and workspace files
    
    QUESTIONS:
    1. If output is stdout, should paths (in annotations) be absolute,
      or relative to current working folder? 
      With this PR, they're absolute, and I think that is appropriate.
    2. With this PR, `output_file` changes its type, early on. 
      Does that bother anyone? It starts as a `click.Path`, then 
      after paths are resolved by our logic, it's replaced by a `click.File`. 
      It's a small window of `click.Path`-ness. An example consequence
      is seen in this PR's change to `test_writer.py`.
    3. What is the best result of using `name.txt` as an input file, without specifying the output file?
      With this PR, it outputs to `name.txt.txt`, which is the best I can think of.
    4. What additional tests would be good to have, if any? 
      More annotation variants, like with `-c`? More complicated relative paths?
    
    **Changelog-friendly one-liners**: 
    - Rewrite input file paths as relative to the output file, or as absolutes if using stdout
    - Don't overwrite an input file ending in '.txt' when deriving the output file path
    - Don't confuse dots in folder names with file extensions when deriving the output file path
    
    
    Summary, Take 2 (also outdated)

    Fixup relative and absolute path handling:

    These changes have been made with the general guideline of storing paths as absolute as soon as we can, and rendering them as relative or absolute as needed.

    | Path | Initial Interpretation | Output Format (file) | Output Format (stdout) |
    | --- | --- | --- | --- |
    | source file | relative to invocation dir | (annotation) relative to output file | absolute |
    | ireq from source file | relative to its source file | relative to output file, unless initially absolute | absolute |
    | ireq from --upgrade-package | relative to invocation dir | ~relative to output file~ I think: relative to output file if passed as relative, absolute if passed as absolute, pathless if passed as pathless | absolute |
    | git+file: ireq from source file | relative to its source file | absolute (pip doesn't support relative paths in that form) | absolute |
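    The rendering rule described here, "store absolute, render relative or absolute as needed", can be sketched with os.path (a simplified illustration under that guideline, not the PR's actual code; render_path is a made-up name):

```python
import os
from typing import Optional

def render_path(path: str, output_file: Optional[str]) -> str:
    """Store paths absolute; render them relative to the output file's
    folder when writing to a file, and absolute when writing to stdout."""
    abs_path = os.path.abspath(path)
    if output_file is None:  # stdout: always absolute
        return abs_path
    out_dir = os.path.dirname(os.path.abspath(output_file))
    # Forward slashes even on Windows, as the one-liners below describe.
    return os.path.relpath(abs_path, out_dir).replace(os.sep, "/")

# An input next to the output file renders as a bare relative name:
render_path("reqs/requirements.in", "reqs/requirements.txt")
```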

    Itemized Changes by File

    utils.py

    • Changed:
      • format_requirement:
        • Add optional str kwarg from_dir:
          • If used, it'll rewrite local paths as relative (to from_dir).
        • Replace alternative path separators in relpaths with forward slashes.
        • Use pip's path_to_url for abs paths.
        • Ensure fragment is attached if present originally.
    • Added:
      • Function abs_ireq:

        def abs_ireq(ireq: InstallRequirement, from_dir: str) -> InstallRequirement:
            """
            Return the given InstallRequirement if its source isn't a relative path;
            Otherwise, return a new one with the relative path rewritten as absolute.
        
            In this case, an extra attribute is added: _was_relative,
            which is always True when present at all.
            """
        
      • Context manager working_dir:

        @contextmanager
        def working_dir(folder: Optional[str]) -> Iterator[None]:
            """Change the current directory within the context, then change it back."""
        
      • Function fragment_string:

        def fragment_string(ireq: InstallRequirement) -> str:
            """
            Return a string like "#egg=pkgname&subdirectory=folder", or "".
            """
        

    pip_compat.py

    • Changed:
      • parse_requirements:
        • Add optional str kwarg from_dir to parse_requirements. If left to its default, None, the parent of the source file is used. Either way it's passed to abs_ireq, so any yielded local ireqs have absolute .links, and some have ._was_relative.
        • Ensure pip's install_req_from_parsed_requirement is called from a sensible folder, to better resolve relative paths; and try to detect if each ireq was initially relative, to "manually" mark the resulting (absolute) ireq with _was_relative.

    compile.py

    • Change Click argument type for the output file from File to Path. When Click's File object is initialized with the absolute path, that full path is preserved as the .name attribute. So we now instantiate the output File ourselves after resolving its absolute path.
    • Resolve src_files to their absolute paths.
    • When deriving the output path from a single input path, ensure it's properly adjacent to the input, and stay safer and more predictable when the basename has no dot and the path does, or the input file ends in .txt (see #1067, #1107, and tests below).
    • Use abs_ireq when collecting upgrade_install_reqs (--upgrade-package), passing the invocation dir as from_dir.
    • No support for relative paths is introduced for setup.py install_requires, given the discussion @ https://discuss.python.org/t/what-is-the-correct-interpretation-of-path-based-pep-508-uri-reference/2815/18
    • Ensure a suitable from_dir is passed to parse_requirements when parsing from setup.py or stdin, which really parses a temp file. This means setup.py's parent folder, or the invocation dir if the source is stdin.

    writer.py

    • Added:
      • comes_from_line_project_re pattern for parsing and rewriting comes_from strs that point to setup.pys and pyproject.tomls.
    • Changed:
      • strip_comes_from_line_re:
        • Extend/replace the pattern as comes_from_line_re, with named groups for opts (-r/-c), path, and line_num.
      • _comes_from_as_string:
        • Add optional str kwarg from_dir. If the ireq.comes_from is already a str and from_dir is passed, in addition to stripping the line number as before, rewrite the path as relative.
        • Add handling for comes_from_line_project_re matches.
      • _format_requirement:
        • If the ireq has ._was_relative and the output is a file, pass the output file's parent as from_dir to format_requirement, ensuring the written path for the ireq is relative in that case.
        • Pass the parent of the output file, if any, as from_dir to _comes_from_as_string.

    test_cli_compile.py

    • Added:
      • test_relative_local_package
        • Relative paths are properly resolved between input, output, and local packages.
        • Input file paths/URIs can be relative, as long as they start with file: or ..
      • test_input_file_with_txt_extension
        • Compile an input file ending in .txt to a separate output file (*.txt.txt), without overwriting the input file.
      • test_input_file_without_extension_and_dotted_path
        • Compile a file without an extension, in a subdir with a dot, into an input-adjacent file with .txt as the extension.
      • test_annotation_relative_paths
        • Annotations referencing reqs.in files use paths relative to the reqs.txt.
      • test_local_vcs_package
        • git+file urls are rewritten to use absolute paths, and otherwise remain intact.
    • Changed:
      • test_duplicate_reqs_combined
        • Use pip's path_to_url to detect the normalized package path in URL form.

    test_writer.py

    • Changed:
      • writer fixture:
        • Open an output file object to pass to OutputWriter, rather than passing the click ctx entry (now just a Path).
      • test_write_header:
        • Access the user-supplied output file path via writer.click_ctx.params["output_file"] (now just a Path), rather than checking that for a .name.
      • test_iter_lines__hash_missing:
        • Use regex match to recognize Windows drive names.
      • test_iter_lines__no_warn_if_only_unhashable_packages:
        • Use regex match to recognize Windows drive names.
    • Added:
      • test_format_requirement_annotation_source_ireqs

    test_utils.py

    • Changed:
      • test_format_requirement_editable_local_path
        • Use regex match to recognize Windows drive names.
    • Added:
      • test_working_dir
      • test_local_abs_ireq_preserves_source_ireqs

    Changelog-friendly one-liners:

    • Support relative paths in input files, as long as they lead with file:, <vcs>+file:, or ..
    • If a local requirement path is relative in the input, interpret it as relative to that input file, and write it as relative to the output file, if any. Otherwise, write the absolute path.
    • Rewrite input file paths (in annotations) as relative to the output file, or as absolute if using stdout.
    • Don't overwrite an input file ending in '.txt' when deriving the output file path.
    • Don't confuse dots in folder names with file extensions when deriving the output file path.
    • Write requirement paths using forward slashes rather than backslashes, on Windows.

    Changelog-friendly one-liners:

    • Support relative paths in input files, as long as they lead with file:, <vcs>+file:, or ..
    • Relative paths in input files become relative paths in output files.
    • pip-compile will interpret relative paths in an input file as relative to that input file, rather than the current folder, if --read-relative-to-input is passed.
    • pip-compile will reconstruct relative req paths as relative to the output file, rather than the current folder, if --write-relative-to-output is passed.
    • pip-sync will interpret relative paths in an input file as relative to that input file, rather than the current folder, if --read-relative-to-input is passed.
    • Annotation paths are now relative to the output file.
    • Don't overwrite an input file ending in '.txt' when deriving the output file path.
    • Don't confuse dots in folder names with file extensions when deriving the output file path.
    • Include extras more reliably in output lines, like pkg[extra1,extra2].

    • Fixes #1107
    • Fixes #204
    • Fixes #1165
    • Related #1067
    • Related #453 (Not addressed in this PR)
    • Related #673
    • Related #702
    Contributor checklist
    • [x] Provided the tests for the changes.
    • [ ] Gave a clear one-line description in the PR (that the maintainers can add to CHANGELOG.md on release).
    • [ ] Assign the PR to an existing or new milestone for the target version (following Semantic Versioning).
    enhancement 
    opened by AndydeCleyre 103
  • Add support for pip's 2020 dependency resolver

    Add support for pip's 2020 dependency resolver

    What's new?

    Added new option --resolver [backtracking|legacy] to pip-compile (default is legacy).

    How to use?

    To enable 2020 dependency resolver run pip-compile --resolver=backtracking.

    Backtracking resolver example

    $ echo "oslo.utils==1.4.0" | pip-compile - --resolver=backtracking --allow-unsafe --annotation-style=line -qo-
    #
    # This file is autogenerated by pip-compile with python 3.8
    # To update, run:
    #
    #    pip-compile --allow-unsafe --annotation-style=line --output-file=- --resolver=backtracking -
    #
    babel==2.10.3             # via oslo-i18n, oslo-utils
    iso8601==1.0.2            # via oslo-utils
    netaddr==0.8.0            # via oslo-utils
    netifaces==0.11.0         # via oslo-utils
    oslo-i18n==2.1.0          # via oslo-utils
    oslo-utils==1.4.0         # via -r -
    pbr==0.11.1               # via oslo-i18n, oslo-utils
    pytz==2022.1              # via babel
    six==1.16.0               # via oslo-i18n, oslo-utils
    
    # The following packages are considered to be unsafe in a requirements file:
    pip==22.1.2               # via pbr
    

    Legacy resolver example

    $ echo "oslo.utils==1.4.0" | pip-compile - --resolver=legacy --allow-unsafe -qo-
    Could not find a version that matches pbr!=0.7,!=2.1.0,<1.0,>=0.6,>=2.0.0 (from oslo.utils==1.4.0->-r -)
    Tried: 0.5.2.5.g5b3e942, 0.5.0, 0.5.1, 0.5.2, 0.5.4, 0.5.5, 0.5.6, 0.5.7, 0.5.8, 0.5.10, 0.5.11, 0.5.12, 0.5.13, 0.5.14, 0.5.15, 0.5.16, 0.5.17, 0.5.18, 0.5.19, 0.5.20, 0.5.21, 0.5.22, 0.5.23, 0.6, 0.7.0, 0.8.0, 0.8.1, 0.8.2, 0.9.0, 0.9.0, 0.10.0, 0.10.0, 0.10.1, 0.10.1, 0.10.2, 0.10.2, 0.10.3, 0.10.3, 0.10.4, 0.10.4, 0.10.5, 0.10.5, 0.10.6, 0.10.6, 0.10.7, 0.10.7, 0.10.8, 0.10.8, 0.11.0, 0.11.0, 0.11.1, 0.11.1, 1.0.0, 1.0.0, 1.0.1, 1.0.1, 1.1.0, 1.1.0, 1.1.1, 1.1.1, 1.2.0, 1.2.0, 1.3.0, 1.3.0, 1.4.0, 1.4.0, 1.5.0, 1.5.0, 1.6.0, 1.6.0, 1.7.0, 1.7.0, 1.8.0, 1.8.0, 1.8.1, 1.8.1, 1.9.0, 1.9.0, 1.9.1, 1.9.1, 1.10.0, 1.10.0, 2.0.0, 2.0.0, 2.1.0, 2.1.0, 3.0.0, 3.0.0, 3.0.1, 3.0.1, 3.1.0, 3.1.0, 3.1.1, 3.1.1, 4.0.0, 4.0.0, 4.0.1, 4.0.1, 4.0.2, 4.0.2, 4.0.3, 4.0.3, 4.0.4, 4.0.4, 4.1.0, 4.1.0, 4.1.1, 4.1.1, 4.2.0, 4.2.0, 4.3.0, 4.3.0, 5.0.0, 5.0.0, 5.1.0, 5.1.0, 5.1.1, 5.1.1, 5.1.2, 5.1.2, 5.1.3, 5.1.3, 5.2.0, 5.2.0, 5.2.1, 5.2.1, 5.3.0, 5.3.0, 5.3.1, 5.3.1, 5.4.0, 5.4.0, 5.4.1, 5.4.1, 5.4.2, 5.4.2, 5.4.3, 5.4.3, 5.4.4, 5.4.4, 5.4.5, 5.4.5, 5.5.0, 5.5.0, 5.5.1, 5.5.1, 5.6.0, 5.6.0, 5.7.0, 5.7.0, 5.8.0, 5.8.0
    There are incompatible versions in the resolved dependencies:
      pbr!=0.7,<1.0,>=0.6 (from oslo.utils==1.4.0->-r -)
      pbr!=2.1.0,>=2.0.0 (from oslo.i18n==5.1.0->oslo.utils==1.4.0->-r -)
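    The failure above is exactly the case backtracking handles: the legacy resolver commits to the newest oslo.i18n, whose pbr requirement conflicts with oslo.utils' own, while the backtracking resolver walks back to older candidates until a consistent set is found. A toy sketch of the two strategies (versions as plain numbers and constraints folded into one predicate; nothing here is pip's real code):

```python
from itertools import product

def legacy_pick(candidates, ok):
    """Legacy-style resolution: take the newest version of every
    package independently; give up if that combination conflicts."""
    choice = {name: max(versions) for name, versions in candidates.items()}
    return choice if ok(choice) else None

def backtracking_pick(candidates, ok):
    """Backtracking-style resolution: try combinations newest-first,
    falling back to older candidates until one satisfies everything."""
    names = list(candidates)
    for combo in product(*(sorted(candidates[n], reverse=True) for n in names)):
        choice = dict(zip(names, combo))
        if ok(choice):
            return choice
    return None

# Toy model of the pbr conflict: oslo.utils needs pbr<1.0, and only an
# old oslo.i18n tolerates such an old pbr.
candidates = {"pbr": {0.11, 5.8}, "oslo-i18n": {2.1, 5.1}}
conflict_free = lambda c: c["pbr"] < 1.0 and c["oslo-i18n"] < 3
```

Run on this model, legacy_pick fails outright while backtracking_pick settles on the older pair, mirroring the pbr==0.11.1 / oslo-i18n==2.1.0 pins in the backtracking example above.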
    
    Contributor checklist
    • [x] Provided the tests for the changes.
    • [x] Assure PR title is short, clear, and good to be included in the user-oriented changelog
    Maintainer checklist
    • [x] Assure one of these labels is present: backwards incompatible, feature, enhancement, deprecation, bug, dependency, docs or skip-changelog as they determine changelog listing.
    • [x] Assign the PR to an existing or new milestone for the target version (following Semantic Versioning).
    enhancement resolver 
    opened by atugushev 65
  • Annotate primary requirements and VCS dependencies

    Annotate primary requirements and VCS dependencies

    Resolves #881 Resolves #293

    This change brings annotations to primary requirements in the compilation output.

    The annotation may be merely a reqs-in source:

    django-debug-toolbar==2.2  # via -r requirements.in (line 2)
    

    or it may additionally include reverse dependencies:

    django==3.0.3             # via -r requirements.in (line 1), django-debug-toolbar
    

    Existing tests are modified to either adjust their expectations, or compile with --no-annotations if annotations are irrelevant. Two tests have been inverted and renamed:

    -test_format_requirement_not_for_primary
    +test_format_requirement_for_primary
    -test_format_requirement_not_for_primary_lower_case
    +test_format_requirement_for_primary_lower_case
    

    Changelog-friendly one-liner: Primary requirements and VCS dependencies now get annotated with any source .in files and reverse dependencies

    Contributor checklist
    • [x] Provided the tests for the changes.
    • [x] Gave a clear one-line description in the PR (that the maintainers can add to CHANGELOG.md on release).
    • [x] Assign the PR to an existing or new milestone for the target version (following Semantic Versioning).

    Please review these changes with the following questions in mind:

    1. Is the new annotation string as desired?
    2. Is it desirable to add an option to disable this new behavior?
    3. Should the non-annotation-focused tests needing modification be, generally, expecting an exact annotation? Or should we simply use --no-annotate?
    4. In which cases might an InstallRequirement's comes_from attribute be a str, or an InstallRequirement? The following alternatives seem to result in the same output. Which is saner or preferred, or handles potential edge cases better?
     required_by |= {
         src_ireq.comes_from
    -    for src_ireq in ireq._source_ireqs
         if isinstance(src_ireq.comes_from, str)
    +    else src_ireq.comes_from.name.lower()
    +    for src_ireq in ireq._source_ireqs
    +    if src_ireq.comes_from
     }
    

    code in context

    1. If/when all looks good, want it squashed?
    enhancement 
    opened by AndydeCleyre 40
  • pip-compile failure pep517/in_process/_in_process.py get_requires_for_build_wheel

    pip-compile failure pep517/in_process/_in_process.py get_requires_for_build_wheel

    My pip-compile was working fine yesterday and is now failing.

    I don't know what changed. I know that my requirements.txt did not change and my pip-tools version did not change either.

    I have seen issues #1535 and #1390 but no workaround works for me.

    Environment Versions

    1. OS Type: Linux
    2. Python version: Python 3.8.10
    3. pip version: pip 21.3.1
    4. pip-tools version: pip-compile, version 6.4.0

    Steps to replicate

    $ cat setup.py
    from setuptools import setup
    
    
    setup(
        name='apollo',
        install_requires=['conan==1.45.0']
        )
    
    $ cat build/requirements.txt
    bottle==0.12.19
        # via conan
    certifi==2021.10.8
        # via requests
    charset-normalizer==2.0.12
        # via requests
    colorama==0.4.4
        # via conan
    conan==1.45.0
        # via apollo (setup.py)
    distro==1.6.0
        # via conan
    fasteners==0.17.3
        # via conan
    idna==3.3
        # via requests
    jinja2==3.0.3
        # via conan
    markupsafe==2.1.0
        # via jinja2
    node-semver==0.6.1
        # via conan
    patch-ng==1.17.4
        # via conan
    pluginbase==1.0.1
        # via conan
    pygments==2.11.2
        # via conan
    pyjwt==1.7.1
        # via conan
    python-dateutil==2.8.2
        # via conan
    pyyaml==5.4.1
        # via conan
    requests==2.27.1
        # via conan
    six==1.16.0
        # via
        #   conan
        #   python-dateutil
    tqdm==4.62.3
        # via conan
    urllib3==1.26.8
        # via
        #   conan
        #   requests
    

    Expected result

    success

    Actual result

    $ pip-compile --output-file build/requirements.txt
    ERROR: WARNING: You are using pip version 21.3.1; however, version 22.0.4 is available.
    ERROR: You should consider upgrading via the '/data/homes/jcoulon/.gradle/pyvenv-apollo/bin/python -m pip install --upgrade pip' command.
    Traceback (most recent call last):
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/bin/pip-compile", line 8, in <module>
        sys.exit(cli())
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/click/core.py", line 1128, in __call__
        return self.main(*args, **kwargs)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/click/core.py", line 1053, in main
        rv = self.invoke(ctx)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/click/core.py", line 1395, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/click/core.py", line 754, in invoke
        return __callback(*args, **kwargs)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/click/decorators.py", line 26, in new_func
        return f(get_current_context(), *args, **kwargs)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/piptools/scripts/compile.py", line 408, in cli
        dist = meta.load(os.path.dirname(os.path.abspath(src_file)))
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/meta.py", line 71, in load
        path = Path(build_as_zip(builder))
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/meta.py", line 58, in build_as_zip
        builder(dest=out_dir)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/meta.py", line 53, in build
        _prep_meta(hooks, env, dest)
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/meta.py", line 28, in _prep_meta
        reqs = hooks.get_requires_for_build_wheel({})
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/wrappers.py", line 172, in get_requires_for_build_wheel
        return self._call_hook('get_requires_for_build_wheel', {
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/wrappers.py", line 322, in _call_hook
        self._subprocess_runner(
      File "/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/wrappers.py", line 75, in quiet_subprocess_runner
        check_output(cmd, cwd=cwd, env=env, stderr=STDOUT)
      File "/usr/lib/python3.8/subprocess.py", line 415, in check_output
        return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
      File "/usr/lib/python3.8/subprocess.py", line 516, in run
        raise CalledProcessError(retcode, process.args,
    subprocess.CalledProcessError: Command '['/data/homes/jcoulon/.gradle/pyvenv-apollo/bin/python', '/data/homes/jcoulon/.gradle/pyvenv-apollo/lib/python3.8/site-packages/pep517/in_process/_in_process.py', 'get_requires_for_build_wheel', '/tmp/tmpr0u030tn']' returned non-zero exit status 1.
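
The CalledProcessError hides the build backend's own output, so the real setup.py failure never reaches the console. One way to surface it (assuming nothing beyond pip itself, run in the same virtualenv) is to build the wheel directly, which prints the backend's stderr:

```
$ python -m pip wheel --no-deps .
```

If that command fails, its output usually names the actual culprit that pip-compile's pep517 wrapper swallowed.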
    
    opened by jeremy-coulon 38
  • Pip10 update

    Update pip-tools for pip10 compatibility (and backwards compatibility with pip9)

    Contributor checklist
    • [x] Provided the tests for the changes
    • [x] Requested (or received) a review from another contributor
    • [x] Gave a clear one-line description in the PR (that the maintainers can add to CHANGELOG.md afterwards).

    /cc @vphilippon

    I didn't add tests, since I just rebuilt existing functionality to work with pip 10 as well. This should work with both versions of pip, and it's all green locally.

    opened by techalchemy 38
  • Dependency handling in requirements when updating packages

    So, here is the promised brain dump, sorry for the length.

    Right now naively updating requirements can lead to dependency conflicts. For instance, let's say I want to add raven to my project but pinned to a specific version:

    $ pip install raven==1.9.4
    …
    Successfully installed raven simplejson
    

    So raven needs simplejson. Now I run pip freeze and get in my requirements.txt:

    raven==1.9.4
    simplejson==2.4.0
    

    Some time later I run pip-review and get (this is not what you'd get right now):

    raven==2.0.2 is available (you have 1.9.4)
    simplejson==2.6.2 is available (you have 2.4.0)
    

    Note that the newer simplejson was already available when I initially installed raven, but raven needed simplejson>=2.3.0,<2.5.0. Raven 2.0.2 does as well, but this still encourages me to upgrade simplejson when I shouldn't.

    The current version of raven dropped the >=2.3.0,<2.5.0 part so now we can get the latest and greatest raven and simplejson safely.

    My point is that when updating dependencies, checking for conflicts is very hard to do by hand. This needs to be automated with a tool that yells at the developer when an update leads to a version conflict.

    Ruby gets this right with Bundler. gem install bundle, create a Gemfile with the following content:

    source :rubygems
    gem 'compass-less-plugin'
    

    And run bundle install. This installs the required package and its dependencies and creates a Gemfile.lock file:

    GEM
      remote: http://rubygems.org/
      specs:
        chunky_png (1.2.6)
        compass (0.12.2)
          chunky_png (~> 1.2)
          fssm (>= 0.2.7)
          sass (~> 3.1)
        compass-less-plugin (1.0)
          compass (>= 0.10)
        fssm (0.2.9)
        sass (3.2.1)
    
    PLATFORMS
      ruby
    
    DEPENDENCIES
      compass-less-plugin
    

    Gemfile.lock is like requirements.txt with pinned versions (not everything is pinned here but should probably be): when creating a new environment and running bundle install, bundler looks at the .lock file to install what's specified.

    Then there is a bunch of commands that bundle provides. For instance, to list available updates (running this on a bundle created months ago):

    $ bundle outdated
    Fetching gem metadata from http://rubygems.org/.....
    
    Outdated gems included in the bundle:
      * chunky_png (1.2.6 > 1.2.5)
      * fssm (0.2.9 > 0.2.8.1)
      * sass (3.2.1 > 3.1.12)
      * compass (0.12.2 > 0.11.7)
    

    Updating compass-less-plugin and its dependencies can be done in one command (bundle update compass-less-plugin) and does so while checking for version conflicts.

    Sorry if you're already familiar with all this. Now I'll try to explain how we can improve requirements.txt by using this approach.

    First, instead of putting all the requirements in requirements.txt, people would only list first-level deps, pinned. So for raven:

    raven==1.9.4
    

    Then some tool provided by pip-tools compiles this into the full requirements list, into another file (like Gemfile and Gemfile.lock, but with less noise):

    raven==1.9.4
    simplejson==2.4.0
    

    The key point is that this tool builds the whole dependency tree for all the top-level requirements and dumps it as a safely-installable-with-no-conflicts requirements file, which pip can just use.

    So next time raven is updated and doesn't require an old simplejson, the tool can update the simplejson requirement. When raven drops simplejson to use python's built-in json implementation, the 2nd-level requirement can be dropped as well, automatically.

    Other use case: requests, which used to depend on oauthlib, certifi, and chardet but doesn't anymore (and oauthlib needed rsa or pyasn1 or whatever). If I just need requests, I'll list it in my top-level requirements, and the tool will pin its dependencies, or drop them when they're no longer needed after I upgrade requests itself.

    And finally, this tool could prevent me from installing package X and Y which need Z<1.0 and Z>1.1.
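
    The conflict check proposed here can be sketched as a toy (the data model below is made up for illustration, not pip-tools code): every requirer contributes a version range per package, the tool intersects the ranges, and an empty intersection means the update must be refused.

    ```python
    # Toy conflict check: requirements are (package, min_version, max_version)
    # tuples with versions as comparable tuples; None means "unbounded".

    def merge(reqs):
        """Intersect the version ranges placed on each package.

        Raises ValueError when two requirers leave no version in common.
        """
        merged = {}
        for name, lo, hi in reqs:
            cur_lo, cur_hi = merged.get(name, (None, None))
            lo = max((v for v in (cur_lo, lo) if v is not None), default=None)
            hi = min((v for v in (cur_hi, hi) if v is not None), default=None)
            if lo is not None and hi is not None and lo >= hi:
                raise ValueError(f"conflict on {name}: >={lo}, <{hi}")
            merged[name] = (lo, hi)
        return merged

    # X needs Z<1.0 while Y needs Z>=1.1 -- the tool should yell:
    try:
        merge([("Z", None, (1, 0)), ("Z", (1, 1), None)])
    except ValueError as exc:
        print(exc)
    ```

    A real implementation would parse specifier sets and walk transitive dependencies, but the failure mode is the same: the intersection of >=1.1 and <1.0 is empty, which is exactly the X/Y/Z case above.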

    That's the theory; I think pip already does some version-conflict checks, but that's not enough to guarantee safe updates. Now in practice, I think the dependency information is not provided by the PyPI API, so the whole package has to be fetched to actually extract it (or maybe create.io provides that info). So that's annoying but doable, and pip-tools seems like a nice place to experiment with such things.

    I think buildout does check for dependency conflicts but I never managed to wrap my head around it.

    What do you think? I'm happy to start a proof-of-concept that could be integrated in this project.

    opened by brutasse 38
  • Workflow for layered requirements (e.g. prod<-test<-dev requirements)?

    Say I have

    requirements.in:

    Django~=1.8.0
    

    And also

    requirements-dev.in:

    django-debug-toolbar
    

    How can I run pip-compile on requirements-dev.in, where it will also take into account the requirements in requirements.in when figuring out which versions to use?

    For now I have an ad-hoc script that compiles requirements.in first, then requirements-dev.in has -r requirements.txt as its first line. Is this an okay workflow? I'm worried that in the future if I add a dependency it will try and update a bunch of stuff I don't want it to update, but I haven't actually used this tool long enough to determine whether that's truly a problem. Wondering if anyone else has used pip-tools in this fashion and has any advice?
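
    One way to make the dev compile respect the prod pins without the manual two-step is a constraint reference instead of -r (a sketch, assuming pip-compile honours -c lines in .in files the way pip does in requirements files):

    ```
    # requirements-dev.in
    -c requirements.txt        # constrain dev resolution by the prod pins
    django-debug-toolbar
    ```

    Compile requirements.in first, then requirements-dev.in: the -c line keeps every already-pinned version fixed while resolving the dev-only additions, so adding a dev dependency won't silently bump production packages.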

    PR wanted docs 
    opened by dan-passaro 37
  • Periods get converted to dashes in package name

    I have a requirements.txt file with the following:

    My.Package~=1.0
    My.Sub.Package~=1.1
    

    When I run python -m piptools compile, the periods in the package name get converted to dashes:

    my-package==1.0.0
        # via -r requirements.txt
    my-sub-package==1.1.1
        # via
        #   -r requirements.txt
        #   my.package
    

    I need the periods to stay periods. I have no control over the names of the packages. Some package names may have both periods and dashes, such as My.More-Complex.Package, and I don't want that to change to my-more-complex-package.

    Yes, the output file above functions correctly, but we're doing extra parsing on it that's breaking because the periods are now dashes.

    Edit: Per the comment below, if the periods are converted to dashes to be consistent with pip, then I'd prefer that all periods be converted to dashes, including # my.package becoming # my-package in the example above.
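
    The renaming the reporter describes is pip's standard name normalization (PEP 503), which a short stdlib sketch reproduces:

    ```python
    import re

    # PEP 503 name normalization, as pip applies it: runs of '-', '_' and '.'
    # collapse to a single '-', and the result is lowercased.
    def normalize(name: str) -> str:
        return re.sub(r"[-_.]+", "-", name).lower()

    print(normalize("My.Package"))               # my-package
    print(normalize("My.More-Complex.Package"))  # my-more-complex-package
    ```

    This is why My.More-Complex.Package cannot round-trip: normalization is lossy by design, so tooling that parses requirements.txt should normalize both sides before comparing names rather than expecting the original spelling back.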

    writer 
    opened by sawatsky 36
  • Add --newline=[LF|CRLF|native|preserve] option to compile, to override the line separator characters used

    pip-compile gains an option with ~two~ ~three~ four valid choices: --newline=[LF|CRLF|native|preserve], which can be used to override the guessed newline character used in the output file. The default is ~native~ preserve, which ~uses os.linesep~ tries to be consistent with an existing output file, or input file, or ~FALLBACK_VALUE (native, or LF? TBD)~ falls back to LF, in that order.

    This aims to address #1448.

    ~Note: poll for fallback value~
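
    The preserve behaviour described above can be sketched like this (hypothetical helper, not the PR's actual code):

    ```python
    # "preserve": reuse the newline style already present in a file's text,
    # falling back to LF when there is nothing to preserve.
    def detect_newline(text: str, fallback: str = "\n") -> str:
        if "\r\n" in text:
            return "\r\n"
        if "\n" in text:
            return "\n"
        return fallback

    assert detect_newline("a\r\nb\r\n") == "\r\n"  # CRLF file stays CRLF
    assert detect_newline("a\nb\n") == "\n"        # LF file stays LF
    assert detect_newline("") == "\n"              # nothing to preserve: LF
    ```

    Writing the output with open(path, "w", newline=detect_newline(existing_text)) then emits the chosen separator verbatim, since a non-None newline argument disables Python's universal-newline translation.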

    Contributor checklist
    • [ ] Provided the tests for the changes.
    • [ ] Assure PR title is short, clear, and good to be included in the user-oriented changelog
    Maintainer checklist
    • [ ] Assure one of these labels is present: backwards incompatible, feature, enhancement, deprecation, bug, dependency, docs or skip-changelog as they determine changelog listing.
    • [ ] Assign the PR to an existing or new milestone for the target version (following Semantic Versioning).
    opened by AndydeCleyre 36
  • virtualenv issue

    I'm not exactly sure what's going on, but with a barebones requirements.txt file within a virtualenv, pip-sync is failing.

    (venv)➜  pip-tools  pip list
    pip (7.1.2)
    setuptools (18.2)
    wheel (0.24.0)
    (venv)➜  pip-tools  pip-sync
    Cannot uninstall requirement appnope, not installed
    Traceback (most recent call last):
      File "/usr/local/bin/pip-sync", line 11, in <module>
        sys.exit(cli())
      File "/usr/local/lib/python2.7/site-packages/click/core.py", line 716, in __call__
        return self.main(*args, **kwargs)
      File "/usr/local/lib/python2.7/site-packages/click/core.py", line 696, in main
        rv = self.invoke(ctx)
      File "/usr/local/lib/python2.7/site-packages/click/core.py", line 889, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/usr/local/lib/python2.7/site-packages/click/core.py", line 534, in invoke
        return callback(*args, **kwargs)
      File "/usr/local/lib/python2.7/site-packages/piptools/scripts/sync.py", line 68, in cli
        pip_flags=pip_flags))
      File "/usr/local/lib/python2.7/site-packages/piptools/sync.py", line 137, in sync
        check_call(['pip', 'uninstall', '-y'] + pip_flags + sorted(to_uninstall))
      File "/usr/local/Cellar/python/2.7.10_2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 540, in check_call
        raise CalledProcessError(retcode, cmd)
    subprocess.CalledProcessError: Command '['pip', 'uninstall', '-y', 'appnope', 'aws-shell', 'awscli', 'boto3', 'botocore', 'colorama', 'configobj', 'decorator', 'docutils', 'flake8', 'futures', 'gnureadline', 'ipython', 'ipython-genutils', 'isort', 'jmespath', 'mccabe', 'path.py', 'pep8', 'pexpect', 'pickleshare', 'prompt-toolkit', 'ptyprocess', 'pyasn1', 'pyflakes', 'pygments', 'python-dateutil', 'requests', 'rsa', 'simplegeneric', 'speedtest-cli', 'traitlets', 'virtualenv', 'wcwidth']' returned non-zero exit status 1
    

    In my current directory and virtual environment, pip-sync is trying to uninstall globally installed packages.

    Possibly related to #277.
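
    The traceback shows sync spawning the bare pip found on PATH (check_call(['pip', 'uninstall', '-y'] ...)), which is not necessarily the active virtualenv's pip. Pinning the command to the running interpreter avoids that mismatch; a minimal sketch:

    ```python
    import sys

    # Build the pip command against the interpreter that is actually running,
    # instead of whatever "pip" happens to be first on PATH.
    def pip_cmd(*args: str):
        return [sys.executable, "-m", "pip", *args]

    print(pip_cmd("uninstall", "-y", "appnope"))
    ```

    With this, the uninstall always targets the environment the script itself runs in, so globally installed packages are left alone.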

    opened by zackhsi 35
  • pip-review?

    Hello,

    Thanks for all the great work on pip-tools! I like the way that the project is heading (pip-compile and pip-sync look pretty cool).

    I just noticed that the latest release has removed pip-review. I was wondering what the new equivalent is (as I found this tool very useful)?

    e.g. every night I build a new pyvenv, install the required pip libraries, and run pip-review to send an email out to the devs letting them know if any libraries require upgrading.

    opened by gavinjackson 33
  • Add `--no-index` flag to `pip-compile`

    Now we can finally bring back pip-compile --no-index, for consistency with pip-sync --no-index and pip install --no-index.

    Background:

    • https://github.com/jazzband/pip-tools/issues/373
    • https://github.com/jazzband/pip-tools/pull/1130
    • https://github.com/jazzband/pip-tools/pull/1234
    Contributor checklist
    • [x] Provided the tests for the changes.
    • [x] Assure PR title is short, clear, and good to be included in the user-oriented changelog
    Maintainer checklist
    • [x] Assure one of these labels is present: backwards incompatible, feature, enhancement, deprecation, bug, dependency, docs or skip-changelog as they determine changelog listing.
    • [ ] Assign the PR to an existing or new milestone for the target version (following Semantic Versioning).
    enhancement cli 
    opened by atugushev 0
  • Fix flaky tests that depend on verbosity level

    Partially addresses #1720.

    log.verbosity is not thread-safe. Better to pass verbosity in via dependency injection to avoid flaky tests.
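
    The dependency-injection suggestion can be sketched like this (hypothetical classes, not the actual piptools API):

    ```python
    # Instead of reading a module-global log.verbosity (shared and racy when
    # tests run in parallel threads), inject the verbosity each component uses.
    class Logger:
        def __init__(self, verbosity: int = 0):
            self.verbosity = verbosity

        def log(self, msg: str, level: int = 0):
            # Emit only when the injected verbosity allows it.
            return msg if level <= self.verbosity else None

    quiet, verbose = Logger(0), Logger(2)
    assert quiet.log("debug detail", level=2) is None
    assert verbose.log("debug detail", level=2) == "debug detail"
    ```

    Each test then constructs its own Logger, so one test tweaking verbosity can no longer leak into another running concurrently.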

    Contributor checklist
    • [ ] Provided the tests for the changes.
    • [ ] Assure PR title is short, clear, and good to be included in the user-oriented changelog
    Maintainer checklist
    • [ ] Assure one of these labels is present: backwards incompatible, feature, enhancement, deprecation, bug, dependency, docs or skip-changelog as they determine changelog listing.
    • [ ] Assign the PR to an existing or new milestone for the target version (following Semantic Versioning).
    tests skip-changelog 
    opened by atugushev 1
  • Introduce `--add-package/-A` pip-compile flag

    See #1730 for context. This new flag allows specifying packages on the command line instead of forcing them to come from a requirements file. May offer improved usage compared to piping from stdin as described in the linked issue.

    Let me know your thoughts 🙂

    (I'm waiting to add tests until after I hear whether the implementation is favorable compared to e.g. using a different flag name, supplementing pip-args, etc.)

    Contributor checklist
    • [ ] Provided the tests for the changes.
    • [x] Assure PR title is short, clear, and good to be included in the user-oriented changelog
    Maintainer checklist
    • [ ] Assure one of these labels is present: backwards incompatible, feature, enhancement, deprecation, bug, dependency, docs or skip-changelog as they determine changelog listing.
    • [ ] Assign the PR to an existing or new milestone for the target version (following Semantic Versioning).
    opened by ntjess 0
  • Feature: Adding dependencies through CLI

    What's the problem this feature will solve?

    Some libraries allow many backing modules for the same functionality. Two very popular examples are OpenCV and Qt. The python module cv2 is provided by opencv-python-headless, opencv-contrib-python, opencv-python, and a few more PyPI packages. Similarly, Qt APIs are available via PySide2, PySide6, etc.

    Often, rather than adding an extra argument to pip installs, developers will simply check if one of many compatible libraries is available and raise an error otherwise. qtpy and pyqtgraph are popular examples of this: Qt is not specified in their requirements file; the user is expected to install this separately and the import fails if it isn't found. pip-compile doesn't handle these cases well -- currently, the only way requirements can be specified is through a requirements file.

    TLDR: It is cumbersome to install packages that should exist in the user's environment, but did not come from requirements.txt.

    Describe the solution you'd like

    Ideally, pip-compile can support cli dependencies in its pip-args flag, since according to the docs:

    Any valid pip flags or arguments may be passed on with pip-compile’s --pip-args option

    pip-compile ./pyproject.toml --pip-args "pyside6 opencv-python-headless"
    # Or a separate flag like --inline-requirements "opencv-python-headless"
    

    This would allow pip-compile to handle these cases without forcing a requirements-tmp.txt for dependencies not managed or specified by the library at hand, and fits within the pip-args capabilities of adding cli dependencies.

    Alternative Solutions

    Devs can make an additional requirements file that specifies these needs, i.e.:

    pip-compile pyproject.toml requirements-cv2.txt
    

    which holds the relevant information. However, this leads to quite a bit of repo bloat if a separate additional requirements file must be specified for each similar (single) dependency. This is why e.g. packages like qtpy don't have an explicit requirements.txt file indicating this information.

    Additional context

    I would be happy to work on the PR if this idea is positively received. The solution should be quite easy to implement using pip-compile's existing logic, by simply extending reqs to include these additional options. It has the nice benefit of still printing in the "autogenerated" message, so reproducibility will not be lost.

    As a side note, the CLI already has an (undocumented) ability to accept requirements from stdin, so non-requirements.txt dependencies are already considered appropriate with existing logic.

    As another side note, stdin dependencies are lost in the "autogeneration" comment! This solution has the added benefit of retaining those dependencies across multiple runs of pip-compile, since stdin can simply be redirected as an inline-requirements arg.

    I.e. run echo pyside6 | pip-compile - -o requirements.txt and you will get a file with this comment at the top:

    #
    # This file is autogenerated by pip-compile with python 3.9
    # To update, run:
    #
    #    pip-compile --output-file='.\requirements.txt' -
    

    Note! The original stdin text, pyside6, is lost. This would not happen if it was stored as a CLI option.

    feature request cli 
    opened by ntjess 1
  • Flaky tests

    test_bad_setup_file:

    • first appearance on d8e272f8d8af0ff2d0dfa07af43f5a6cfa4ed211
    • failed on 1c67754c2654873456dd7970a596528515898e3a
    • passed on 5b3fa3cc9fbdaa92b72bbe96286c82ecfe1da78a
    Details
    def test_bad_setup_file(runner):
            with open("setup.py", "w") as package:
                package.write("BAD SYNTAX")
        
            out = runner.invoke(cli, [])
        
    >       assert out.exit_code == 2
    E       AssertionError: assert 1 == 2
    E        +  where 1 = <Result CalledProcessError(1, ['/private/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/build-env-gdwusm65/bin/python...p517', '--no-warn-script-location', '-r', '/var/folders/24/8k48jl6d249_n_qfxwsl6xvm0000gn/T/build-reqs-ynj6e9y3.txt'])>.exit_code
    

    test_direct_reference_with_extras:

    • failed
    • passed on a92a6af3245cdfac5512210c045735821513055f
    • I've noticed test_direct_reference_with_extras doesn't have a @pytest.mark.network marker.

    ✅ sync tests - addressed in #1743

    • test_sync_install_temporary_requirement_file
    • test_sync_up_to_date
    • test_sync_verbose
    • test_sync_uninstall_pip_command
    • https://github.com/jazzband/pip-tools/actions/runs/3515128716/jobs/5890003279
    Details
    2022-11-21T14:12:58.8111071Z _________________ test_sync_install_temporary_requirement_file _________________
    2022-11-21T14:12:58.8111819Z [gw1] linux -- Python 3.7.15 /home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python
    2022-11-21T14:12:58.8112434Z 
    2022-11-21T14:12:58.8112749Z from_line = <function install_req_from_line at 0x7f87d86bf4d0>
    2022-11-21T14:12:58.8113460Z from_editable = <function install_req_from_editable at 0x7f87d86bf290>
    2022-11-21T14:12:58.8114104Z mocked_tmp_req_file = <MagicMock name='NamedTemporaryFile()' id='140221262482128'>
    2022-11-21T14:12:58.8114907Z 
    2022-11-21T14:12:58.8115409Z     def test_sync_install_temporary_requirement_file(
    2022-11-21T14:12:58.8116116Z         from_line, from_editable, mocked_tmp_req_file
    2022-11-21T14:12:58.8116515Z     ):
    2022-11-21T14:12:58.8118554Z         with mock.patch("piptools.sync.run") as run:
    2022-11-21T14:12:58.8119006Z             to_install = {from_line("django==1.8")}
    2022-11-21T14:12:58.8119727Z             sync(to_install, set())
    2022-11-21T14:12:58.8120142Z             run.assert_called_once_with(
    2022-11-21T14:12:58.8121000Z                 [sys.executable, "-m", "pip", "install", "-r", mocked_tmp_req_file.name],
    2022-11-21T14:12:58.8121460Z >               check=True,
    2022-11-21T14:12:58.8122018Z             )
    2022-11-21T14:12:58.8122295Z 
    2022-11-21T14:12:58.8122829Z tests/test_sync.py:275: 
    2022-11-21T14:12:58.8123244Z _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
    2022-11-21T14:12:58.8124129Z /opt/hostedtoolcache/Python/3.7.15/x64/lib/python3.7/unittest/mock.py:889: in assert_called_once_with
    2022-11-21T14:12:58.8124666Z     return self.assert_called_with(*args, **kwargs)
    2022-11-21T14:12:58.8125306Z _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
    2022-11-21T14:12:58.8125613Z 
    2022-11-21T14:12:58.8126178Z _mock_self = <MagicMock name='run' id='140221260076624'>
    2022-11-21T14:12:58.8126941Z args = (['/home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python', '-m', 'pip', 'install', '-r', 'requirements.txt'],)
    2022-11-21T14:12:58.8127727Z kwargs = {'check': True}
    2022-11-21T14:12:58.8128466Z expected = ((['/home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python', '-m', 'pip', 'install', '-r', 'requirements.txt'],), {'check': True})
    2022-11-21T14:12:58.8129349Z _error_message = <function NonCallableMock.assert_called_with.<locals>._error_message at 0x7f87ce689710>
    2022-11-21T14:12:58.8130190Z actual = call(['/home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python', '-m', 'pip', 'install', '-r', 'requirements.txt', '-q'], check=True)
    2022-11-21T14:12:58.8131102Z cause = None
    2022-11-21T14:12:58.8131387Z 
    2022-11-21T14:12:58.8135362Z     def assert_called_with(_mock_self, *args, **kwargs):
    2022-11-21T14:12:58.8135953Z         """assert that the mock was called with the specified arguments.
    2022-11-21T14:12:58.8136602Z     
    2022-11-21T14:12:58.8137051Z         Raises an AssertionError if the args and keyword args passed in are
    2022-11-21T14:12:58.8137737Z         different to the last call to the mock."""
    2022-11-21T14:12:58.8138150Z         self = _mock_self
    2022-11-21T14:12:58.8138740Z         if self.call_args is None:
    2022-11-21T14:12:58.8139180Z             expected = self._format_mock_call_signature(args, kwargs)
    2022-11-21T14:12:58.8140073Z             raise AssertionError('Expected call: %s\nNot called' % (expected,))
    2022-11-21T14:12:58.8140512Z     
    2022-11-21T14:12:58.8142017Z         def _error_message():
    2022-11-21T14:12:58.8142552Z             msg = self._format_mock_failure_message(args, kwargs)
    2022-11-21T14:12:58.8143184Z             return msg
    2022-11-21T14:12:58.8143598Z         expected = self._call_matcher((args, kwargs))
    2022-11-21T14:12:58.8144254Z         actual = self._call_matcher(self.call_args)
    2022-11-21T14:12:58.8144681Z         if expected != actual:
    2022-11-21T14:12:58.8145338Z             cause = expected if isinstance(expected, Exception) else None
    2022-11-21T14:12:58.8145841Z >           raise AssertionError(_error_message()) from cause
    2022-11-21T14:12:58.8146948Z E           AssertionError: Expected call: run(['/home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python', '-m', 'pip', 'install', '-r', 'requirements.txt'], check=True)
    2022-11-21T14:12:58.8147958Z E           Actual call: run(['/home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python', '-m', 'pip', 'install', '-r', 'requirements.txt', '-q'], check=True)
    2022-11-21T14:12:58.8148640Z 
    2022-11-21T14:12:58.8149374Z /opt/hostedtoolcache/Python/3.7.15/x64/lib/python3.7/unittest/mock.py:878: AssertionError
    2022-11-21T14:12:58.8150269Z _____________________________ test_sync_up_to_date _____________________________
    2022-11-21T14:12:58.8150938Z [gw1] linux -- Python 3.7.15 /home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python
    2022-11-21T14:12:58.8151533Z 
    2022-11-21T14:12:58.8151873Z capsys = <_pytest.capture.CaptureFixture object at 0x7f87ce7de910>
    2022-11-21T14:12:58.8152573Z runner = <click.testing.CliRunner object at 0x7f87ce5f6310>
    2022-11-21T14:12:58.8152920Z 
    2022-11-21T14:12:58.8153390Z     def test_sync_up_to_date(capsys, runner):
    2022-11-21T14:12:58.8153785Z         """
    2022-11-21T14:12:58.8154471Z         Everything up-to-date should be printed.
    2022-11-21T14:12:58.8154882Z         """
    2022-11-21T14:12:58.8155437Z         sync(set(), set())
    2022-11-21T14:12:58.8155842Z         captured = capsys.readouterr()
    2022-11-21T14:12:58.8156603Z >       assert captured.out.splitlines() == ["Everything up-to-date"]
    2022-11-21T14:12:58.8157733Z E       AssertionError: assert [] == ['Everything up-to-date']
    2022-11-21T14:12:58.8158350Z E         Right contains one more item: 'Everything up-to-date'
    2022-11-21T14:12:58.8159865Z E         Full diff:
    2022-11-21T14:12:58.8160204Z E         - ['Everything up-to-date']
    2022-11-21T14:12:58.8160503Z E         + []
    2022-11-21T14:12:58.8160639Z 
    2022-11-21T14:12:58.8160932Z /home/runner/work/pip-tools/pip-tools/tests/test_sync.py:377: AssertionError
    2022-11-21T14:12:58.8161395Z ______________________________ test_sync_verbose _______________________________
    2022-11-21T14:12:58.8161894Z [gw1] linux -- Python 3.7.15 /home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python
    2022-11-21T14:12:58.8162134Z 
    2022-11-21T14:12:58.8162317Z run = <MagicMock name='run' id='140221259670224'>
    2022-11-21T14:12:58.8162779Z from_line = <function install_req_from_line at 0x7f87d86bf4d0>
    2022-11-21T14:12:58.8162964Z 
    2022-11-21T14:12:58.8163093Z     @mock.patch("piptools.sync.run")
    2022-11-21T14:12:58.8163374Z     def test_sync_verbose(run, from_line):
    2022-11-21T14:12:58.8163604Z         """
    2022-11-21T14:12:58.8163942Z         The -q option has to be passed to every pip calls.
    2022-11-21T14:12:58.8164185Z         """
    2022-11-21T14:12:58.8164444Z         sync({from_line("django==1.8")}, {from_line("click==4.0")})
    2022-11-21T14:12:58.8164726Z         assert run.call_count == 2
    2022-11-21T14:12:58.8164979Z         for call in run.call_args_list:
    2022-11-21T14:12:58.8165226Z             run_args = call[0][0]
    2022-11-21T14:12:58.8165513Z >           assert "-q" not in run_args
    2022-11-21T14:12:58.8166127Z E           AssertionError: assert '-q' not in ['/home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python', '-m', 'pip', 'uninstall', '-y', '-q', ...]
    2022-11-21T14:12:58.8166434Z 
    2022-11-21T14:12:58.8166638Z tests/test_sync.py:390: AssertionError
    2022-11-21T14:12:58.8167066Z _______________________ test_sync_uninstall_pip_command ________________________
    2022-11-21T14:12:58.8167568Z [gw1] linux -- Python 3.7.15 /home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python
    2022-11-21T14:12:58.8167809Z 
    2022-11-21T14:12:58.8168008Z run = <MagicMock name='run' id='140221259671504'>
    2022-11-21T14:12:58.8168171Z 
    2022-11-21T14:12:58.8168294Z     @mock.patch("piptools.sync.run")
    2022-11-21T14:12:58.8168571Z     def test_sync_uninstall_pip_command(run):
    2022-11-21T14:12:58.8168866Z         to_uninstall = ["six", "django", "pytz", "click"]
    2022-11-21T14:12:58.8169104Z     
    2022-11-21T14:12:58.8169315Z         sync(set(), to_uninstall)
    2022-11-21T14:12:58.8169561Z         run.assert_called_once_with(
    2022-11-21T14:12:58.8169972Z             [sys.executable, "-m", "pip", "uninstall", "-y", *sorted(to_uninstall)],
    2022-11-21T14:12:58.8170246Z >           check=True,
    2022-11-21T14:12:58.8170455Z         )
    2022-11-21T14:12:58.8170572Z 
    2022-11-21T14:12:58.8170739Z tests/test_sync.py:480: 
    2022-11-21T14:12:58.8171081Z _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
    2022-11-21T14:12:58.8171580Z /opt/hostedtoolcache/Python/3.7.15/x64/lib/python3.7/unittest/mock.py:889: in assert_called_once_with
    2022-11-21T14:12:58.8171949Z     return self.assert_called_with(*args, **kwargs)
    2022-11-21T14:12:58.8172223Z _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
    2022-11-21T14:12:58.8172377Z 
    2022-11-21T14:12:58.8172588Z _mock_self = <MagicMock name='run' id='140221259671504'>
    2022-11-21T14:12:58.8173139Z args = (['/home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python', '-m', 'pip', 'uninstall', '-y', 'click', ...],)
    2022-11-21T14:12:58.8173550Z kwargs = {'check': True}
    2022-11-21T14:12:58.8174106Z expected = ((['/home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python', '-m', 'pip', 'uninstall', '-y', 'click', ...],), {'check': True})
    2022-11-21T14:12:58.8174630Z _error_message = <function NonCallableMock.assert_called_with.<locals>._error_message at 0x7f87ce6890e0>
    2022-11-21T14:12:58.8175324Z actual = call(['/home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python', '-m', 'pip', 'uninstall', '-y', '-q', 'click', 'django', 'pytz', 'six'], check=True)
    2022-11-21T14:12:58.8175704Z cause = None
    2022-11-21T14:12:58.8175836Z 
    2022-11-21T14:12:58.8175977Z     def assert_called_with(_mock_self, *args, **kwargs):
    2022-11-21T14:12:58.8176313Z         """assert that the mock was called with the specified arguments.
    2022-11-21T14:12:58.8176577Z     
    2022-11-21T14:12:58.8176855Z         Raises an AssertionError if the args and keyword args passed in are
    2022-11-21T14:12:58.8177188Z         different to the last call to the mock."""
    2022-11-21T14:12:58.8177429Z         self = _mock_self
    2022-11-21T14:12:58.8177663Z         if self.call_args is None:
    2022-11-21T14:12:58.8177962Z             expected = self._format_mock_call_signature(args, kwargs)
    2022-11-21T14:12:58.8178485Z             raise AssertionError('Expected call: %s\nNot called' % (expected,))
    2022-11-21T14:12:58.8178763Z     
    2022-11-21T14:12:58.8178957Z         def _error_message():
    2022-11-21T14:12:58.8179241Z             msg = self._format_mock_failure_message(args, kwargs)
    2022-11-21T14:12:58.8179501Z             return msg
    2022-11-21T14:12:58.8179802Z         expected = self._call_matcher((args, kwargs))
    2022-11-21T14:12:58.8180096Z         actual = self._call_matcher(self.call_args)
    2022-11-21T14:12:58.8180356Z         if expected != actual:
    2022-11-21T14:12:58.8180644Z             cause = expected if isinstance(expected, Exception) else None
    2022-11-21T14:12:58.8180981Z >           raise AssertionError(_error_message()) from cause
    2022-11-21T14:12:58.8181754Z E           AssertionError: Expected call: run(['/home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python', '-m', 'pip', 'uninstall', '-y', 'click', 'django', 'pytz', 'six'], check=True)
    2022-11-21T14:12:58.8182593Z E           Actual call: run(['/home/runner/work/pip-tools/pip-tools/.tox/pipmain-coverage/bin/python', '-m', 'pip', 'uninstall', '-y', '-q', 'click', 'django', 'pytz', 'six'], check=True)
    2022-11-21T14:12:58.8182928Z 
    2022-11-21T14:12:58.8183248Z /opt/hostedtoolcache/Python/3.7.15/x64/lib/python3.7/unittest/mock.py:878: AssertionError
    
    tests ci 
    opened by atugushev 0
  • pip-sync errors when merging multiple requirements.txt files that point at the same editable install

    pip-sync errors when merging multiple requirements.txt files that point at the same editable install

    pip-sync raises an AttributeError: 'NoneType' object has no attribute 'specifier' error when you try to run it with multiple requirements files that each include an editable install for the same package.

    Environment Versions

    1. OS Type: Linux
    2. Python version: Python 3.9.11
    3. pip version: pip 22.3
    4. pip-tools version: pip-compile, version 6.9.0

    Steps to replicate

    1. Create setup.py, dev_requirements.in, and requirements.in files:
    # setup.py
    from setuptools import find_packages, setup
    setup(name="a", version="0.0.1", packages=find_packages())
    
    # dev_requirements.in
    -e file:.
    
    # requirements.in
    -e file:.
    
    2. Run pip-compile on each input file:
    pip-compile requirements.in
    pip-compile dev_requirements.in
    
    3. Run pip-sync on the compiled files:
    pip-sync requirements.txt dev_requirements.txt
    

    Expected result

    I expected the editable package to be installed once, similar to running pip install -r requirements.txt -r dev_requirements.txt.

    Actual result

    pip-sync raised an error. Full stack trace:

    ➜  scratch pip-sync  requirements.txt dev_requirements.txt 
    Traceback (most recent call last):
      File "/home/vivek/.pyenv/versions/3.9.11/bin/pip-sync", line 8, in <module>
        sys.exit(cli())
      File "/home/vivek/.pyenv/versions/3.9.11/lib/python3.9/site-packages/click/core.py", line 1130, in __call__
        return self.main(*args, **kwargs)
      File "/home/vivek/.pyenv/versions/3.9.11/lib/python3.9/site-packages/click/core.py", line 1055, in main
        rv = self.invoke(ctx)
      File "/home/vivek/.pyenv/versions/3.9.11/lib/python3.9/site-packages/click/core.py", line 1404, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/home/vivek/.pyenv/versions/3.9.11/lib/python3.9/site-packages/click/core.py", line 760, in invoke
        return __callback(*args, **kwargs)
      File "/home/vivek/.pyenv/versions/3.9.11/lib/python3.9/site-packages/piptools/scripts/sync.py", line 146, in cli
        merged_requirements = sync.merge(requirements, ignore_conflicts=force)
      File "/home/vivek/.pyenv/versions/3.9.11/lib/python3.9/site-packages/piptools/sync.py", line 115, in merge
        if ireq.specifier != existing_ireq.specifier:
      File "/home/vivek/.pyenv/versions/3.9.11/lib/python3.9/site-packages/pip/_internal/req/req_install.py", line 245, in specifier
        return self.req.specifier
    AttributeError: 'NoneType' object has no attribute 'specifier'
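    The crash happens because a bare editable like `-e file:.` produces an `InstallRequirement` whose `req` is `None`, so reading `.specifier` inside `merge` blows up. One way to make such a merge tolerant is to compare editables by their source link and only fall back to specifiers when a link is absent. A minimal sketch with stand-in objects (`FakeIReq` and this `merge` are mine, not pip's `InstallRequirement` or piptools' actual code):

    ```python
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class FakeIReq:
        # Stand-ins for the attributes the merge logic touches.
        key: str                  # canonical project name
        link: Optional[str]       # source URL for editables, else None
        specifier: Optional[str]  # version specifier, None for bare editables

    def merge(requirements):
        """Deduplicate requirements, comparing editables by link instead of
        by specifier (which a bare editable does not have)."""
        by_key = {}
        for ireq in requirements:
            existing = by_key.get(ireq.key)
            if existing is None:
                by_key[ireq.key] = ireq
            elif ireq.link is not None and existing.link is not None:
                if ireq.link != existing.link:
                    raise ValueError(f"Conflicting sources for {ireq.key}")
            elif ireq.specifier != existing.specifier:
                raise ValueError(f"Conflicting specifiers for {ireq.key}")
        return list(by_key.values())

    # Two files both declaring '-e file:.' merge into a single entry.
    reqs = merge([FakeIReq("a", "file:.", None), FakeIReq("a", "file:.", None)])
    ```

    With this ordering of checks, duplicate editables collapse into one entry instead of raising, while genuinely conflicting sources still error out.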
    

    ...

    needs reproduce 
    opened by vivster7 2
Releases(6.10.0)
  • 6.10.0(Nov 14, 2022)

    Features:

    • Deprecate pip-compile --resolver=legacy (#1724). Thanks @atugushev
    • Prompt user to use the backtracking resolver on errors (#1719). Thanks @maxfenv
    • Add support for Python 3.11 final (#1708). Thanks @hugovk
    • Add --newline=[LF|CRLF|native|preserve] option to pip-compile (#1652). Thanks @AndydeCleyre

    Bug Fixes:

    • Fix inconsistent handling of constraints comments with backtracking resolver (#1713). Thanks @mkniewallner
    • Fix some encoding warnings in Python 3.10 (PEP 597) (#1614). Thanks @GalaxySnail

    Other Changes:

    • Update pip-tools version in the README's pre-commit examples (#1701). Thanks @Kludex
    • Document use of the backtracking resolver (#1718). Thanks @maxfenv
    • Use HTTPS in a readme link (#1716). Thanks @Arhell
    Source code(tar.gz)
    Source code(zip)
  • 6.9.0(Oct 5, 2022)

    Features:

    • Add --all-extras flag to pip-compile (#1630). Thanks @apljungquist
    • Support Exclude Package with custom unsafe packages (#1509). Thanks @hmc-cs-mdrissi

    Bug Fixes:

    • Fix compile cached vcs packages (#1649). Thanks @atugushev
    • Include py.typed in wheel file (#1648). Thanks @FlorentJeannot

    Other Changes:

    • Add pyproject.toml & modern packaging to introduction. (#1668). Thanks @hynek
    Source code(tar.gz)
    Source code(zip)
  • 6.8.0(Jun 30, 2022)

  • 6.7.0(Jun 28, 2022)

    Features:

    • Support for the importlib.metadata metadata implementation (#1632). Thanks @richafrank

    Bug Fixes:

    • Instantiate a new accumulator InstallRequirement for combine_install_requirements output (#1519). Thanks @richafrank

    Other Changes:

    • Replace direct usage of the pep517 module with the build module, for loading project metadata (#1629). Thanks @AndydeCleyre
    Source code(tar.gz)
    Source code(zip)
  • 6.6.2(May 23, 2022)

  • 6.6.1(May 13, 2022)

  • 6.6.0(Apr 6, 2022)

    Features:

    • Add support for pip>=22.1 (#1607). Thanks @atugushev

    Bug Fixes:

    • Ensure pip-compile --dry-run --quiet still shows what would be done, while omitting the dry run message (#1592). Thanks @AndydeCleyre
    • Fix --generate-hashes when hashes are computed from files (#1540). Thanks @RazerM
    Source code(tar.gz)
    Source code(zip)
  • 6.5.1(Feb 8, 2022)

  • 6.5.0(Feb 4, 2022)

  • 6.4.0(Oct 12, 2021)

  • 6.3.1(Oct 8, 2021)

    Bug Fixes:

    • Ensure pip-tools unions dependencies of multiple declarations of a package with different extras (#1486). Thanks @richafrank
    • Allow comma-separated arguments for --extra (#1493). Thanks @AndydeCleyre
    • Improve clarity of help text for options supporting multiple (#1492). Thanks @AndydeCleyre
    Source code(tar.gz)
    Source code(zip)
  • 6.3.0(Sep 21, 2021)

    Features:

    • Enable single-line annotations with pip-compile --annotation-style=line (#1477). Thanks @AndydeCleyre
    • Generate PEP 440 direct reference whenever possible (#1455). Thanks @FlorentJeannot
    • PEP 440 Direct Reference support (#1392). Thanks @FlorentJeannot

    Bug Fixes:

    • Change log level of hash message (#1460). Thanks @plannigan
    • Allow passing --no-upgrade option (#1438). Thanks @ssbarnea
    Source code(tar.gz)
    Source code(zip)
  • 6.2.0(Jun 22, 2021)

    Features:

    • Add --emit-options/--no-emit-options flags to pip-compile (#1123). Thanks @atugushev
    • Add --python-executable option for pip-sync (#1333). Thanks @MaratFM
    • Log which python version was used during compile (#828). Thanks @graingert

    Bug Fixes:

    • Fix pip-compile package ordering (#1419). Thanks @adamsol
    • Add --strip-extras option to pip-compile for producing constraint compatible output (#1404). Thanks @ssbarnea
    • Fix click v7 version_option compatibility (#1410). Thanks @FuegoFro
    • Pass package_name explicitly in click.version_option decorators for compatibility with click>=8.0 (#1400). Thanks @nicoa

    Other Changes:

    • Document updating requirements with pre-commit hooks (#1387). Thanks @microcat49
    • Add setuptools and wheel dependencies to the setup.cfg (#889). Thanks @jayvdb
    • Improve instructions for new contributors (#1394). Thanks @FlorentJeannot
    • Better explain role of existing requirements.txt (#1369). Thanks @mikepqr
    Source code(tar.gz)
    Source code(zip)
  • 6.1.0(Apr 14, 2021)

    Features:

    • Add support for pyproject.toml or setup.cfg as input dependency file (PEP-517) for pip-compile (#1356). Thanks @orsinium
    • Add pip-compile --extra option to specify extras_require dependencies (#1363). Thanks @orsinium

    Bug Fixes:

    • Restore ability to set compile cache with env var PIP_TOOLS_CACHE_DIR (#1368). Thanks @AndydeCleyre
    Source code(tar.gz)
    Source code(zip)
  • 6.0.1(Mar 15, 2021)

  • 6.0.0(Mar 13, 2021)

    Backwards Incompatible Changes:

    • Remove support for EOL Python 3.5 and 2.7 (#1243). Thanks @jdufresne
    • Remove deprecated --index/--no-index option from pip-compile (#1234). Thanks @jdufresne

    Features:

    • Use pep517 to parse dependencies metadata from setup.py (#1311). Thanks @astrojuanlu

    Bug Fixes:

    • Fix a bug where pip-compile with setup.py would not include dependencies with environment markers (#1311). Thanks @astrojuanlu
    • Prefer === over == when generating requirements.txt if a dependency was pinned with === (#1323). Thanks @IceTDrinker
    • Fix a bug where pip-compile with setup.py in nested folder would generate setup.txt output file (#1324). Thanks @peymanslh
    • Write out default index when it is provided as --extra-index-url (#1325). Thanks @fahrradflucht

    Dependencies:

    • Bump pip minimum version to >= 20.3 (#1340). Thanks @atugushev
    Source code(tar.gz)
    Source code(zip)
  • 5.5.0(Dec 30, 2020)

    Features:

    • Add Python 3.9 support (#1222). Thanks @jdufresne
    • Improve formatting of long "via" annotations (#1237). Thanks @jdufresne
    • Add --verbose and --quiet options to pip-sync (#1241). Thanks @jdufresne
    • Add --no-allow-unsafe option to pip-compile (#1265). Thanks @jdufresne

    Bug Fixes:

    • Restore the PIP_EXISTS_ACTION environment variable to its previous state when resolving dependencies in pip-compile (#1255). Thanks @jdufresne

    Dependencies:

    • Remove the six dependency in favor of pip's vendored six (#1240). Thanks @jdufresne

    Improved Documentation:

    • Add pip-requirements.el (for Emacs) to useful tools to README (#1244). Thanks @jdufresne
    • Add supported Python versions to README (#1246). Thanks @jdufresne
    Source code(tar.gz)
    Source code(zip)
  • 5.4.0(Nov 21, 2020)

    Features:

    • Add pip>=20.3 support (#1216). Thanks @atugushev and @AndydeCleyre
    • Exclude --no-reuse-hashes option from «command to run» header (#1197). Thanks @graingert

    Dependencies:

    • Bump pip minimum version to >= 20.1 (#1191). Thanks @atugushev and @AndydeCleyre
    Source code(tar.gz)
    Source code(zip)
  • 5.3.1(Jul 31, 2020)

  • 5.3.0(Jul 26, 2020)

    Features:

    • Add -h alias for the --help option to pip-sync and pip-compile (#1163). Thanks @jan25
    • Add pip>=20.2 support (#1168). Thanks @atugushev
    • pip-sync now exits with code 1 on --dry-run (#1172). Thanks @francisbrito
    • pip-compile now doesn't resolve constraints from -c constraints.txt that are not (yet) requirements (#1175). Thanks @clslgrnc
    • Add --reuse-hashes/--no-reuse-hashes options to pip-compile (#1177). Thanks @graingert
    Source code(tar.gz)
    Source code(zip)
  • 5.2.1(Jun 9, 2020)

  • 5.2.0(May 27, 2020)

    Features:

    • Show basename of URLs when pip-compile generates hashes in verbose mode (#1113). Thanks @atugushev
    • Add --emit-index-url/--no-emit-index-url options to pip-compile (#1130). Thanks @atugushev

    Bug Fixes:

    • Fix a bug where pip-compile would ignore some package versions when PIP_PREFER_BINARY is set (#1119). Thanks @atugushev
    • Fix leaked URLs with credentials in the debug output of pip-compile (#1146). Thanks @atugushev
    • Fix a bug where URL requirements would have name collisions (#1149). Thanks @geokala

    Deprecations:

    • Deprecate --index/--no-index in favor of --emit-index-url/--no-emit-index-url options in pip-compile (#1130). Thanks @atugushev

    Other Changes:

    • Switch to setuptools declarative syntax through setup.cfg (#1141). Thanks @jdufresne
    Source code(tar.gz)
    Source code(zip)
  • 5.1.2(May 5, 2020)

  • 5.1.1(May 1, 2020)

  • 5.1.0(Apr 27, 2020)

    Features:

    • Show progress bar when downloading packages in pip-compile verbose mode (#949). Thanks @atugushev
    • pip-compile now gets hashes from the PyPI JSON API (if available), which significantly speeds up hash generation (#1109). Thanks @atugushev
    Source code(tar.gz)
    Source code(zip)
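    The hash speedup in #1109 works by reading the digests PyPI already publishes for each release file, instead of downloading and hashing every artifact locally. A sketch of that extraction (the helper name and the inlined payload are illustrative; real responses come from the public `https://pypi.org/pypi/<name>/<version>/json` endpoint, whose `urls` list describes each released file):

    ```python
    def hashes_from_pypi_json(payload):
        """Collect pip-style sha256 hashes from a PyPI JSON API response."""
        return [
            "sha256:" + f["digests"]["sha256"]
            for f in payload.get("urls", [])
            if "sha256" in f.get("digests", {})
        ]

    # Trimmed, illustrative payload; the digests below are placeholders.
    sample = {
        "urls": [
            {"filename": "example-1.0-py3-none-any.whl",
             "digests": {"sha256": "0" * 64}},
            {"filename": "example-1.0.tar.gz",
             "digests": {"sha256": "1" * 64}},
        ]
    }
    hashes = hashes_from_pypi_json(sample)
    ```

    Falling back to download-and-hash is still needed for files the JSON API does not cover, which is why the changelog says "if available".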
  • 5.0.0(Apr 16, 2020)

    Backwards Incompatible Changes:

    • pip-tools now requires pip>=20.0 (previously 8.1.x - 20.0.x). Windows users, make sure to use python -m pip install pip-tools to avoid issues with pip self-update from now on (#1055). Thanks @atugushev
    • The --build-isolation option is now on by default for pip-compile (#1060). Thanks @hramezani

    Features:

    • Exclude requirements with non-matching markers from pip-sync (#927). Thanks @AndydeCleyre
    • Add pre-commit hook for pip-compile (#976). Thanks @atugushev
    • pip-compile and pip-sync now pass anything provided to the new --pip-args option on to pip (#1080). Thanks @AndydeCleyre
    • pip-compile output headers are now more accurate when -- is used to escape filenames (#1080). Thanks @AndydeCleyre
    • Add pip>=20.1 support (#1088). Thanks @atugushev

    Bug Fixes:

    • Fix a bug where editables that are both direct requirements and constraints wouldn't appear in pip-compile output (#1093). Thanks @richafrank
    • pip-compile now sorts format controls (--no-binary/--only-binary) to ensure consistent results (#1098). Thanks @richafrank

    Improved Documentation:

    • Add cross-environment usage documentation to README (#651). Thanks @vphilippon
    • Add versions compatibility table to README (#1106). Thanks @atugushev
    Source code(tar.gz)
    Source code(zip)
  • 4.5.1(Feb 26, 2020)

    Bug Fixes:

    • Strip line number annotations such as "(line XX)" from file requirements, to prevent diff noise when modifying input requirement files (#1075). Thanks @adamchainz

    Improved Documentation:

    • Updated README example outputs for primary requirement annotations (#1072). Thanks @richafrank
    Source code(tar.gz)
    Source code(zip)
  • 4.5.0(Feb 20, 2020)

    Features:

    • Primary requirements and VCS dependencies are now annotated with any source .in files and reverse dependencies (#1058). Thanks @AndydeCleyre

    Bug Fixes:

    • Always use normalized path for cache directory as it is required in newer versions of pip (#1062). Thanks @kammala

    Improved Documentation:

    • Replace outdated link in the README with rationale for pinning (#1053). Thanks @m-aciek
    Source code(tar.gz)
    Source code(zip)
  • 4.4.1(Jan 31, 2020)

    Bug Fixes:

    • Fix a bug where pip-compile would keep outdated options from requirements.txt (#1029). Thanks @atugushev
    • Fix the No handlers could be found for logger "pip.*" error by configuring the builtin logging module (#1035). Thanks @vphilippon
    • Fix a bug where dependencies of relevant constraints may be missing from output file (#1037). Thanks @jeevb
    • Upgrade the minimum click version from 6.0 to 7.0 in setup.py (#1039). Thanks @hramezani
    • Ensure that depcache considers the python implementation such that (for example) cpython3.6 does not poison the results of pypy3.6 (#1050). Thanks @asottile

    Improved Documentation:

    • Make the README more imperative about installing into a project's virtual environment to avoid confusion (#1023). Thanks @tekumara
    • Add a note to the README about how to install requirements on different stages to Workflow for layered requirements section (#1044). Thanks @hramezani
    Source code(tar.gz)
    Source code(zip)
  • 4.4.0(Jan 21, 2020)

    Features:

    • Add --cache-dir option to pip-compile (#1022). Thanks @richafrank
    • Add pip>=20.0 support (#1024). Thanks @atugushev

    Bug Fixes:

    • Fix a bug where pip-compile --upgrade-package would upgrade those passed packages not already required according to the *.in and *.txt files (#1031). Thanks @AndydeCleyre
    Source code(tar.gz)
    Source code(zip)
Owner: Jazzband