Poetry PEP 517 Build Backend & Core Utilities

Overview

Poetry Core


A PEP 517 build backend implementation developed for Poetry. This project is intended to be a lightweight, fully compliant, self-contained package allowing PEP 517-compatible build frontends to build Poetry-managed projects.

Usage

In most cases, the usage of this package is transparent to the end user, as it is used either by Poetry itself or by a PEP 517 frontend (e.g. pip).

To enable the use of poetry-core as your build backend, the following snippet must be present in your project's pyproject.toml file:

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"

Once this is present, a PEP 517 frontend like pip can build and install your project from source without the need for Poetry or any of its dependencies.

# install to current environment
pip install /path/to/poetry/managed/project

# build a wheel package
pip wheel /path/to/poetry/managed/project

Why is this required?

Prior to the release of version 1.1.0, Poetry was built as a project management tool that included a PEP 517 build backend. This was inefficient and time consuming in the majority of cases where a PEP 517 build was required. For example, both pip and tox (with isolated builds) would install Poetry and all the dependencies it required. Most of these dependencies are not required when the objective is simply to build either a source or binary distribution of your project.

To improve this situation, poetry-core was created. Shared functionality pertaining to PEP 517 build backends, including reading the lock file and pyproject.toml and building wheel/sdist distributions, was implemented in this package. This makes PEP 517 builds extremely fast for Poetry-managed packages.
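
For illustration, this is roughly what a PEP 517 frontend does with this backend under the hood. The snippet below is a minimal sketch: in practice you would simply run pip or python -m build rather than calling the hooks yourself, and the project path is a placeholder.

# Minimal sketch of driving the poetry-core build backend via its PEP 517 hooks.
import os

from poetry.core.masonry.api import build_sdist, build_wheel

os.chdir("/path/to/poetry/managed/project")  # the hooks operate on the current directory
os.makedirs("dist", exist_ok=True)           # frontends create the output directory

wheel_name = build_wheel(wheel_directory="dist")  # returns the built wheel's filename
sdist_name = build_sdist(sdist_directory="dist")  # returns the built sdist's filename
print(wheel_name, sdist_name)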

Comments
  • Introduce support for relative include of non-package modules ("workspaces")

    Introduce support for relative include of non-package modules ("workspaces")

    The purpose of the changes here is to enable Workspace support. A workspace is a place for code and projects. Within the workspace, code can be shared. A workspace is usually at the root of your repository.

    To identify a workspace in a Python repo, an empty workspace.toml file is put at the top of the workspace. Future plugins that extend workspaces could use that file to store configuration and settings.
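
    As an illustration only (not part of this PR), a workspace-aware tool could locate that marker file by walking up the directory tree; the helper name below is hypothetical:

    from pathlib import Path
    from typing import Optional

    def find_workspace_root(start: Path) -> Optional[Path]:
        """Return the closest ancestor directory containing workspace.toml."""
        for directory in (start, *start.parents):
            if (directory / "workspace.toml").is_file():
                return directory
        return None

    # e.g. called from projects/my_app/ this would return the repository root
    workspace_root = find_workspace_root(Path.cwd())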

    The feature in this pull request will make this plugin redundant 😄 (I am the author of that plugin)

    Why workspaces? A workspace can contain more than one project. Different projects will likely use the same code. A very simplistic example would be a logger. To avoid code duplication, code could be moved out from the project into packages, and each project can reference the package from the project specific pyproject.toml file.

    This requires that Poetry allows package includes (note the difference from dependencies) that are "outside" of the project path, but within a workspace. That's what this pull request will do.

    An example & simplified tree workspace structure (note the namespacing for shared package includes):

    projects/
      my_app/
        pyproject.toml (including a shared package)
    
      my_service/
        pyproject.toml (including other shared packages)
    
    shared/
      my_namespace/
        my_package/
          __init__.py
          code.py
    
        my_other_package/
          __init__.py
          code.py
    
    workspace.toml (a file that tells the plugin where to find the workspace root)
    

    I think this feature resolves the issues raised in: https://github.com/python-poetry/poetry/issues/936 and probably also https://github.com/python-poetry/poetry/issues/2270

    • [ ] Added tests for changed code.
    • [ ] Updated documentation for changed code.
    opened by DavidVujic 36
  • Added script file feature

    Added script file feature

    Resolves: python-poetry#241

    • [x] Added tests for changed code.
    • [x] Updated documentation for changed code.

    Feature: ability to add script files to distribution packages

    This one is a long overdue, see: https://github.com/sdispater/poetry/issues/241 (June 2018)

    The implementation is based on the official specification by @abn https://github.com/python-poetry/poetry/issues/2310

    This is a very basic feature of setup.py; see the docs.

    Note: this is not the same as entry_points! This setuptools feature basically just copies the given files over to the virtualenv's bin folder so that they'll be available on the $PATH and you can simply invoke them.

    Example

    [tool.poetry]
    package = "my_package"
    
    [tool.poetry.scripts]
    migration = { source = "bin/run_db_migrations.sh", type = "file" }
    setup_script = { source = "bin/run_setup_script.sh" } # type can be omitted!
    

    Then:

    poetry install
    poetry run sh run_db_migrations.sh
    poetry run sh run_setup_script.sh
    

    Testing

    • I added a couple of automated tests
    • I tested this manually with one of our repositories and it worked
    opened by peterdeme 36
  • Fix bug/crash when source dir is outside project dir

    Fix bug/crash when source dir is outside project dir

    Hello poetry team,

    here is a draft fix for issue python-poetry#6521. It does not currently include a test, as I am not sure how to write a test for it. I tried perusing the unit tests in tests/masonry/builders/test_builder.py and got a bit lost :sweat_smile: Any help or guidance on unit tests would be much appreciated.

    Documentation... I'm not sure if that's necessary? The purpose of this change is to make poetry do the right thing, instead of failing unexpectedly... I will defer to your judgement on that.

    replace usage of pathlib.Path.relative_to with os.path.relpath, as per https://github.com/python-poetry/poetry/issues/5621
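    To illustrate the behavioural difference behind that swap (the paths below are made-up examples): pathlib.Path.relative_to raises when the target is not inside the base directory, while os.path.relpath can express the relation with "..":

    import os
    from pathlib import Path

    project_dir = Path("/repo/project")
    source_dir = Path("/repo/src/mypackage")  # source dir outside the project dir

    print(os.path.relpath(source_dir, project_dir))  # ../src/mypackage

    try:
        source_dir.relative_to(project_dir)
    except ValueError as exc:
        print(exc)  # relative_to() fails because source_dir is not under project_dir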

    Resolves: python-poetry#6521

    • [ ] Added tests for changed code.
    • [ ] Updated documentation for changed code.
    opened by alextremblay 16
  • Fixes unstable next breaking version when major is 0

    Fixes unstable next breaking version when major is 0

    Resolves: https://github.com/python-poetry/poetry/issues/6519

    • [x] Added tests for changed code.
    • [ ] Updated documentation for changed code.

    What this PR contains:

    • moves Version.stable to PEP440Version.stable, as Version.stable was breaking the precision half of the time.
    • adds a bunch of tests
    • fixes Version.next_breaking to allow for versions with major == 0 (sketched below)
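
    As a standalone sketch (not the poetry-core API itself), caret-style "next breaking" semantics treat the minor number as the breaking boundary for 0.x versions:

    def next_breaking(major: int, minor: int, patch: int) -> tuple:
        if major == 0:
            if minor == 0:
                return (0, 0, patch + 1)  # ^0.0.3 allows < 0.0.4
            return (0, minor + 1, 0)      # ^0.2.5 allows < 0.3.0
        return (major + 1, 0, 0)          # ^1.4.2 allows < 2.0.0

    assert next_breaking(0, 2, 5) == (0, 3, 0)
    assert next_breaking(1, 4, 2) == (2, 0, 0)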

    Extra info

    It would be great to have a poetry build step that checks the requires_dist output against poetry to make sure the package is actually usable.

    opened by mazinesy 14
  • Allow appending git username / password to dependency

    Allow appending git username / password to dependency

    Resolves: https://github.com/python-poetry/poetry/issues/2062, https://github.com/python-poetry/poetry/issues/2348. Builds on top of #96.

    Changes in #96 (copied from here):

    This PR allows appending deployment keys to usernames as used by gitlab.

    Furthermore a new property is_unsafe is introduced for ParsedUrl, which can be used for cli command like poetry add to easily return whether the git dependency contains a password.

    According to GitLab's docs, a + is allowed in usernames. This is fixed as well.
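
    As a generic illustration only (not the ParsedUrl implementation from this PR), the kind of check an is_unsafe-style property performs is whether the dependency URL embeds a password in plain text; note that a + in the username parses fine:

    from urllib.parse import urlparse

    def has_plaintext_password(url: str) -> bool:
        return urlparse(url).password is not None

    print(has_plaintext_password("https://gitlab+deploy-token-1:s3cret@gitlab.example.com/org/repo.git"))  # True
    print(has_plaintext_password("https://gitlab.example.com/org/repo.git"))                               # False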

    Fixes: python-poetry/poetry#2062

    (This was initially submitted to the poetry repository: python-poetry/poetry#2169)

    Changes I made on top of @finswimmer's work in #96 (rebased to abe9fe5):

    • Add tests for when x-oauth-basic GitHub URL is used in package link (https://github.com/python-poetry/poetry/issues/2348).

    • Add warning and log for when a dependency stores a password in plain text by using the is_unsafe property (based on discussion in https://github.com/python-poetry/poetry/pull/2169#issuecomment-705587436).

    • [x] Added tests for changed code.

    • [x] ~Updated documentation for changed code.~ NA

    Hoping this is at a good enough point to review and consider merging into main, to be included for a soon-ish release.

    opened by setu4993 14
  • Feature: Declare ext_modules and libraries in pyproject.toml

    Feature: Declare ext_modules and libraries in pyproject.toml

    PR summary

    A declarative approach to extensions using pyproject.toml. Eliminates the need for build.py in many cases.

    How to use

    Extension modules can now be declared in pyproject.toml:

    [tool.poetry.ext_modules.mymodule]
    sources = ["mymodule/mymodule.c"]

    The new logic will validate this config and use it to construct a distutils.extension.Extension object to be passed to distutils.setup()

    Config properties

    The new logic adds support for all options normally passed to distutils.extension.Extension(), and also for the libraries argument to distutils.setup(); a rough mapping sketch follows the two lists below. The full list of supported config properties is:

    Extension modules [tool.poetry.ext_modules]

    • sources (required) - A list of source filenames or globs
    • include_dirs - A list of directories to search for C/C++ header files
    • define_macros - A list of macros to define; each macro is defined using a 2-tuple (name, value)
    • undef_macros - A list of macros to undefine explicitly
    • library_dirs - A list of directories to search for C/C++ libraries at link time
    • libraries - A list of library names (not filenames or paths) to link against
    • runtime_library_dirs - A list of directories to search for C/C++ libraries at run time
    • extra_objects - A list of paths or globs of extra files to link with
    • extra_compile_args - Any extra platform- and compiler-specific information to use when compiling the source files
    • extra_link_args - Any extra platform- and compiler-specific information to use when linking object files together
    • export_symbols - A list of symbols to be exported from a shared extension
    • depends - A list of paths or globs of files that the extension depends on
    • language - The extension language (e.g. 'c', 'c++', 'objc')
    • optional - Boolean, specifies that a build failure in the extension should not abort the build process

    C libraries [tool.poetry.libraries]

    • sources (required) - A list of source filenames or globs
    • include_dirs - A list of directories to search for C/C++ header files
    • macros - A list of macros to define; each macro is defined using a 2-tuple (name, value)
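
    As a rough sketch (not this PR's actual implementation), a parsed [tool.poetry.ext_modules.<name>] table could be turned into the distutils Extension objects that are ultimately handed to setup():

    from distutils.extension import Extension

    config = {  # as if parsed from the pyproject.toml example above
        "mymodule": {
            "sources": ["mymodule/mymodule.c"],
            "include_dirs": ["mymodule/include"],
            "define_macros": [("WITH_FEATURE_X", "1")],
        }
    }

    ext_modules = [
        Extension(
            name=name,
            sources=options["sources"],
            include_dirs=options.get("include_dirs", []),
            define_macros=options.get("define_macros", []),
        )
        for name, options in config.items()
    ]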

    Why?

    I wanted a tool that limits everything to a single configuration file for dependency management, packaging, and publishing.

    By eliminating the need for build.py in many cases, this PR better accomplishes the stated intent of Poetry (reducing project complexity).

    What if I still want to use build.py?

    This doesn't stop you; it just gives you another option. If your build logic is too complex (e.g. moving files around), the declarative approach may not be sufficient.

    To promote clarity and to reduce the complexity of build logic, this PR doesn't allow you to mix both approaches.

    opened by bostonrwalker 12
  • Type checking

    Type checking

    Resolves: python-poetry#

    • [ ] Added tests for changed code.
    • [ ] Updated documentation for changed code.

    More typechecking.

    Things that I am not entirely comfortable with:

    • returning empty strings from VCSDependency.reference() and VCSDependency.pretty_constraint()
    • https://github.com/python-poetry/poetry-core/blob/26e978d3b72ff58039ac921524b5ed47cb8b7353/src/poetry/core/packages/constraints/any_constraint.py#L25 looks like nonsense; surely what we want to return here is some sort of negation of other?
    • I've thrown in a few exceptions for places in constraints where cases simply weren't handled. The code would have failed anyway; I've just made it explicit. But still, WIBNI these cases were handled.
    opened by dimbleby 12
  • Added trusted repository option

    Added trusted repository option

    opened by maayanbar13 12
  • Explicitly print gitignored

    Explicitly print gitignored

    Resolves: python-poetry/poetry#6443

    • [ ] Added tests for changed code.
    • [ ] Updated documentation for changed code.

    Didn't add tests as it's just a matter of calling the logger. I also fixed a typo, changing explicitely_<excluded | included> to explicitly_<excluded | included>.

    PR for https://github.com/python-poetry/poetry/issues/6443

    opened by evanrittenhouse 11
  • set up mypy hook for incremental adoption

    set up mypy hook for incremental adoption

    As per the comment here: https://github.com/python-poetry/poetry/pull/4510

    For some reason I had to use a different config to successfully exclude the _vendor directory.

    opened by danieleades 11
  • Allow non-existent path dependencies

    Allow non-existent path dependencies

    Currently, poetry lock --no-update fails when you remove any path dependency: it first loads all dependencies from the lock file (which still references the dependency you just deleted), and it fails at that point because DirectoryDependency immediately throws an exception when instantiated with a non-existent path.

    This change allows the process to continue, which is the same behavior VCS dependencies have.

    If the pyproject.toml has a path dependency that doesn't exist, we will want to error for poetry install and poetry lock (poetry build with path dependencies is already kinda a no-go). poetry install already fails gracefully (if you lock the project, then delete the path dependency and try to install, i.e. when no locking happens before we start installing). I opened https://github.com/python-poetry/poetry/pull/6844 to make poetry lock fail gracefully.
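
    As a hypothetical illustration only (not the actual DirectoryDependency code), the behavioural change amounts to deferring the existence check from construction time to the point where the files are actually needed:

    from pathlib import Path

    class PathDependencySketch:
        def __init__(self, name: str, path: Path) -> None:
            self.name = name
            self.path = path  # no existence check here, mirroring VCS dependencies

        def validate(self) -> None:
            # called only by operations that really need the files, e.g. install/build
            if not self.path.exists():
                raise ValueError(f"Path {self.path} for {self.name} does not exist")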

    opened by adriangb 10
  • performance fix for simplified marker simplifications

    performance fix for simplified marker simplifications

    While working on full support for overlapping markers, I encountered a combinatorial explosion when building a union of markers (see test case).

    The issue is that we introduced cnf without introducing a counterpart of union_simplify, which leads to unnecessarily big markers. Therefore, I added intersect_simplify, analogous to union_simplify.

    Further, the simplifications in #530 went a bit too far in removing the common_markers/unique_markers simplification in union_simplify. I reverted this removal and added analogous functionality to intersect_simplify.

    Comparing the number of items in itertools.product in dnf of union (after cnf) shows the benefit of the simplifications:

    | union (see test case) | number and length of markers in conjunction after cnf | number of items in itertools.product in dnf |
    |---|---|---|
    | without PR | 31 (1-4) | > 2**44 |
    | with simple intersect_simplify | 9 (1-2) | 256 |
    | with revival of common_markers/unique_markers simplification | 3 (1-2) | 4 |
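
    A self-contained illustration of the combinatorial explosion (numbers chosen to match the "without PR" row above): converting a conjunction of disjunctions to dnf with itertools.product yields one term per combination, so the term count is the product of the clause lengths.

    import itertools
    import math

    # 31 conjunction members, each a disjunction of 4 markers
    cnf = [[f"m{i}_{j}" for j in range(4)] for i in range(31)]

    print(math.prod(len(clause) for clause in cnf))  # 4**31 terms, far more than 2**44

    # even a small prefix already produces 4**5 = 1024 dnf terms
    print(sum(1 for _ in itertools.product(*cnf[:5])))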

    opened by radoering 1
  • markers: fix `get_python_constraint_by_marker()` for multi markers and marker unions without `python_version`

    markers: fix `get_python_constraint_by_marker()` for multi markers and marker unions without `python_version`

    Essentially the same as #307, which only covered single markers.

    The actual fix is in only() of MultiMarker and MarkerUnion. The test shows that get_python_constraint_by_marker() is fixed.

    There were some test cases testing the faulty behaviour of only(). I not only changed the expectation (which would have been "" in most cases) but also the input in some cases, to cover some more interesting cases.

    • [x] Added tests for changed code.
    • [ ] Updated documentation for changed code.
    opened by radoering 1
  • builders/wheel: Ensure dist-info is written deterministically

    builders/wheel: Ensure dist-info is written deterministically

    glob() returns values in "on disk" order. To make the RECORD file deterministic and consistent between builds, we need to sort the data before adding it to the records list.
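
    A sketch of the general idea (not the exact patch), assuming an example dist-info directory name: sorting the glob results gives a stable, reproducible ordering for the RECORD entries.

    from pathlib import Path

    dist_info = Path("my_package-1.0.0.dist-info")  # example directory name

    records = []
    for file in sorted(dist_info.glob("**/*")):  # sorted() removes the on-disk order dependency
        if file.is_file():
            records.append(str(file))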

    Signed-off-by: Richard Purdie [email protected]

    Resolves: python-poetry#

    • [ ] Added tests for changed code.
    • [ ] Updated documentation for changed code.
    opened by rpurdie 3
  • PEP 440 compliance: do not implicitly allow pre-releases

    PEP 440 compliance: do not implicitly allow pre-releases

    Extracted from #402. Besides being a precondition of #402, it's a fix on its own, and it requires some coordination with downstream.

    Requires: python-poetry/poetry#7225 and python-poetry/poetry#7236

    PEP 440 says:

    Pre-releases of any kind, including developmental releases, are implicitly excluded from all version specifiers, unless they are already present on the system, explicitly requested by the user, or if the only available version that satisfies the version specifier is a pre-release.

    Nothing special about "> pre-release" or ">= pre-release".
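
    The same PEP 440 rule as implemented by the separate packaging library (shown for illustration; poetry-core has its own constraint classes):

    from packaging.specifiers import SpecifierSet

    spec = SpecifierSet(">=1.0")

    print(spec.contains("2.0.0a1"))                    # False: pre-releases are excluded by default
    print(spec.contains("2.0.0a1", prereleases=True))  # True: only when explicitly requested
    print(spec.contains("2.0.0"))                      # True: final releases are unaffected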

    opened by radoering 1
  • validate extras against dependencies and in schema

    validate extras against dependencies and in schema

    Resolves: python-poetry/poetry/issues/7226

    • [x] Added tests for changed code.
    • [ ] Updated documentation for changed code.

    With this change, extras are validated to contain only valid characters, and extras that reference dependencies not found in the main dependency group raise a warning in poetry check (but not in poetry lock/poetry install).
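
    A rough sketch of the two checks (the regex merely approximates the PEP 508 extra-name rule, and the data structures are hypothetical; the project's actual pattern may differ):

    import re

    EXTRA_NAME = re.compile(r"^[A-Za-z0-9]([A-Za-z0-9._-]*[A-Za-z0-9])?$")

    main_dependencies = {"requests", "rich"}            # hypothetical main group
    extras = {"cli": ["rich"], "bad name!": ["click"]}  # hypothetical extras

    for name, deps in extras.items():
        if not EXTRA_NAME.match(name):
            print(f"invalid extra name: {name!r}")
        for dep in deps:
            if dep not in main_dependencies:
                print(f"extra {name!r} references unknown dependency {dep!r}")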

    opened by Panaetius 1
Releases: 1.4.0
Owner: Poetry (Python packaging and dependency management made easy)