AnyIO is an asynchronous networking and concurrency library that works on top of either asyncio or trio.

Overview

It implements trio-like structured concurrency (SC) on top of asyncio, and works in harmony with the native SC of trio itself.

Applications and libraries written against AnyIO's API will run unmodified on either asyncio or trio. AnyIO can also be adopted into a library or application incrementally – bit by bit, no full refactoring necessary. It will blend in with native libraries of your chosen backend.

Documentation

View full documentation at: https://anyio.readthedocs.io/

Features

AnyIO offers the following functionality:

  • Task groups (nurseries in trio terminology)
  • High level networking (TCP, UDP and UNIX sockets)
    • Happy eyeballs algorithm for TCP connections (more robust than that of asyncio on Python 3.8)
    • async/await style UDP sockets (unlike asyncio where you still have to use Transports and Protocols)
  • A versatile API for byte streams and object streams
  • Inter-task synchronization and communication (locks, conditions, events, semaphores, object streams)
  • Worker threads
  • Subprocesses
  • Asynchronous file I/O (using worker threads)
  • Signal handling
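For instance, the object streams listed above can carry messages between tasks; a minimal sketch (the buffer size and message are arbitrary choices, AnyIO 3+ API assumed):

```python
import anyio

async def main():
    # A buffered memory object stream: send() won't block while the buffer has room.
    send, receive = anyio.create_memory_object_stream(max_buffer_size=1)
    async with send, receive:
        await send.send("ping")
        return await receive.receive()

print(anyio.run(main))
```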

AnyIO also comes with its own pytest plugin, which supports asynchronous fixtures. It even works with the popular Hypothesis library.

Comments
  • synchronous Event set

    I'd like to use anyio for compatibility across asyncio and Trio, but an async Event.set() (due to Curio) is a non-starter. Event is a widely used primitive. I'm guessing that an async set() would double the number of functions needing to be async in my program, and hence greatly increase the race conditions I'd need to ponder. It's advised in "The Function Colour Myth" to keep the percentage of async functions to a minimum.

    It's not like I'm explicitly opting to not support Curio either, because my intended use case depends on Quart, which itself only supports asyncio and Trio.

    It seems like there could just be a synchronous_set() method in the ABC, and Curio could raise NotImplementedError. Isn't there already precedent with FileStreamWrapper.send_eof()?
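For context on why a synchronous set() matters: asyncio's native Event.set() is a plain function, so ordinary (non-async) code such as a loop callback can invoke it directly. A minimal asyncio illustration (not AnyIO API):

```python
import asyncio

async def main():
    event = asyncio.Event()
    loop = asyncio.get_running_loop()
    # event.set is a plain function, so a synchronous callback can call it.
    loop.call_soon(event.set)
    await event.wait()
    return event.is_set()

print(asyncio.run(main()))  # True
```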

    duplicate 
    opened by belm0 64
  • Issue with nested task groups and cancellation

    As far as I can tell, this issue is different from #34.

    When an outer task group cancels a task containing an inner task group, the inner task group does not re-raise the CancelledError:

    import trio
    import anyio
    import pytest
    
    
    @pytest.mark.anyio
    async def test_anyio():
    
        async def g():
            async with anyio.create_task_group() as group:
                await anyio.sleep(1)
            assert False
    
        async def f():
            async with anyio.create_task_group() as group:
                await group.spawn(g)
                await anyio.sleep(0)
                await group.cancel_scope.cancel()
    
        await f()
    
    
    @pytest.mark.trio
    async def test_trio():
    
        async def g():
            async with trio.open_nursery() as nursery:
                await trio.sleep(1)
            assert False
    
        async def f():
            async with trio.open_nursery() as nursery:
                nursery.start_soon(g)
                await trio.sleep(0)
                nursery.cancel_scope.cancel()
    
        await f()
    
    

    Reports:

    test_anyio.py::test_anyio[asyncio] FAILED
    test_anyio.py::test_anyio[curio] FAILED
    test_anyio.py::test_anyio[trio] FAILED                  
    test_anyio.py::test_trio PASSED        
    

    As a side note, test_anyio[trio] does not fail for the same reason as the other test_anyio: it raises an ExceptionGroup containing two trio.Cancelled exceptions.

    bug 
    opened by vxgmichel 35
  • anyio/3.3.0: failures running tests

    Hello, I'm packaging the latest version of anyio for Debian, and we are seeing some errors when running the tests.

    first we get this:

    ======================================= test session starts ========================================
    platform linux -- Python 3.9.7, pytest-6.2.5, py-1.10.0, pluggy-0.13.0
    rootdir: /build/python-anyio-3.3.0, configfile: pyproject.toml
    plugins: mock-3.6.1
    collected 6 items / 19 errors                                                                      
    
    ============================================== ERRORS ==============================================
    ________________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_compat.py _________________
    'anyio' not found in `markers` configuration option
    _______________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_debugging.py _______________
    'anyio' not found in `markers` configuration option
    _______________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_eventloop.py _______________
    'anyio' not found in `markers` configuration option
    ________________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_fileio.py _________________
    'anyio' not found in `markers` configuration option
    ______________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_from_thread.py ______________
    'anyio' not found in `markers` configuration option
    _______________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_lowlevel.py ________________
    'anyio' not found in `markers` configuration option
    ________________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_signals.py ________________
    'anyio' not found in `markers` configuration option
    ________________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_sockets.py ________________
    'anyio' not found in `markers` configuration option
    _____________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_subprocesses.py ______________
    'anyio' not found in `markers` configuration option
    ____________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_synchronization.py ____________
    'anyio' not found in `markers` configuration option
    ______________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_taskgroups.py _______________
    'anyio' not found in `markers` configuration option
    ______________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_to_process.py _______________
    'anyio' not found in `markers` configuration option
    _______________ ERROR collecting .pybuild/cpython3_3.9/build/tests/test_to_thread.py _______________
    'anyio' not found in `markers` configuration option
    ___________ ERROR collecting .pybuild/cpython3_3.9/build/tests/streams/test_buffered.py ____________
    'anyio' not found in `markers` configuration option
    _____________ ERROR collecting .pybuild/cpython3_3.9/build/tests/streams/test_file.py ______________
    'anyio' not found in `markers` configuration option
    ____________ ERROR collecting .pybuild/cpython3_3.9/build/tests/streams/test_memory.py _____________
    'anyio' not found in `markers` configuration option
    ____________ ERROR collecting .pybuild/cpython3_3.9/build/tests/streams/test_stapled.py ____________
    'anyio' not found in `markers` configuration option
    _____________ ERROR collecting .pybuild/cpython3_3.9/build/tests/streams/test_text.py ______________
    'anyio' not found in `markers` configuration option
    ______________ ERROR collecting .pybuild/cpython3_3.9/build/tests/streams/test_tls.py ______________
    'anyio' not found in `markers` configuration option
    !!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 19 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    ======================================== 19 errors in 0.14s ========================================
    

    but adding a pytest.ini file with:

    [pytest]
    markers =
      anyio
      network
    

    it goes a bit better but still fails with:

    ======================================= test session starts ========================================
    platform linux -- Python 3.9.7, pytest-6.2.5, py-1.10.0, pluggy-0.13.0
    rootdir: /build/python-anyio-3.3.0/.pybuild/cpython3_3.9/build, configfile: pytest.ini
    plugins: mock-3.6.1
    collected 377 items / 2 errors / 375 selected                                                      
    
    ============================================== ERRORS ==============================================
    ____________________________ ERROR collecting tests/test_taskgroups.py _____________________________
    In test_start_native_host_cancelled: function uses no argument 'anyio_backend'
    _____________________________ ERROR collecting tests/test_to_thread.py _____________________________
    In test_asyncio_cancel_native_task: function uses no argument 'anyio_backend'
    ===================================== short test summary info ======================================
    ERROR tests/test_taskgroups.py
    ERROR tests/test_to_thread.py
    !!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 2 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    ======================================== 2 errors in 0.31s =========================================
    

    which is indeed correct, since those two functions don't use the parametrized argument.

    Any idea how to address the last issue? Should you also add a pytest.ini to the repo? (I noticed there's a reference in pyproject.toml, but Debian doesn't use poetry yet, and even when copying those options, tests fail with a traceback in pytest.)

    thanks!

    opened by sandrotosi 30
  • Socket listeners do not get close notifications

    If one task is waiting for a socket to become ready for reading or writing and another task closes it, the former task is left waiting forever. Trio has a native mechanism to handle this gracefully, and for the other two we could whip up a solution fairly easily.

    bug 
    opened by agronholm 30
  • A way to collect results upon exiting task group

    In asyncio, there's await asyncio.gather(..a..lot..of..tasks).

    Is it possible to somehow emulate this with anyio?

    I'm thinking of something like this maybe:

    async with create_task_group() as tg:
        await tg.spawn(t1)
        await tg.spawn(t2)
    
    print(tg.results)  # t1_res, t2_res
    
    wontfix 
    opened by webknjaz 26
  • Possible race condition in get_asynclib when threading

    I'm not quite sure how to reproduce this, and it's also possible that I'm doing something wrong on my end. I figured I'd open this so that anyone else experiencing something similar has a fix to try, so feel free to close if there's not enough info or this shouldn't be solved in AnyIO.

    Here's the traceback I was seeing when trying to use anyio from a thread:

    Traceback (most recent call last):
      File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/threading.py", line 926, in _bootstrap_inner
        self.run()
      File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/threading.py", line 870, in run
        self._target(*self._args, **self._kwargs)
    ...
    ...
    ...
      File "/opt/hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/anyio/_core/_tasks.py", line 74, in create_task_group
        return get_asynclib().TaskGroup()
    AttributeError: module 'anyio._backends._asyncio' has no attribute 'TaskGroup'
    

    This seemed very weird until I got another traceback claiming that anyio._backends._asyncio was partially initialized, which clued me in to the fact that this was a race condition.

    The solution was just to import anyio._backends._asyncio at the top of my file before starting the thread that imports anyio.
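The workaround generalizes: make sure the module is fully imported before any thread touches it, or serialize the first import yourself. A stdlib-only sketch of the latter (`safe_get_module` and the use of `json` are illustrative, not AnyIO internals):

```python
import importlib
import threading

_import_lock = threading.Lock()

def safe_get_module(name):
    # Hypothetical helper: serialize the first import so that no thread can
    # observe a partially initialized module.
    with _import_lock:
        return importlib.import_module(name)

results = []

def worker():
    results.append(safe_get_module("json"))

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(all(m.__name__ == "json" for m in results))  # True
```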

    opened by rmorshea 24
  • Add support for wrappers of callback-based APIs

    First off, thanks for writing and maintaining anyio. It's a useful and well-written library.

    I'm in the process of writing an anyio wrapper around paho-mqtt. Here is an example from the paho-mqtt readme:

    # The callback for when a PUBLISH message is received from the server.
    def on_message(client, userdata, msg):
        print(msg.topic+" "+str(msg.payload))
    
    client = mqtt.Client()
    client.on_message = on_message
    

    To bridge the gap between the old-school callbacks and the async domain, I use create_memory_object_stream:

    send_stream, receive_stream = create_memory_object_stream()
    
    def _send_to_stream(client, userdata, msg):
        try:
            send_stream.send_nowait(msg)  # This is an async function so I can't call it in this context
        except WouldBlock:
            ...
    
    client.on_message = _send_to_stream
    

    This approach doesn't work, however, since send_nowait is an async function. From a cursory look, I get the impression that curio support is the culprit. Because Event.set is async in curio, send_nowait is also async.

    Is there some way around this? Can I somehow specify that I don't need curio support and get the non-async version of Event.set and, in turn, the non-async version of send_nowait?

    If there currently isn't such functionality, would you be interested in a pull request that adds this?

    Many other old-school APIs are callback-based. RxPy is another example. I'd also like to write anyio helpers for RxPy to bridge the gap to the async domain. Currently, my efforts are blocked by the above-mentioned restriction.
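For comparison, the same bridge works with plain asyncio because Queue.put_nowait is synchronous, which is exactly the property being requested; `make_callback` is a hypothetical helper (in later AnyIO versions, after curio support was dropped, send_nowait became synchronous too):

```python
import asyncio

def make_callback(loop, queue):
    # Hypothetical bridge: a plain (non-async) callback handing data to async code.
    def on_message(msg):
        # put_nowait is synchronous; call_soon_threadsafe makes this safe even
        # when the library fires the callback from another thread.
        loop.call_soon_threadsafe(queue.put_nowait, msg)
    return on_message

async def main():
    queue = asyncio.Queue()
    on_message = make_callback(asyncio.get_running_loop(), queue)
    on_message("hello from a callback")  # simulate the library invoking it
    return await queue.get()

print(asyncio.run(main()))
```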

    This issue relates to #99.

    enhancement 
    opened by frederikaalund 24
  • 3.3.0: pytest is failing

    Just normal build, install and test cycle used on building package from non-root account:

    • "setup.py build"
    • "setup.py install --root </install/prefix>"
    • "pytest" with PYTHONPATH pointing to sitearch and sitelib inside </install/prefix>
    + PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.0-2.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.0-2.fc35.x86_64/usr/lib/python3.8/site-packages
    + /usr/bin/pytest -ra
    =========================================================================== test session starts ============================================================================
    platform linux -- Python 3.8.11, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
    benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
    rootdir: /home/tkloczko/rpmbuild/BUILD/anyio-3.3.0, configfile: pyproject.toml, testpaths: tests
    plugins: anyio-3.3.0, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, asyncio-0.15.1, toolbox-0.5, aiohttp-0.3.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, pyfakefs-4.5.0, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, hypothesis-6.14.4, Faker-8.10.3, xprocess-0.18.1, black-0.3.12, checkdocs-2.7.1
    collected 1242 items
    
    tests/test_compat.py .......................................................................................                                                         [  7%]
    tests/test_debugging.py FF.....................                                                                                                                      [  8%]
    tests/test_eventloop.py .........                                                                                                                                    [  9%]
    tests/test_fileio.py .........................s...........s...........................................................sss........................................... [ 21%]
    ....................                                                                                                                                                 [ 22%]
    tests/test_from_thread.py ............................................................................                                                               [ 28%]
    tests/test_lowlevel.py ...........................                                                                                                                   [ 30%]
    tests/test_pytest_plugin.py FFFF..                                                                                                                                   [ 31%]
    tests/test_signals.py .........                                                                                                                                      [ 32%]
    tests/test_sockets.py .............................................................................................................................................. [ 43%]
    .................................................................................................................................................................... [ 56%]
    .....................                                                                                                                                                [ 58%]
    tests/test_subprocesses.py ..................                                                                                                                        [ 59%]
    tests/test_synchronization.py ...................................................................................................                                    [ 67%]
    tests/test_taskgroups.py ........................................................................................................................................... [ 79%]
    .....................................s                                                                                                                               [ 82%]
    tests/test_to_process.py .....................                                                                                                                       [ 83%]
    tests/test_to_thread.py ........................                                                                                                                     [ 85%]
    tests/streams/test_buffered.py ............                                                                                                                          [ 86%]
    tests/streams/test_file.py ..............................                                                                                                            [ 89%]
    tests/streams/test_memory.py .................................................................                                                                       [ 94%]
    tests/streams/test_stapled.py ..................                                                                                                                     [ 95%]
    tests/streams/test_text.py ...............                                                                                                                           [ 97%]
    tests/streams/test_tls.py ....................................                                                                                                       [100%]
    
    ================================================================================= FAILURES =================================================================================
    _______________________________________________________________________ test_main_task_name[asyncio] _______________________________________________________________________
    tests/test_debugging.py:37: in test_main_task_name
        for loop in [obj for obj in gc.get_objects()
    tests/test_debugging.py:38: in <listcomp>
        if isinstance(obj, asyncio.AbstractEventLoop)]:
    /usr/lib/python3.8/site-packages/itsdangerous/_json.py:24: in __getattribute__
        warnings.warn(
    E   DeprecationWarning: Importing 'itsdangerous.json' is deprecated and will be removed in ItsDangerous 2.1. Use Python's 'json' module instead.
    ___________________________________________________________________ test_main_task_name[asyncio+uvloop] ____________________________________________________________________
    tests/test_debugging.py:37: in test_main_task_name
        for loop in [obj for obj in gc.get_objects()
    tests/test_debugging.py:38: in <listcomp>
        if isinstance(obj, asyncio.AbstractEventLoop)]:
    /usr/lib/python3.8/site-packages/itsdangerous/_json.py:24: in __getattribute__
        warnings.warn(
    E   DeprecationWarning: Importing 'itsdangerous.json' is deprecated and will be removed in ItsDangerous 2.1. Use Python's 'json' module instead.
    _______________________________________________________________________________ test_plugin ________________________________________________________________________________
    /home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/tests/test_pytest_plugin.py:65: in test_plugin
        result.assert_outcomes(passed=3 * len(get_all_backends()), skipped=len(get_all_backends()))
    E   AssertionError: assert {'errors': 6,...pped': 0, ...} == {'errors': 0,...pped': 2, ...}
    E     Omitting 3 identical items, use -vv to show
    E     Differing items:
    E     {'skipped': 0} != {'skipped': 2}
    E     {'errors': 6} != {'errors': 0}
    E     {'passed': 2} != {'passed': 6}
    E     Use -v to get the full diff
    --------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
    ============================= test session starts ==============================
    platform linux -- Python 3.8.11, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
    cachedir: .pytest_cache
    benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
    hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/.hypothesis/examples')
    rootdir: /tmp/pytest-of-tkloczko/pytest-17/test_plugin0
    plugins: anyio-3.3.0, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, toolbox-0.5, aiohttp-0.3.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, pyfakefs-4.5.0, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, hypothesis-6.14.4, Faker-8.10.3, xprocess-0.18.1, black-0.3.12, checkdocs-2.7.1
    collecting ... collected 8 items
    
    test_plugin.py::test_marked_test[asyncio] PASSED                         [ 12%]
    test_plugin.py::test_marked_test[trio] PASSED                            [ 25%]
    test_plugin.py::test_async_fixture_from_marked_test[asyncio] ERROR       [ 37%]
    test_plugin.py::test_async_fixture_from_marked_test[trio] ERROR          [ 50%]
    test_plugin.py::test_async_fixture_from_sync_test[asyncio] ERROR         [ 62%]
    test_plugin.py::test_async_fixture_from_sync_test[trio] ERROR            [ 75%]
    test_plugin.py::test_skip_inline[asyncio] ERROR                          [ 87%]
    test_plugin.py::test_skip_inline[trio] ERROR                             [100%]
    
    ==================================== ERRORS ====================================
    ________ ERROR at setup of test_async_fixture_from_marked_test[asyncio] ________
    
    args = (), kwargs = {}
    request = <SubRequest 'async_fixture' for <Function test_async_fixture_from_marked_test[asyncio]>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    _________ ERROR at setup of test_async_fixture_from_marked_test[trio] __________
    
    args = (), kwargs = {}
    request = <SubRequest 'async_fixture' for <Function test_async_fixture_from_marked_test[trio]>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    _________ ERROR at setup of test_async_fixture_from_sync_test[asyncio] _________
    
    args = (), kwargs = {}
    request = <SubRequest 'async_fixture' for <Function test_async_fixture_from_sync_test[asyncio]>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    __________ ERROR at setup of test_async_fixture_from_sync_test[trio] ___________
    
    args = (), kwargs = {}
    request = <SubRequest 'async_fixture' for <Function test_async_fixture_from_sync_test[trio]>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    _________________ ERROR at setup of test_skip_inline[asyncio] __________________
    
    args = (), kwargs = {}
    request = <SubRequest 'some_feature' for <Function test_skip_inline[asyncio]>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    ___________________ ERROR at setup of test_skip_inline[trio] ___________________
    
    args = (), kwargs = {}
    request = <SubRequest 'some_feature' for <Function test_skip_inline[trio]>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    =========================== short test summary info ============================
    ERROR test_plugin.py::test_async_fixture_from_marked_test[asyncio] - Exceptio...
    ERROR test_plugin.py::test_async_fixture_from_marked_test[trio] - Exception: ...
    ERROR test_plugin.py::test_async_fixture_from_sync_test[asyncio] - Exception:...
    ERROR test_plugin.py::test_async_fixture_from_sync_test[trio] - Exception: As...
    ERROR test_plugin.py::test_skip_inline[asyncio] - Exception: Asynchronous fix...
    ERROR test_plugin.py::test_skip_inline[trio] - Exception: Asynchronous fixtur...
    ========================= 2 passed, 6 errors in 0.14s ==========================
    pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
    _______________________________________________________________________________ test_asyncio _______________________________________________________________________________
    /home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/tests/test_pytest_plugin.py:138: in test_asyncio
        result.assert_outcomes(passed=2, failed=1, errors=2)
    E   AssertionError: assert {'errors': 3,...pped': 0, ...} == {'errors': 2,...pped': 0, ...}
    E     Omitting 4 identical items, use -vv to show
    E     Differing items:
    E     {'errors': 3} != {'errors': 2}
    E     {'passed': 0} != {'passed': 2}
    E     Use -v to get the full diff
    --------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
    ============================= test session starts ==============================
    platform linux -- Python 3.8.11, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
    cachedir: .pytest_cache
    benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
    hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/.hypothesis/examples')
    rootdir: /tmp/pytest-of-tkloczko/pytest-17/test_asyncio0
    plugins: anyio-3.3.0, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, toolbox-0.5, aiohttp-0.3.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, pyfakefs-4.5.0, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, hypothesis-6.14.4, Faker-8.10.3, xprocess-0.18.1, black-0.3.12, checkdocs-2.7.1
    collecting ... collected 4 items
    
    test_asyncio.py::TestClassFixtures::test_class_fixture_in_test_method ERROR [ 25%]
    test_asyncio.py::test_callback_exception_during_test FAILED              [ 50%]
    test_asyncio.py::test_callback_exception_during_setup ERROR              [ 75%]
    test_asyncio.py::test_callback_exception_during_teardown ERROR           [100%]
    
    ==================================== ERRORS ====================================
    ____ ERROR at setup of TestClassFixtures.test_class_fixture_in_test_method _____
    
    args = (), kwargs = {'anyio_backend': 'asyncio'}
    request = <SubRequest 'async_class_fixture' for <Function test_class_fixture_in_test_method>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    ____________ ERROR at setup of test_callback_exception_during_setup ____________
    
    args = (), kwargs = {}
    request = <SubRequest 'setup_fail_fixture' for <Function test_callback_exception_during_setup>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    __________ ERROR at setup of test_callback_exception_during_teardown ___________
    
    args = (), kwargs = {}
    request = <SubRequest 'teardown_fail_fixture' for <Function test_callback_exception_during_teardown>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    =================================== FAILURES ===================================
    _____________________ test_callback_exception_during_test ______________________
    
        def callback():
            nonlocal started
            started = True
    >       raise Exception('foo')
    E       Exception: foo
    
    test_asyncio.py:22: Exception
    =========================== short test summary info ============================
    FAILED test_asyncio.py::test_callback_exception_during_test - Exception: foo
    ERROR test_asyncio.py::TestClassFixtures::test_class_fixture_in_test_method
    ERROR test_asyncio.py::test_callback_exception_during_setup - Exception: Asyn...
    ERROR test_asyncio.py::test_callback_exception_during_teardown - Exception: A...
    ========================= 1 failed, 3 errors in 0.12s ==========================
    pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
    ________________________________________________________________________ test_autouse_async_fixture ________________________________________________________________________
    /home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/tests/test_pytest_plugin.py:175: in test_autouse_async_fixture
        result.assert_outcomes(passed=len(get_all_backends()))
    E   AssertionError: assert {'errors': 2,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
    E     Omitting 4 identical items, use -vv to show
    E     Differing items:
    E     {'errors': 2} != {'errors': 0}
    E     {'passed': 0} != {'passed': 2}
    E     Use -v to get the full diff
    --------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
    ============================= test session starts ==============================
    platform linux -- Python 3.8.11, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
    cachedir: .pytest_cache
    benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
    hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/.hypothesis/examples')
    rootdir: /tmp/pytest-of-tkloczko/pytest-17/test_autouse_async_fixture0
    plugins: anyio-3.3.0, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, toolbox-0.5, aiohttp-0.3.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, pyfakefs-4.5.0, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, hypothesis-6.14.4, Faker-8.10.3, xprocess-0.18.1, black-0.3.12, checkdocs-2.7.1
    collecting ... collected 2 items
    
    test_autouse_async_fixture.py::test_autouse_backend[asyncio] ERROR       [ 50%]
    test_autouse_async_fixture.py::test_autouse_backend[trio] ERROR          [100%]
    
    ==================================== ERRORS ====================================
    _______________ ERROR at setup of test_autouse_backend[asyncio] ________________
    
    args = (), kwargs = {'anyio_backend_name': 'asyncio'}
    request = <SubRequest 'autouse_async_fixture' for <Function test_autouse_backend[asyncio]>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    _________________ ERROR at setup of test_autouse_backend[trio] _________________
    
    args = (), kwargs = {'anyio_backend_name': 'trio'}
    request = <SubRequest 'autouse_async_fixture' for <Function test_autouse_backend[trio]>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    =========================== short test summary info ============================
    ERROR test_autouse_async_fixture.py::test_autouse_backend[asyncio] - Exceptio...
    ERROR test_autouse_async_fixture.py::test_autouse_backend[trio] - Exception: ...
    ============================== 2 errors in 0.10s ===============================
    pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
    __________________________________________________________________ test_cancel_scope_in_asyncgen_fixture ___________________________________________________________________
    /home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/tests/test_pytest_plugin.py:202: in test_cancel_scope_in_asyncgen_fixture
        result.assert_outcomes(passed=len(get_all_backends()))
    E   AssertionError: assert {'errors': 2,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
    E     Omitting 4 identical items, use -vv to show
    E     Differing items:
    E     {'errors': 2} != {'errors': 0}
    E     {'passed': 0} != {'passed': 2}
    E     Use -v to get the full diff
    --------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
    ============================= test session starts ==============================
    platform linux -- Python 3.8.11, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
    cachedir: .pytest_cache
    benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
    hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/.hypothesis/examples')
    rootdir: /tmp/pytest-of-tkloczko/pytest-17/test_cancel_scope_in_asyncgen_fixture0
    plugins: anyio-3.3.0, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, toolbox-0.5, aiohttp-0.3.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, pyfakefs-4.5.0, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, hypothesis-6.14.4, Faker-8.10.3, xprocess-0.18.1, black-0.3.12, checkdocs-2.7.1
    collecting ... collected 2 items
    
    test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture[asyncio] ERROR [ 50%]
    test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture[trio] ERROR [100%]
    
    ==================================== ERRORS ====================================
    __________ ERROR at setup of test_cancel_in_asyncgen_fixture[asyncio] __________
    
    args = (), kwargs = {}
    request = <SubRequest 'asyncgen_fixture' for <Function test_cancel_in_asyncgen_fixture[asyncio]>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    ___________ ERROR at setup of test_cancel_in_asyncgen_fixture[trio] ____________
    
    args = (), kwargs = {}
    request = <SubRequest 'asyncgen_fixture' for <Function test_cancel_in_asyncgen_fixture[trio]>>
    
        def wrapper(*args, **kwargs):  # type: ignore
            request = kwargs["request"]
            if strip_request:
                del kwargs["request"]
    
            # if neither the fixture nor the test use the 'loop' fixture,
            # 'getfixturevalue' will fail because the test is not parameterized
            # (this can be removed someday if 'loop' is no longer parameterized)
            if "loop" not in request.fixturenames:
    >           raise Exception(
                    "Asynchronous fixtures must depend on the 'loop' fixture or "
                    "be used in tests depending from it."
                )
    E           Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
    
    /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
    =========================== short test summary info ============================
    ERROR test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture[asyncio]
    ERROR test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture[trio]
    ============================== 2 errors in 0.10s ===============================
    pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
    ========================================================================= short test summary info ==========================================================================
    SKIPPED [1] tests/test_fileio.py:119: Drive only makes sense on Windows
    SKIPPED [1] tests/test_fileio.py:159: Only makes sense on Windows
    SKIPPED [3] tests/test_fileio.py:318: os.lchmod() is not available
    SKIPPED [1] tests/test_taskgroups.py:967: Cancel messages are only supported on py3.9+
    FAILED tests/test_debugging.py::test_main_task_name[asyncio] - DeprecationWarning: Importing 'itsdangerous.json' is deprecated and will be removed in ItsDangerous 2.1. U...
    FAILED tests/test_debugging.py::test_main_task_name[asyncio+uvloop] - DeprecationWarning: Importing 'itsdangerous.json' is deprecated and will be removed in ItsDangerous...
    FAILED tests/test_pytest_plugin.py::test_plugin - AssertionError: assert {'errors': 6,...pped': 0, ...} == {'errors': 0,...pped': 2, ...}
    FAILED tests/test_pytest_plugin.py::test_asyncio - AssertionError: assert {'errors': 3,...pped': 0, ...} == {'errors': 2,...pped': 0, ...}
    FAILED tests/test_pytest_plugin.py::test_autouse_async_fixture - AssertionError: assert {'errors': 2,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
    FAILED tests/test_pytest_plugin.py::test_cancel_scope_in_asyncgen_fixture - AssertionError: assert {'errors': 2,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
    ================================================================ 6 failed, 1230 passed, 6 skipped in 35.12s ================================================================
    pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
    
    opened by kloczek 22
  • Handle missing trio gracefully

    Handle missing trio gracefully

    Make it possible to use anyio without actually having to install trio. This involves modifying get_all_backends() to only return the backends that are actually present, and teaching the tests to skip trio if it is not importable.
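    The idea can be sketched in a few lines (hypothetical helper name, not AnyIO's actual implementation): report only the backends whose backing library can actually be imported.

    ```python
    import importlib.util

    # Hypothetical sketch: filter the known backends down to those whose
    # backing library is importable. asyncio ships with CPython, so in
    # practice only trio can be missing.
    KNOWN_BACKENDS = ("asyncio", "trio")

    def get_available_backends() -> tuple:
        return tuple(
            name for name in KNOWN_BACKENDS
            if importlib.util.find_spec(name) is not None
        )

    print(get_available_backends())
    ```

    Tests parameterized over this list then simply never generate a trio variant on systems where trio is absent, instead of erroring at import time.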

    opened by mgorny 19
  • I get

    I get "anyio._backends._asyncio.ExceptionGroup: 0 exceptions were raised in the task group"

    The following code reproduces the issue. Sorry about the length of it.

    from __future__ import annotations
    
    from contextlib import suppress
    from pathlib import Path
    from subprocess import STDOUT, CalledProcessError
    from typing import Optional, Sequence, Union
    
    import anyio
    from anyio.abc import Process
    from anyio.streams.file import FileReadStream
    from anyio.streams.text import TextReceiveStream
    
    
    async def run_process(
        # tg: TaskGroup,
        command: Union[str, Sequence[str]],
        *,
        input_for_stdin: Optional[Path] = None,
        raise_on_rc: Optional[bool] = None,
    ) -> None:
        """Run the given command as a foreground process.
    
        Unlike `anyio.run_process`, this streams data to/from the process while the
        process runs. This way, you can see the process' output while it's running.
        Useful for long-running processes.
        """
        process: Optional[Process] = None
        try:
            process = await anyio.open_process(command, stderr=STDOUT)
            await drain_streams(process, input_for_stdin)
        except BaseException:
            if process is not None:
                # Try to gracefully terminate the process
                process.terminate()
                # Give the process some time to stop
                with anyio.move_on_after(5, shield=True):
                    await drain_streams(process)
            raise
        finally:
            if process is not None:
                # We tried to be graceful. Now there is no mercy.
                with suppress(ProcessLookupError):
                    process.kill()
                # Close the streams (stdin, stdout, stderr)
                await process.aclose()
    
        assert process.returncode is not None
        # Check the return code (rc)
        if raise_on_rc and process.returncode != 0:
            raise CalledProcessError(process.returncode, command)
    
    
    async def drain_streams(
        process: Process, input_for_stdin: Optional[Path] = None
    ) -> None:
        async with anyio.create_task_group() as tg:
            # In parallel:
            #  * send to stdin
            #  * receive from stdout
            if process.stdin is not None and input_for_stdin is not None:
                tg.start_soon(_send_to_stdin, process, input_for_stdin)
            if process.stdout is not None:
                tg.start_soon(_receive_from_stdout, process)
            # Wait for normal exit
            await process.wait()
    
    
    async def _send_to_stdin(process: Process, input_for_stdin: Path) -> None:
        assert process.stdin is not None
        # Forward data from file to stdin
        async with await FileReadStream.from_path(input_for_stdin) as chunks:
            async for chunk in chunks:
                await process.stdin.send(chunk)
    
    
    async def _receive_from_stdout(process: Process) -> None:
        assert process.stdout is not None
        # Forward data from stdout
        async for string in TextReceiveStream(process.stdout):
            print(string)
    
    
    async def main():
        async with anyio.create_task_group() as tg:
            # Run the process in the "background"
            tg.start_soon(run_process, ("sleep", "10"))
            # We can do something else while the process runs
            print("Sleeping now. Try to press CTRL+C.")
            await anyio.sleep(10)
    
    
    anyio.run(main)
    

    Try to press CTRL+C while it runs. Example stack trace:

    Sleeping now. Try to press CTRL+C.
    ^Cunhandled exception during asyncio.run() shutdown
    task: <Task finished name='__main__.run_process' coro=<run_process() done, defined at /projects/stork/anyio_bug.py:14> exception=<ExceptionGroup: >>
    Traceback (most recent call last):
      File "/usr/local/lib/python3.9/asyncio/runners.py", line 44, in run
        return loop.run_until_complete(main)
      File "/usr/local/lib/python3.9/asyncio/base_events.py", line 629, in run_until_complete
        self.run_forever()
      File "/usr/local/lib/python3.9/asyncio/base_events.py", line 596, in run_forever
        self._run_once()
      File "/usr/local/lib/python3.9/asyncio/base_events.py", line 1854, in _run_once
        event_list = self._selector.select(timeout)
      File "/usr/local/lib/python3.9/selectors.py", line 469, in select
        fd_event_list = self._selector.poll(timeout, max_ev)
    KeyboardInterrupt
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/projects/stork/anyio_bug.py", line 65, in drain_streams
        await process.wait()
      File "/projects/stork/.venv/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 825, in wait
        return await self._process.wait()
      File "/usr/local/lib/python3.9/asyncio/subprocess.py", line 135, in wait
        return await self._transport._wait()
      File "/usr/local/lib/python3.9/asyncio/base_subprocess.py", line 235, in _wait
        return await waiter
    asyncio.exceptions.CancelledError
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/projects/stork/anyio_bug.py", line 30, in run_process
        await drain_streams(process, input_for_stdin)
      File "/projects/stork/anyio_bug.py", line 65, in drain_streams
        await process.wait()
      File "/projects/stork/.venv/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 526, in __aexit__
        raise ExceptionGroup(exceptions)
    anyio._backends._asyncio.ExceptionGroup: 0 exceptions were raised in the task group:
    ----------------------------
    
    Traceback (most recent call last):
      File "/projects/stork/anyio_bug.py", line 92, in <module>
        anyio.run(main)
      File "/projects/stork/.venv/lib/python3.9/site-packages/anyio/_core/_eventloop.py", line 55, in run
        return asynclib.run(func, *args, **backend_options)  # type: ignore
      File "/projects/stork/.venv/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 211, in run
        return native_run(wrapper(), debug=debug)
      File "/usr/local/lib/python3.9/asyncio/runners.py", line 44, in run
        return loop.run_until_complete(main)
      File "/usr/local/lib/python3.9/asyncio/base_events.py", line 629, in run_until_complete
        self.run_forever()
      File "/usr/local/lib/python3.9/asyncio/base_events.py", line 596, in run_forever
        self._run_once()
      File "/usr/local/lib/python3.9/asyncio/base_events.py", line 1854, in _run_once
        event_list = self._selector.select(timeout)
      File "/usr/local/lib/python3.9/selectors.py", line 469, in select
        fd_event_list = self._selector.poll(timeout, max_ev)
    KeyboardInterrupt
    

    The interesting part to me is the `anyio._backends._asyncio.ExceptionGroup: 0 exceptions were raised in the task group` message. Why does anyio raise an ExceptionGroup with zero exceptions in it?

    If I'm doing something that I'm not supposed to, let me know. :)

    bug asyncio 
    opened by frederikaalund 19
  • ModuleNotFoundError: No module named 'anyio._backends'

    ModuleNotFoundError: No module named 'anyio._backends'

    I'm using asks under trio, and I'm compiling it to an .exe file with PyInstaller. Every time asks is run on the compiled .exe it returns the following error:

      File "site-packages\anyio\__init__.py", line 94, in _get_asynclib
    KeyError: 'anyio._backends._trio'
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "myscript.py", line 324, in myfunction
      File "site-packages\asks\base_funcs.py", line 30, in request
      File "site-packages\asks\sessions.py", line 198, in request
      File "site-packages\asks\sessions.py", line 365, in sema
      File "site-packages\anyio\__init__.py", line 698, in create_semaphore
      File "site-packages\anyio\__init__.py", line 96, in _get_asynclib
      File "importlib\__init__.py", line 127, in import_module
      File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
      File "<frozen importlib._bootstrap>", line 991, in _find_and_load
      File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
      File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
      File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
      File "<frozen importlib._bootstrap>", line 991, in _find_and_load
      File "<frozen importlib._bootstrap>", line 973, in _find_and_load_unlocked
    ModuleNotFoundError: No module named 'anyio._backends'
    
    • Platform: Windows-10-10.0.19041-SP0
    • PyInstaller: 3.6
    • Python: 3.8.0
    • Trio: 0.16.0
    • Asks: 2.4.8
    • Anyio 1.4.0

    It's already imported in the main function. I tried adding it as a hidden import in a hook file and also manually when compiling, but the result is the same.

    Is anyio not compatible with PyInstaller? Or am I not seeing something?
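    A common workaround for frozen applications is to declare the dynamically imported backend modules as hidden imports so PyInstaller bundles them. The module names below are taken from the traceback above, but the hook itself is a sketch, not an official AnyIO hook:

    ```python
    # hook-anyio.py -- place in a directory passed via --additional-hooks-dir
    # Sketch of a PyInstaller hook: AnyIO loads its backend modules
    # dynamically via importlib, which PyInstaller's static import
    # analysis cannot see, so they must be listed explicitly.
    hiddenimports = [
        "anyio._backends._asyncio",
        "anyio._backends._trio",
    ]
    ```

    Passing `--hidden-import anyio._backends._trio` on the PyInstaller command line should have the same effect.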

    opened by g0per 19
  • Logging to stdout from anyio worker processes

    Logging to stdout from anyio worker processes

    We wanted to use anyio.to_process.run_sync in order to execute CPU intensive code in worker processes, and we want to emit logs from the code executed by the worker processes. AnyIO uses the stdout/stderr of the worker processes for internal purposes to communicate with the parent process, which means we need another way to write to main process stdout from the worker process without race conditions, or an alternative way to write logs from inside Docker.

    Alternatives examined without changes to AnyIO

    1. os.dup the parent process's stdout file descriptor and pass it to the worker process. This does not work as expected, since there can be race conditions when the parent and worker processes write logs to the parent's stdout at the same time, even via different file descriptors.
    2. Write to Docker logs without writing to stdout/stderr – we didn't find a way to do this; if you know of one, that would be appreciated.

    Change proposal to AnyIO

    Instead of hard-coding stdout/stdin as the communication mechanism between the parent and child processes, allow AnyIO users to opt into another piping mechanism, so that stdout remains usable in the child process, e.g. for logging purposes inside Docker.
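    The pattern being proposed can be illustrated with the stdlib `multiprocessing` module (this is a sketch of the general idea, not AnyIO's actual mechanism): results travel over a dedicated pipe, leaving the child's stdout free for logging.

    ```python
    import multiprocessing as mp

    def worker(conn, x):
        # stdout is untouched by the IPC channel, so logging here is safe
        print(f"worker: computing {x} squared")
        conn.send(x * x)  # results go over a dedicated pipe instead
        conn.close()

    if __name__ == "__main__":
        parent_conn, child_conn = mp.Pipe()
        proc = mp.Process(target=worker, args=(child_conn, 7))
        proc.start()
        result = parent_conn.recv()
        proc.join()
        print(result)  # 49
    ```

    In a Docker context this means the child's `print` output reaches the container logs normally, while the parent still receives structured results over the pipe.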

    opened by EldarSehayekZenity 2
  • On `trio` backend, `CapacityLimiter` does not take keyword arguments

    On `trio` backend, `CapacityLimiter` does not take keyword arguments

    MWE:

    import anyio
    
    async def test():
        lim = anyio.CapacityLimiter(total_tokens=1)
        async with lim:
            await anyio.sleep(0.5)
            
    with anyio.start_blocking_portal(backend="trio") as portal:
        portal.start_task_soon(test).result()
    

    fails with TypeError: CapacityLimiter.__init__() got an unexpected keyword argument 'total_tokens'.

    This doesn't happen when supplying the tokens as positional arguments or when using the asyncio backend.
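    The failure mode can be reproduced with plain Python (the classes below are hypothetical stand-ins, not the real backend classes): if one implementation declares the parameter positional-only (PEP 570), keyword calls break while positional calls work on both. Passing the token count positionally is therefore the portable workaround.

    ```python
    class AsyncioStyleLimiter:
        # stand-in: accepts the argument by keyword or position
        def __init__(self, total_tokens: float) -> None:
            self.total_tokens = total_tokens

    class TrioStyleLimiter:
        # stand-in: behaves as if the parameter were positional-only
        def __init__(self, total_tokens: float, /) -> None:
            self.total_tokens = total_tokens

    AsyncioStyleLimiter(total_tokens=1)   # ok
    TrioStyleLimiter(1)                   # ok -- positional works everywhere
    try:
        TrioStyleLimiter(total_tokens=1)  # TypeError, mirroring the report
    except TypeError as exc:
        print(type(exc).__name__)
    ```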

    opened by burnpanck 0
  • `fail_after` deadline is set on initialization not context entry

    `fail_after` deadline is set on initialization not context entry

    Discussed at https://gitter.im/python-trio/AnyIO?at=63ac6d617de82d261600ea24

    When using a fail_after context, the deadline is set at "initialization" time rather than __enter__. See https://github.com/agronholm/anyio/blob/0cbb84bfadd9078c5dad63bab43907ed0dd555a1/src/anyio/_core/_tasks.py#L112-L114

    import anyio
    
    async def main():
        ctx = anyio.fail_after(5)
        await anyio.sleep(5)
        with ctx:
            for i in range(1, 6):
                print(i)
                await anyio.sleep(1)
    
    anyio.run(main)
    
    ❯ python example.py
    1
    Traceback (most recent call last):
      File "/Users/mz/dev/prefect/example.py", line 168, in main
        await anyio.sleep(1)
      File "/opt/homebrew/Caskroom/miniconda/base/envs/orion-dev-39/lib/python3.9/site-packages/anyio/_core/_eventloop.py", line 83, in sleep
        return await get_asynclib().sleep(delay)
      File "/opt/homebrew/Caskroom/miniconda/base/envs/orion-dev-39/lib/python3.9/asyncio/tasks.py", line 652, in sleep
        return await future
    asyncio.exceptions.CancelledError
    

    Since this is a context manager, the user expectation is that the timer starts when the context is entered.
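    The expected semantics can be sketched with a toy context manager (not AnyIO's implementation) that computes its deadline in `__enter__` rather than `__init__`, so time spent between creating the context and entering it is not charged against the timeout:

    ```python
    import time

    class FailAfter:
        """Toy timeout context: the deadline starts at __enter__."""

        def __init__(self, seconds: float) -> None:
            self.seconds = seconds
            self.deadline = None

        def __enter__(self) -> "FailAfter":
            # Computing the deadline here instead of in __init__ means
            # any delay before entry does not eat into the timeout.
            self.deadline = time.monotonic() + self.seconds
            return self

        def __exit__(self, *exc) -> bool:
            return False

    ctx = FailAfter(5)
    time.sleep(0.1)  # would eat into the timeout if set at creation time
    with ctx:
        remaining = ctx.deadline - time.monotonic()
    print(remaining > 4.5)  # True: nearly the full 5 seconds remain
    ```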

    opened by madkinsz 1
  • Fix `TASK_STATUS_IGNORED`'s generic inheritance

    Fix `TASK_STATUS_IGNORED`'s generic inheritance

    the following currently fails type checking on the 4.0 dev branch:

    from anyio import TASK_STATUS_IGNORED
    from anyio.abc import TaskStatus
    
    
    async def foo(*, task_status: TaskStatus[None] = TASK_STATUS_IGNORED) -> None:
        # error: Incompatible default for argument "task_status" (default has type "_IgnoredTaskStatus", argument has type "TaskStatus[None]")  [assignment]
        raise NotImplementedError()
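    One plausible shape of the fix (a sketch with simplified stand-in classes, not the actual patch) is to have the sentinel's class subclass `TaskStatus[Any]`, since `TaskStatus[Any]` is assignable to `TaskStatus[T]` for any `T`:

    ```python
    from typing import Any, Generic, TypeVar

    T_contra = TypeVar("T_contra", contravariant=True)

    class TaskStatus(Generic[T_contra]):
        # simplified stand-in for anyio.abc.TaskStatus
        def started(self, value: Any = None) -> None:
            raise NotImplementedError

    class _IgnoredTaskStatus(TaskStatus[Any]):
        # Inheriting from TaskStatus[Any] makes the sentinel compatible
        # with TaskStatus[None] (and every other parametrization), so the
        # default-argument assignment error goes away.
        def started(self, value: Any = None) -> None:
            pass

    TASK_STATUS_IGNORED = _IgnoredTaskStatus()
    ```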
    
    opened by gschaffner 0
  • Add opt-in fuller typing for callables accepting `(func, *args, ...)`

    Add opt-in fuller typing for callables accepting `(func, *args, ...)`

    This follows up the discussion in #491. It uses trio_typing.plugin to add opt-in fuller typing to callables accepting (func, *args, unrelated_kwarg0, ...). See versionhistory.rst for more detailed info.

    All of the errors that Mypy currently produces when checking AnyIO with this PR are due to either #510 or https://github.com/python/mypy/issues/14337, I believe.

    Even with this Mypy bug, this PR is quite useful as-is for typed AnyIO-dependent projects, but the bug does limit its usefulness to (for the most part) non-generic functions.

    If there is interest in this work, I'd be happy to add some typing tests to this PR.

    opened by gschaffner 1
  • 3.6.2: pytest is failing in many units with `deprecated since Trio 0.22.0`

    3.6.2: pytest is failing in many units with `deprecated since Trio 0.22.0`

    I'm packaging your module as an rpm package, so I'm using the typical PEP 517-based build, install and test cycle used when building packages from a non-root account:

    • python3 -sBm build -w --no-isolation
    • because I'm calling build with --no-isolation, only locally installed modules are used throughout the process
    • install the .whl file in </install/prefix>
    • run pytest with PYTHONPATH pointing to sitearch and sitelib inside </install/prefix>

    Here is list of installed modules in build env

    Package                       Version
    ----------------------------- -----------------
    alabaster                     0.7.12
    appdirs                       1.4.4
    asn1crypto                    1.5.1
    async-generator               1.10
    attrs                         22.1.0
    Babel                         2.11.0
    bcrypt                        3.2.2
    Brlapi                        0.8.3
    build                         0.9.0
    cffi                          1.15.1
    charset-normalizer            3.0.1
    contourpy                     1.0.6
    cryptography                  38.0.4
    cssselect                     1.1.0
    cycler                        0.11.0
    distro                        1.8.0
    dnspython                     2.2.1
    docutils                      0.19
    exceptiongroup                1.0.0
    extras                        1.0.0
    fixtures                      4.0.0
    fonttools                     4.38.0
    gpg                           1.17.1-unknown
    hypothesis                    6.58.2
    idna                          3.4
    imagesize                     1.4.1
    importlib-metadata            5.1.0
    iniconfig                     1.1.1
    Jinja2                        3.1.2
    kiwisolver                    1.4.4
    libcomps                      0.1.19
    louis                         3.23.0
    lxml                          4.9.1
    MarkupSafe                    2.1.1
    matplotlib                    3.6.2
    numpy                         1.23.1
    olefile                       0.46
    outcome                       1.1.0
    packaging                     21.3
    pbr                           5.9.0
    pep517                        0.13.0
    Pillow                        9.3.0
    pip                           22.3.1
    pluggy                        1.0.0
    ply                           3.11
    pyasn1                        0.4.8
    pyasn1-modules                0.2.8
    pycparser                     2.21
    Pygments                      2.13.0
    PyGObject                     3.42.2
    pyparsing                     3.0.9
    pytest                        7.2.0
    pytest-mock                   3.10.0
    python-dateutil               2.8.2
    pytz                          2022.4
    PyYAML                        6.0
    requests                      2.28.1
    rpm                           4.17.0
    scour                         0.38.2
    setuptools                    65.6.3
    setuptools-scm                7.0.5
    six                           1.16.0
    sniffio                       1.2.0
    snowballstemmer               2.2.0
    sortedcontainers              2.4.0
    Sphinx                        5.3.0
    sphinx_autodoc_typehints      1.19.4
    sphinx-rtd-theme              1.1.1
    sphinxcontrib-applehelp       1.0.2.dev20221204
    sphinxcontrib-devhelp         1.0.2.dev20221204
    sphinxcontrib-htmlhelp        2.0.0
    sphinxcontrib-jsmath          1.0.1.dev20221204
    sphinxcontrib-qthelp          1.0.3.dev20221204
    sphinxcontrib-serializinghtml 1.1.5
    testtools                     2.5.0
    tomli                         2.0.1
    tpm2-pkcs11-tools             1.33.7
    tpm2-pytss                    1.1.0
    trio                          0.21.0+dev
    trustme                       0.9.0
    typing_extensions             4.4.0
    urllib3                       1.26.12
    uvloop                        0.16.0
    wheel                         0.38.4
    zipp                          3.11.0
    
    opened by kloczek 2
Owner
Alex Grönholm