This is the normal build, install, and test cycle used when building the package from a non-root account:
- "setup.py build"
- "setup.py install --root </install/prefix>"
- "pytest with PYTHONPATH pointing to setearch and sitelib inside </install/prefix>
+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.0-2.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.0-2.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.11, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /home/tkloczko/rpmbuild/BUILD/anyio-3.3.0, configfile: pyproject.toml, testpaths: tests
plugins: anyio-3.3.0, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, asyncio-0.15.1, toolbox-0.5, aiohttp-0.3.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, pyfakefs-4.5.0, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, hypothesis-6.14.4, Faker-8.10.3, xprocess-0.18.1, black-0.3.12, checkdocs-2.7.1
collected 1242 items
tests/test_compat.py ....................................................................................... [ 7%]
tests/test_debugging.py FF..................... [ 8%]
tests/test_eventloop.py ......... [ 9%]
tests/test_fileio.py .........................s...........s...........................................................sss........................................... [ 21%]
.................... [ 22%]
tests/test_from_thread.py ............................................................................ [ 28%]
tests/test_lowlevel.py ........................... [ 30%]
tests/test_pytest_plugin.py FFFF.. [ 31%]
tests/test_signals.py ......... [ 32%]
tests/test_sockets.py .............................................................................................................................................. [ 43%]
.................................................................................................................................................................... [ 56%]
..................... [ 58%]
tests/test_subprocesses.py .................. [ 59%]
tests/test_synchronization.py ................................................................................................... [ 67%]
tests/test_taskgroups.py ........................................................................................................................................... [ 79%]
.....................................s [ 82%]
tests/test_to_process.py ..................... [ 83%]
tests/test_to_thread.py ........................ [ 85%]
tests/streams/test_buffered.py ............ [ 86%]
tests/streams/test_file.py .............................. [ 89%]
tests/streams/test_memory.py ................................................................. [ 94%]
tests/streams/test_stapled.py .................. [ 95%]
tests/streams/test_text.py ............... [ 97%]
tests/streams/test_tls.py .................................... [100%]
================================================================================= FAILURES =================================================================================
_______________________________________________________________________ test_main_task_name[asyncio] _______________________________________________________________________
tests/test_debugging.py:37: in test_main_task_name
for loop in [obj for obj in gc.get_objects()
tests/test_debugging.py:38: in <listcomp>
if isinstance(obj, asyncio.AbstractEventLoop)]:
/usr/lib/python3.8/site-packages/itsdangerous/_json.py:24: in __getattribute__
warnings.warn(
E DeprecationWarning: Importing 'itsdangerous.json' is deprecated and will be removed in ItsDangerous 2.1. Use Python's 'json' module instead.
___________________________________________________________________ test_main_task_name[asyncio+uvloop] ____________________________________________________________________
tests/test_debugging.py:37: in test_main_task_name
for loop in [obj for obj in gc.get_objects()
tests/test_debugging.py:38: in <listcomp>
if isinstance(obj, asyncio.AbstractEventLoop)]:
/usr/lib/python3.8/site-packages/itsdangerous/_json.py:24: in __getattribute__
warnings.warn(
E DeprecationWarning: Importing 'itsdangerous.json' is deprecated and will be removed in ItsDangerous 2.1. Use Python's 'json' module instead.
_______________________________________________________________________________ test_plugin ________________________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/tests/test_pytest_plugin.py:65: in test_plugin
result.assert_outcomes(passed=3 * len(get_all_backends()), skipped=len(get_all_backends()))
E AssertionError: assert {'errors': 6,...pped': 0, ...} == {'errors': 0,...pped': 2, ...}
E Omitting 3 identical items, use -vv to show
E Differing items:
E {'skipped': 0} != {'skipped': 2}
E {'errors': 6} != {'errors': 0}
E {'passed': 2} != {'passed': 6}
E Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.11, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/.hypothesis/examples')
rootdir: /tmp/pytest-of-tkloczko/pytest-17/test_plugin0
plugins: anyio-3.3.0, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, toolbox-0.5, aiohttp-0.3.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, pyfakefs-4.5.0, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, hypothesis-6.14.4, Faker-8.10.3, xprocess-0.18.1, black-0.3.12, checkdocs-2.7.1
collecting ... collected 8 items
test_plugin.py::test_marked_test[asyncio] PASSED [ 12%]
test_plugin.py::test_marked_test[trio] PASSED [ 25%]
test_plugin.py::test_async_fixture_from_marked_test[asyncio] ERROR [ 37%]
test_plugin.py::test_async_fixture_from_marked_test[trio] ERROR [ 50%]
test_plugin.py::test_async_fixture_from_sync_test[asyncio] ERROR [ 62%]
test_plugin.py::test_async_fixture_from_sync_test[trio] ERROR [ 75%]
test_plugin.py::test_skip_inline[asyncio] ERROR [ 87%]
test_plugin.py::test_skip_inline[trio] ERROR [100%]
==================================== ERRORS ====================================
________ ERROR at setup of test_async_fixture_from_marked_test[asyncio] ________
args = (), kwargs = {}
request = <SubRequest 'async_fixture' for <Function test_async_fixture_from_marked_test[asyncio]>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
_________ ERROR at setup of test_async_fixture_from_marked_test[trio] __________
args = (), kwargs = {}
request = <SubRequest 'async_fixture' for <Function test_async_fixture_from_marked_test[trio]>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
_________ ERROR at setup of test_async_fixture_from_sync_test[asyncio] _________
args = (), kwargs = {}
request = <SubRequest 'async_fixture' for <Function test_async_fixture_from_sync_test[asyncio]>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
__________ ERROR at setup of test_async_fixture_from_sync_test[trio] ___________
args = (), kwargs = {}
request = <SubRequest 'async_fixture' for <Function test_async_fixture_from_sync_test[trio]>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
_________________ ERROR at setup of test_skip_inline[asyncio] __________________
args = (), kwargs = {}
request = <SubRequest 'some_feature' for <Function test_skip_inline[asyncio]>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
___________________ ERROR at setup of test_skip_inline[trio] ___________________
args = (), kwargs = {}
request = <SubRequest 'some_feature' for <Function test_skip_inline[trio]>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
=========================== short test summary info ============================
ERROR test_plugin.py::test_async_fixture_from_marked_test[asyncio] - Exceptio...
ERROR test_plugin.py::test_async_fixture_from_marked_test[trio] - Exception: ...
ERROR test_plugin.py::test_async_fixture_from_sync_test[asyncio] - Exception:...
ERROR test_plugin.py::test_async_fixture_from_sync_test[trio] - Exception: As...
ERROR test_plugin.py::test_skip_inline[asyncio] - Exception: Asynchronous fix...
ERROR test_plugin.py::test_skip_inline[trio] - Exception: Asynchronous fixtur...
========================= 2 passed, 6 errors in 0.14s ==========================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
_______________________________________________________________________________ test_asyncio _______________________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/tests/test_pytest_plugin.py:138: in test_asyncio
result.assert_outcomes(passed=2, failed=1, errors=2)
E AssertionError: assert {'errors': 3,...pped': 0, ...} == {'errors': 2,...pped': 0, ...}
E Omitting 4 identical items, use -vv to show
E Differing items:
E {'errors': 3} != {'errors': 2}
E {'passed': 0} != {'passed': 2}
E Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.11, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/.hypothesis/examples')
rootdir: /tmp/pytest-of-tkloczko/pytest-17/test_asyncio0
plugins: anyio-3.3.0, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, toolbox-0.5, aiohttp-0.3.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, pyfakefs-4.5.0, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, hypothesis-6.14.4, Faker-8.10.3, xprocess-0.18.1, black-0.3.12, checkdocs-2.7.1
collecting ... collected 4 items
test_asyncio.py::TestClassFixtures::test_class_fixture_in_test_method ERROR [ 25%]
test_asyncio.py::test_callback_exception_during_test FAILED [ 50%]
test_asyncio.py::test_callback_exception_during_setup ERROR [ 75%]
test_asyncio.py::test_callback_exception_during_teardown ERROR [100%]
==================================== ERRORS ====================================
____ ERROR at setup of TestClassFixtures.test_class_fixture_in_test_method _____
args = (), kwargs = {'anyio_backend': 'asyncio'}
request = <SubRequest 'async_class_fixture' for <Function test_class_fixture_in_test_method>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
____________ ERROR at setup of test_callback_exception_during_setup ____________
args = (), kwargs = {}
request = <SubRequest 'setup_fail_fixture' for <Function test_callback_exception_during_setup>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
__________ ERROR at setup of test_callback_exception_during_teardown ___________
args = (), kwargs = {}
request = <SubRequest 'teardown_fail_fixture' for <Function test_callback_exception_during_teardown>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
=================================== FAILURES ===================================
_____________________ test_callback_exception_during_test ______________________
def callback():
nonlocal started
started = True
> raise Exception('foo')
E Exception: foo
test_asyncio.py:22: Exception
=========================== short test summary info ============================
FAILED test_asyncio.py::test_callback_exception_during_test - Exception: foo
ERROR test_asyncio.py::TestClassFixtures::test_class_fixture_in_test_method
ERROR test_asyncio.py::test_callback_exception_during_setup - Exception: Asyn...
ERROR test_asyncio.py::test_callback_exception_during_teardown - Exception: A...
========================= 1 failed, 3 errors in 0.12s ==========================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
________________________________________________________________________ test_autouse_async_fixture ________________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/tests/test_pytest_plugin.py:175: in test_autouse_async_fixture
result.assert_outcomes(passed=len(get_all_backends()))
E AssertionError: assert {'errors': 2,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E Omitting 4 identical items, use -vv to show
E Differing items:
E {'errors': 2} != {'errors': 0}
E {'passed': 0} != {'passed': 2}
E Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.11, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/.hypothesis/examples')
rootdir: /tmp/pytest-of-tkloczko/pytest-17/test_autouse_async_fixture0
plugins: anyio-3.3.0, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, toolbox-0.5, aiohttp-0.3.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, pyfakefs-4.5.0, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, hypothesis-6.14.4, Faker-8.10.3, xprocess-0.18.1, black-0.3.12, checkdocs-2.7.1
collecting ... collected 2 items
test_autouse_async_fixture.py::test_autouse_backend[asyncio] ERROR [ 50%]
test_autouse_async_fixture.py::test_autouse_backend[trio] ERROR [100%]
==================================== ERRORS ====================================
_______________ ERROR at setup of test_autouse_backend[asyncio] ________________
args = (), kwargs = {'anyio_backend_name': 'asyncio'}
request = <SubRequest 'autouse_async_fixture' for <Function test_autouse_backend[asyncio]>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
_________________ ERROR at setup of test_autouse_backend[trio] _________________
args = (), kwargs = {'anyio_backend_name': 'trio'}
request = <SubRequest 'autouse_async_fixture' for <Function test_autouse_backend[trio]>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
=========================== short test summary info ============================
ERROR test_autouse_async_fixture.py::test_autouse_backend[asyncio] - Exceptio...
ERROR test_autouse_async_fixture.py::test_autouse_backend[trio] - Exception: ...
============================== 2 errors in 0.10s ===============================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
__________________________________________________________________ test_cancel_scope_in_asyncgen_fixture ___________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/tests/test_pytest_plugin.py:202: in test_cancel_scope_in_asyncgen_fixture
result.assert_outcomes(passed=len(get_all_backends()))
E AssertionError: assert {'errors': 2,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E Omitting 4 identical items, use -vv to show
E Differing items:
E {'errors': 2} != {'errors': 0}
E {'passed': 0} != {'passed': 2}
E Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.11, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/tkloczko/rpmbuild/BUILD/anyio-3.3.0/.hypothesis/examples')
rootdir: /tmp/pytest-of-tkloczko/pytest-17/test_cancel_scope_in_asyncgen_fixture0
plugins: anyio-3.3.0, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, toolbox-0.5, aiohttp-0.3.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, pyfakefs-4.5.0, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, hypothesis-6.14.4, Faker-8.10.3, xprocess-0.18.1, black-0.3.12, checkdocs-2.7.1
collecting ... collected 2 items
test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture[asyncio] ERROR [ 50%]
test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture[trio] ERROR [100%]
==================================== ERRORS ====================================
__________ ERROR at setup of test_cancel_in_asyncgen_fixture[asyncio] __________
args = (), kwargs = {}
request = <SubRequest 'asyncgen_fixture' for <Function test_cancel_in_asyncgen_fixture[asyncio]>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
___________ ERROR at setup of test_cancel_in_asyncgen_fixture[trio] ____________
args = (), kwargs = {}
request = <SubRequest 'asyncgen_fixture' for <Function test_cancel_in_asyncgen_fixture[trio]>>
def wrapper(*args, **kwargs): # type: ignore
request = kwargs["request"]
if strip_request:
del kwargs["request"]
# if neither the fixture nor the test use the 'loop' fixture,
# 'getfixturevalue' will fail because the test is not parameterized
# (this can be removed someday if 'loop' is no longer parameterized)
if "loop" not in request.fixturenames:
> raise Exception(
"Asynchronous fixtures must depend on the 'loop' fixture or "
"be used in tests depending from it."
)
E Exception: Asynchronous fixtures must depend on the 'loop' fixture or be used in tests depending from it.
/usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py:84: Exception
=========================== short test summary info ============================
ERROR test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture[asyncio]
ERROR test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture[trio]
============================== 2 errors in 0.10s ===============================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
========================================================================= short test summary info ==========================================================================
SKIPPED [1] tests/test_fileio.py:119: Drive only makes sense on Windows
SKIPPED [1] tests/test_fileio.py:159: Only makes sense on Windows
SKIPPED [3] tests/test_fileio.py:318: os.lchmod() is not available
SKIPPED [1] tests/test_taskgroups.py:967: Cancel messages are only supported on py3.9+
FAILED tests/test_debugging.py::test_main_task_name[asyncio] - DeprecationWarning: Importing 'itsdangerous.json' is deprecated and will be removed in ItsDangerous 2.1. U...
FAILED tests/test_debugging.py::test_main_task_name[asyncio+uvloop] - DeprecationWarning: Importing 'itsdangerous.json' is deprecated and will be removed in ItsDangerous...
FAILED tests/test_pytest_plugin.py::test_plugin - AssertionError: assert {'errors': 6,...pped': 0, ...} == {'errors': 0,...pped': 2, ...}
FAILED tests/test_pytest_plugin.py::test_asyncio - AssertionError: assert {'errors': 3,...pped': 0, ...} == {'errors': 2,...pped': 0, ...}
FAILED tests/test_pytest_plugin.py::test_autouse_async_fixture - AssertionError: assert {'errors': 2,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_pytest_plugin.py::test_cancel_scope_in_asyncgen_fixture - AssertionError: assert {'errors': 2,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
================================================================ 6 failed, 1230 passed, 6 skipped in 35.12s ================================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
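Judging from the tracebacks, these may not be failures in anyio itself: all of the test_pytest_plugin errors are raised from /usr/lib64/python3.8/site-packages/aiohttp/pytest_plugin.py, whose fixture wrapper intercepts the async fixtures in the spawned test sessions, and the two test_debugging failures are a DeprecationWarning emitted by itsdangerous while the test scans gc.get_objects(). A minimal sketch of how one might isolate the suite from the globally installed plugins; the plugin name "aiohttp" is inferred from the plugins line above, and "-p no:NAME" and "-W" are standard pytest options:

# With PYTHONPATH set as above, disable the aiohttp pytest plugin for this
# run and ignore the unrelated itsdangerous DeprecationWarning.
/usr/bin/pytest -ra -p no:aiohttp -W ignore::DeprecationWarning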