Gentoo's Bugzilla – Attachment 762181 Details for Bug 831214: dev-python/pytest-rerunfailures fails tests
build.log (text/plain), 395.65 KB, created by Matt Turner on 2022-01-14 22:53:39 UTC

Description: build.log
Filename: build.log
MIME Type: text/plain
Creator: Matt Turner
Created: 2022-01-14 22:53:39 UTC
Size: 395.65 KB
 * Package: dev-python/pytest-rerunfailures-10.2
 * Repository: gentoo
 * Maintainer: python@gentoo.org
 * USE: alpha elibc_glibc kernel_linux python_targets_python3_9 test userland_GNU
 * FEATURES: ccache network-sandbox preserve-libs sandbox test userpriv usersandbox
>>> Unpacking source...
>>> Unpacking pytest-rerunfailures-10.2.tar.gz to /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work
>>> Source unpacked in /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work
>>> Preparing source in /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2 ...
>>> Source prepared.
>>> Configuring source in /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2 ...
>>> Source configured.
>>> Compiling source in /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2 ...
 * python3_9: running distutils-r1_run_phase distutils-r1_python_compile
python3.9 setup.py build -j 3
running build
running build_py
copying pytest_rerunfailures.py -> /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2-python3_9/lib
warning: build_py: byte-compiling is disabled, skipping.

>>> Source compiled.
>>> Test phase: dev-python/pytest-rerunfailures-10.2
 * python3_9: running distutils-r1_run_phase python_test
python3.9 -m pytest -vv -ra -l -Wdefault --color=yes
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3.9
cachedir: .pytest_cache
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/.hypothesis/examples')
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collecting ... collected 63 items

test_pytest_rerunfailures.py::test_error_when_run_with_pdb FAILED [ 1%]
test_pytest_rerunfailures.py::test_no_rerun_on_pass PASSED [ 3%]
test_pytest_rerunfailures.py::test_no_rerun_on_skipif_mark PASSED [ 4%]
test_pytest_rerunfailures.py::test_no_rerun_on_skip_call PASSED [ 6%]
test_pytest_rerunfailures.py::test_no_rerun_on_xfail_mark PASSED [ 7%]
test_pytest_rerunfailures.py::test_no_rerun_on_xfail_call PASSED [ 9%]
test_pytest_rerunfailures.py::test_no_rerun_on_xpass PASSED [ 11%]
test_pytest_rerunfailures.py::test_rerun_fails_after_consistent_setup_failure FAILED [ 12%]
test_pytest_rerunfailures.py::test_rerun_passes_after_temporary_setup_failure FAILED [ 14%]
test_pytest_rerunfailures.py::test_rerun_fails_after_consistent_test_failure FAILED [ 15%]
test_pytest_rerunfailures.py::test_rerun_passes_after_temporary_test_failure FAILED [ 17%]
test_pytest_rerunfailures.py::test_rerun_passes_after_temporary_test_crash FAILED [ 19%]
test_pytest_rerunfailures.py::test_rerun_passes_after_temporary_test_failure_with_flaky_mark FAILED [ 20%]
test_pytest_rerunfailures.py::test_reruns_if_flaky_mark_is_called_without_options FAILED [ 22%]
test_pytest_rerunfailures.py::test_reruns_if_flaky_mark_is_called_with_positional_argument FAILED [ 23%]
test_pytest_rerunfailures.py::test_no_extra_test_summary_for_reruns_by_default FAILED [ 25%]
test_pytest_rerunfailures.py::test_extra_test_summary_for_reruns FAILED [ 26%]
test_pytest_rerunfailures.py::test_verbose FAILED [ 28%]
test_pytest_rerunfailures.py::test_no_rerun_on_class_setup_error_without_reruns PASSED [ 30%]
test_pytest_rerunfailures.py::test_rerun_on_class_setup_error_with_reruns FAILED [ 31%]
test_pytest_rerunfailures.py::test_rerun_with_resultslog SKIPPED (--result-log removed in pytest>=6.1) [ 33%]
test_pytest_rerunfailures.py::test_reruns_with_delay[-1] FAILED [ 34%]
test_pytest_rerunfailures.py::test_reruns_with_delay[0] FAILED [ 36%]
test_pytest_rerunfailures.py::test_reruns_with_delay[0.0] FAILED [ 38%]
test_pytest_rerunfailures.py::test_reruns_with_delay[1] FAILED [ 39%]
test_pytest_rerunfailures.py::test_reruns_with_delay[2.5] FAILED [ 41%]
test_pytest_rerunfailures.py::test_reruns_with_delay_marker[-1] FAILED [ 42%]
test_pytest_rerunfailures.py::test_reruns_with_delay_marker[0] FAILED [ 44%]
test_pytest_rerunfailures.py::test_reruns_with_delay_marker[0.0] FAILED [ 46%]
test_pytest_rerunfailures.py::test_reruns_with_delay_marker[1] FAILED [ 47%]
test_pytest_rerunfailures.py::test_reruns_with_delay_marker[2.5] FAILED [ 49%]
test_pytest_rerunfailures.py::test_rerun_on_setup_class_with_error_with_reruns FAILED [ 50%]
test_pytest_rerunfailures.py::test_rerun_on_class_scope_fixture_with_error_with_reruns FAILED [ 52%]
test_pytest_rerunfailures.py::test_rerun_on_module_fixture_with_reruns FAILED [ 53%]
test_pytest_rerunfailures.py::test_rerun_on_session_fixture_with_reruns FAILED [ 55%]
test_pytest_rerunfailures.py::test_execution_count_exposed FAILED [ 57%]
test_pytest_rerunfailures.py::test_rerun_report FAILED [ 58%]
test_pytest_rerunfailures.py::test_pytest_runtest_logfinish_is_called PASSED [ 60%]
test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts0-True] FAILED [ 61%]
test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts1-True] FAILED [ 63%]
test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts2-True] FAILED [ 65%]
test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts3-False] PASSED [ 66%]
test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts4-True] FAILED [ 68%]
test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts5-True] FAILED [ 69%]
test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts6-True] FAILED [ 71%]
test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts7-True] FAILED [ 73%]
test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts8-False] PASSED [ 74%]
test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts9-False] PASSED [ 76%]
test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts10-True] FAILED [ 77%]
test_pytest_rerunfailures.py::test_reruns_with_condition_marker[True-20] FAILED [ 79%]
test_pytest_rerunfailures.py::test_reruns_with_condition_marker[False-00] FAILED [ 80%]
test_pytest_rerunfailures.py::test_reruns_with_condition_marker[True-21] FAILED [ 82%]
test_pytest_rerunfailures.py::test_reruns_with_condition_marker[False-01] FAILED [ 84%]
test_pytest_rerunfailures.py::test_reruns_with_condition_marker[1-2] FAILED [ 85%]
test_pytest_rerunfailures.py::test_reruns_with_condition_marker[0-0] FAILED [ 87%]
test_pytest_rerunfailures.py::test_reruns_with_condition_marker[condition6-2] FAILED [ 88%]
test_pytest_rerunfailures.py::test_reruns_with_condition_marker[condition7-0] FAILED [ 90%]
test_pytest_rerunfailures.py::test_reruns_with_condition_marker[condition8-2] FAILED [ 92%]
test_pytest_rerunfailures.py::test_reruns_with_condition_marker[condition9-0] FAILED [ 93%]
test_pytest_rerunfailures.py::test_reruns_with_condition_marker[None-0] FAILED [ 95%]
test_pytest_rerunfailures.py::test_reruns_with_string_condition[sys.platform.startswith("non-exists") == False-2] FAILED [ 96%]
test_pytest_rerunfailures.py::test_reruns_with_string_condition[os.getpid() != -1-2] FAILED [ 98%]
test_pytest_rerunfailures.py::test_reruns_with_string_condition_with_global_var FAILED [100%]

================================================================ FAILURES =================================================================
______________________________________________________ test_error_when_run_with_pdb _______________________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_error_when_run_with_pdb0')>

    def test_error_when_run_with_pdb(testdir):
        testdir.makepyfile("def test_pass(): pass")
        result = testdir.runpytest("--reruns", "1", "--pdb")
>       result.stderr.fnmatch_lines_random("ERROR: --reruns incompatible with --pdb")
E       Failed: line 'ERROR: --reruns incompatible with --pdb' not found in output

result = <RunResult ret=ExitCode.OK len(stdout.lines)=10 len(stderr.lines)=0 duration=1.27s>
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_error_when_run_with_pdb0')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:71: Failed
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_error_when_run_with_pdb0
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_error_when_run_with_pdb.py . [100%]

============================================================ 1 passed in 0.17s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
_____________________________________________ test_rerun_fails_after_consistent_setup_failure _____________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_fails_after_consistent_setup_failure0')>

    def test_rerun_fails_after_consistent_setup_failure(testdir):
        testdir.makepyfile("def test_pass(): pass")
        testdir.makeconftest(
            """
            def pytest_runtest_setup(item):
                raise Exception('Setup failure')"""
        )
        result = testdir.runpytest("--reruns", "1")
>       assert_outcomes(result, passed=0, error=1, rerun=1)

result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=22 len(stderr.lines)=0 duration=3.61s>
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_fails_after_consistent_setup_failure0')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:154:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:65: in assert_outcomes
    check_outcome_field(outcomes, "rerun", rerun)
        error = 1
        failed = 0
        field = 'errors'
        outcomes = {'errors': 1}
        passed = 0
        rerun = 1
        result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=22 len(stderr.lines)=0 duration=3.61s>
        skipped = 0
        xfailed = 0
        xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'errors': 1}, field_name = 'rerun', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.rerun has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'rerun'
field_value = 0
outcomes = {'errors': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_fails_after_consistent_setup_failure0
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_rerun_fails_after_consistent_setup_failure.py E [100%]

================================================================= ERRORS ==================================================================
_______________________________________________________ ERROR at setup of test_pass _______________________________________________________

item = <Function test_pass>

    def pytest_runtest_setup(item):
>       raise Exception('Setup failure')
E       Exception: Setup failure

conftest.py:2: Exception
========================================================= short test summary info =========================================================
ERROR test_rerun_fails_after_consistent_setup_failure.py::test_pass - Exception: Setup failure
============================================================ 1 error in 0.19s =============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
_____________________________________________ test_rerun_passes_after_temporary_setup_failure _____________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_passes_after_temporary_setup_failure0')>

    def test_rerun_passes_after_temporary_setup_failure(testdir):
        testdir.makepyfile("def test_pass(): pass")
        testdir.makeconftest(
            f"""
            def pytest_runtest_setup(item):
                {temporary_failure()}"""
        )
        result = testdir.runpytest("--reruns", "1", "-r", "R")
>       assert_outcomes(result, passed=1, rerun=1)

result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=26 len(stderr.lines)=0 duration=1.24s>
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_passes_after_temporary_setup_failure0')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:165:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:58: in assert_outcomes
    check_outcome_field(outcomes, "passed", passed)
        error = 0
        failed = 0
        outcomes = {'errors': 1}
        passed = 1
        rerun = 1
        result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=26 len(stderr.lines)=0 duration=1.24s>
        skipped = 0
        xfailed = 0
        xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'errors': 1}, field_name = 'passed', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.passed has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'passed'
field_value = 0
outcomes = {'errors': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_passes_after_temporary_setup_failure0
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_rerun_passes_after_temporary_setup_failure.py E [100%]

================================================================= ERRORS ==================================================================
_______________________________________________________ ERROR at setup of test_pass _______________________________________________________

item = <Function test_pass>

    def pytest_runtest_setup(item):
    
        import py
        path = py.path.local(__file__).dirpath().ensure('test.res')
        count = path.read() or 1
        if int(count) <= 1:
            path.write(int(count) + 1)
>           raise Exception('Failure: {0}'.format(count))
E           Exception: Failure: 1

conftest.py:8: Exception
============================================================ 1 error in 0.19s =============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
_____________________________________________ test_rerun_fails_after_consistent_test_failure ______________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_fails_after_consistent_test_failure0')>

    def test_rerun_fails_after_consistent_test_failure(testdir):
        testdir.makepyfile("def test_fail(): assert False")
        result = testdir.runpytest("--reruns", "1")
>       assert_outcomes(result, passed=0, failed=1, rerun=1)

result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=19 len(stderr.lines)=0 duration=1.22s>
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_fails_after_consistent_test_failure0')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:171:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:65: in assert_outcomes
    check_outcome_field(outcomes, "rerun", rerun)
        error = 0
        failed = 1
        field = 'errors'
        outcomes = {'failed': 1}
        passed = 0
        rerun = 1
        result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=19 len(stderr.lines)=0 duration=1.22s>
        skipped = 0
        xfailed = 0
        xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'failed': 1}, field_name = 'rerun', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.rerun has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'rerun'
field_value = 0
outcomes = {'failed': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_fails_after_consistent_test_failure0
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_rerun_fails_after_consistent_test_failure.py F [100%]

================================================================ FAILURES =================================================================
________________________________________________________________ test_fail ________________________________________________________________

>   def test_fail(): assert False
E   assert False

test_rerun_fails_after_consistent_test_failure.py:1: AssertionError
========================================================= short test summary info =========================================================
FAILED test_rerun_fails_after_consistent_test_failure.py::test_fail - assert False
============================================================ 1 failed in 0.18s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
_____________________________________________ test_rerun_passes_after_temporary_test_failure ______________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_passes_after_temporary_test_failure0')>

    def test_rerun_passes_after_temporary_test_failure(testdir):
        testdir.makepyfile(
            f"""
            def test_pass():
                {temporary_failure()}"""
        )
        result = testdir.runpytest("--reruns", "1", "-r", "R")
>       assert_outcomes(result, passed=1, rerun=1)

result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=24 len(stderr.lines)=0 duration=1.24s>
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_passes_after_temporary_test_failure0')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:181:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:58: in assert_outcomes
    check_outcome_field(outcomes, "passed", passed)
        error = 0
        failed = 0
        outcomes = {'failed': 1}
        passed = 1
        rerun = 1
        result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=24 len(stderr.lines)=0 duration=1.24s>
        skipped = 0
        xfailed = 0
        xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'failed': 1}, field_name = 'passed', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.passed has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'passed'
field_value = 0
outcomes = {'failed': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_passes_after_temporary_test_failure0
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

>test_rerun_passes_after_temporary_test_failure.py F [100%]
>
>================================================================ FAILURES =================================================================
>________________________________________________________________ test_pass ________________________________________________________________
>
>    def test_pass():
>
>        import py
>        path = py.path.local(__file__).dirpath().ensure('test.res')
>        count = path.read() or 1
>        if int(count) <= 1:
>            path.write(int(count) + 1)
>>           raise Exception('Failure: {0}'.format(count))
>E           Exception: Failure: 1
>
>test_rerun_passes_after_temporary_test_failure.py:8: Exception
>============================================================ 1 failed in 0.19s ============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>______________________________________________ test_rerun_passes_after_temporary_test_crash _______________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_passes_after_temporary_test_crash0')>
>
>    @pytest.mark.skipif(not has_xdist, reason="requires xdist with crashitem")
>    def test_rerun_passes_after_temporary_test_crash(testdir):
>        # note: we need two tests because there is a bug where xdist
>        # cannot rerun the last test if it crashes. the bug exists only
>        # in xdist is there is no error that causes the bug in this plugin.
>        testdir.makepyfile(
>            f"""
>            def test_crash():
>                {temporary_crash()}
>
>            def test_pass():
>                pass"""
>        )
>        result = testdir.runpytest("-n", "1", "--reruns", "1", "-r", "R")
>>       assert_outcomes(result, passed=2, rerun=1)
>
>result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=17 len(stderr.lines)=0 duration=39.76s>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_passes_after_temporary_test_crash0')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:198:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:58: in assert_outcomes
>    check_outcome_field(outcomes, "passed", passed)
>        error      = 0
>        failed     = 0
>        outcomes   = {'failed': 1, 'passed': 1}
>        passed     = 2
>        rerun      = 1
>        result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=17 len(stderr.lines)=0 duration=39.76s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'failed': 1, 'passed': 1}, field_name = 'passed', expected_value = 2
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.passed has unexpected value. Expected '2' but got '1'
>E       assert 1 == 2
>E         +1
>E         -2
>
>expected_value = 2
>field_name = 'passed'
>field_value = 1
>outcomes = {'failed': 1, 'passed': 1}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_passes_after_temporary_test_crash0
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>gw0 I
>gw0 [2]
>
>[gw0] node down: Not properly terminated
>F
>replacing crashed worker gw0
>.
>================================================================ FAILURES =================================================================
>_____________________________________________ test_rerun_passes_after_temporary_test_crash.py _____________________________________________
>[gw0] linux -- Python 3.9.9 /usr/bin/python3.9
>worker 'gw0' crashed while running 'test_rerun_passes_after_temporary_test_crash.py::test_crash'
>====================================================== 1 failed, 1 passed in 38.58s =======================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>_____________________________________ test_rerun_passes_after_temporary_test_failure_with_flaky_mark ______________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_passes_after_temporary_test_failure_with_flaky_mark0')>
>
>    def test_rerun_passes_after_temporary_test_failure_with_flaky_mark(testdir):
>        testdir.makepyfile(
>            f"""
>            import pytest
>            @pytest.mark.flaky(reruns=2)
>            def test_pass():
>                {temporary_failure(2)}"""
>        )
>        result = testdir.runpytest("-r", "R")
>>       assert_outcomes(result, passed=1, rerun=2)
>
>result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=29 len(stderr.lines)=0 duration=3.13s>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_passes_after_temporary_test_failure_with_flaky_mark0')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:210:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:58: in assert_outcomes
>    check_outcome_field(outcomes, "passed", passed)
>        error      = 0
>        failed     = 0
>        outcomes   = {'errors': 1}
>        passed     = 1
>        rerun      = 2
>        result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=29 len(stderr.lines)=0 duration=3.13s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'errors': 1}, field_name = 'passed', expected_value = 1
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.passed has unexpected value. Expected '1' but got '0'
>E       assert 0 == 1
>E         +0
>E         -1
>
>expected_value = 1
>field_name = 'passed'
>field_value = 0
>outcomes = {'errors': 1}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_passes_after_temporary_test_failure_with_flaky_mark0
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 1 item
>
>test_rerun_passes_after_temporary_test_failure_with_flaky_mark.py E [100%]
>
>================================================================= ERRORS ==================================================================
>_______________________________________________________ ERROR at setup of test_pass _______________________________________________________
>
>self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_pass>
>
>    def pytest_runtest_setup(self, item):
>        """
>        Pytest hook to modify the test before it's run.
>
>        :param item:
>            The test item.
>        """
>        if not self._has_flaky_attributes(item):
>            if hasattr(item, 'iter_markers'):
>                for marker in item.iter_markers(name='flaky'):
>>                   self._make_test_flaky(item, *marker.args, **marker.kwargs)
>E                   TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
>
>/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
>============================================================= 1 error in 0.58s =============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>___________________________________________ test_reruns_if_flaky_mark_is_called_without_options ___________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0')>
>
>    def test_reruns_if_flaky_mark_is_called_without_options(testdir):
>        testdir.makepyfile(
>            f"""
>            import pytest
>            @pytest.mark.flaky()
>            def test_pass():
>                {temporary_failure(1)}"""
>        )
>        result = testdir.runpytest("-r", "R")
>>       assert_outcomes(result, passed=1, rerun=1)
>
>result = <RunResult ret=ExitCode.OK len(stdout.lines)=19 len(stderr.lines)=0 duration=3.75s>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:222:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:65: in assert_outcomes
>    check_outcome_field(outcomes, "rerun", rerun)
>        error      = 0
>        failed     = 0
>        field      = 'errors'
>        outcomes   = {'passed': 1}
>        passed     = 1
>        rerun      = 1
>        result     = <RunResult ret=ExitCode.OK len(stdout.lines)=19 len(stderr.lines)=0 duration=3.75s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'passed': 1}, field_name = 'rerun', expected_value = 1
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.rerun has unexpected value. Expected '1' but got '0'
>E       assert 0 == 1
>E         +0
>E         -1
>
>expected_value = 1
>field_name = 'rerun'
>field_value = 0
>outcomes = {'passed': 1}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 1 item
>
>test_reruns_if_flaky_mark_is_called_without_options.py [100%]. [100%]
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>
>===End Flaky Test Report===
>
>============================================================ 1 passed in 0.60s ============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>______________________________________ test_reruns_if_flaky_mark_is_called_with_positional_argument _______________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0')>
>
>    def test_reruns_if_flaky_mark_is_called_with_positional_argument(testdir):
>        testdir.makepyfile(
>            f"""
>            import pytest
>            @pytest.mark.flaky(2)
>            def test_pass():
>                {temporary_failure(2)}"""
>        )
>        result = testdir.runpytest("-r", "R")
>>       assert_outcomes(result, passed=1, rerun=2)
>
>result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=42 len(stderr.lines)=0 duration=3.55s>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:234:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:58: in assert_outcomes
>    check_outcome_field(outcomes, "passed", passed)
>        error      = 0
>        failed     = 0
>        outcomes   = {'failed': 1}
>        passed     = 1
>        rerun      = 2
>        result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=42 len(stderr.lines)=0 duration=3.55s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'failed': 1}, field_name = 'passed', expected_value = 1
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.passed has unexpected value. Expected '1' but got '0'
>E       assert 0 == 1
>E         +0
>E         -1
>
>expected_value = 1
>field_name = 'passed'
>field_value = 0
>outcomes = {'failed': 1}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 1 item
>
>test_reruns_if_flaky_mark_is_called_with_positional_argument.py [100%]F [100%]
>
>================================================================ FAILURES =================================================================
>________________________________________________________________ test_pass ________________________________________________________________
>
>    @pytest.mark.flaky(2)
>    def test_pass():
>
>        import py
>        path = py.path.local(__file__).dirpath().ensure('test.res')
>        count = path.read() or 1
>        if int(count) <= 2:
>            path.write(int(count) + 1)
>>           raise Exception('Failure: {0}'.format(count))
>E           Exception: Failure: 2
>
>test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10: Exception
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
>    <class 'Exception'>
>    Failure: 2
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>============================================================ 1 failed in 0.58s ============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>____________________________________________ test_no_extra_test_summary_for_reruns_by_default _____________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_no_extra_test_summary_for_reruns_by_default0')>
>
>    def test_no_extra_test_summary_for_reruns_by_default(testdir):
>        testdir.makepyfile(
>            f"""
>            def test_pass():
>                {temporary_failure()}"""
>        )
>        result = testdir.runpytest("--reruns", "1")
>        assert "RERUN" not in result.stdout.str()
>>       assert "1 rerun" in result.stdout.str()
>E       assert '1 rerun' in "=========================================================== test session starts =====================================...ytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'."
>E        +  where "=========================================================== test session starts =====================================...ytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'." = <bound method LineMatcher.str of <_pytest.pytester.LineMatcher object at 0x2003b906b80>>()
>E        +    where <bound method LineMatcher.str of <_pytest.pytester.LineMatcher object at 0x2003b906b80>> = <_pytest.pytester.LineMatcher object at 0x2003b906b80>.str
>E        +      where <_pytest.pytester.LineMatcher object at 0x2003b906b80> = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=43 len(stderr.lines)=0 duration=3.46s>.stdout
>
>result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=43 len(stderr.lines)=0 duration=3.46s>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_no_extra_test_summary_for_reruns_by_default0')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:245: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_no_extra_test_summary_for_reruns_by_default0
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 1 item
>
>test_no_extra_test_summary_for_reruns_by_default.py F [100%]
>
>================================================================ FAILURES =================================================================
>________________________________________________________________ test_pass ________________________________________________________________
>
>    def test_pass():
>
>        import py
>        path = py.path.local(__file__).dirpath().ensure('test.res')
>        count = path.read() or 1
>        if int(count) <= 1:
>            path.write(int(count) + 1)
>>           raise Exception('Failure: {0}'.format(count))
>E           Exception: Failure: 1
>
>test_no_extra_test_summary_for_reruns_by_default.py:8: Exception
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
>    <class 'Exception'>
>    Failure: 2
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>========================================================= short test summary info =========================================================
>FAILED test_no_extra_test_summary_for_reruns_by_default.py::test_pass - Exception: Failure: 1
>============================================================ 1 failed in 0.52s ============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>___________________________________________________ test_extra_test_summary_for_reruns ____________________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_extra_test_summary_for_reruns0')>
>
>    def test_extra_test_summary_for_reruns(testdir):
>        testdir.makepyfile(
>            f"""
>            def test_pass():
>                {temporary_failure()}"""
>        )
>        result = testdir.runpytest("--reruns", "1", "-r", "R")
>>       result.stdout.fnmatch_lines_random(["RERUN test_*:*"])
>E       Failed: line 'RERUN test_*:*' not found in output
>
>result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=41 len(stderr.lines)=0 duration=3.48s>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_extra_test_summary_for_reruns0')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:255: Failed
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_extra_test_summary_for_reruns0
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 1 item
>
>test_extra_test_summary_for_reruns.py F [100%]
>
>================================================================ FAILURES =================================================================
>________________________________________________________________ test_pass ________________________________________________________________
>
>    def test_pass():
>
>        import py
>        path = py.path.local(__file__).dirpath().ensure('test.res')
>        count = path.read() or 1
>        if int(count) <= 1:
>            path.write(int(count) + 1)
>>           raise Exception('Failure: {0}'.format(count))
>E           Exception: Failure: 1
>
>test_extra_test_summary_for_reruns.py:8: Exception
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
>    <class 'Exception'>
>    Failure: 2
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>============================================================ 1 failed in 0.53s ============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>______________________________________________________________ test_verbose _______________________________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_verbose0')>
>
>    def test_verbose(testdir):
>        testdir.makepyfile(
>            f"""
>            def test_pass():
>                {temporary_failure()}"""
>        )
>        result = testdir.runpytest("--reruns", "1", "-v")
>>       result.stdout.fnmatch_lines_random(["test_*:* RERUN*"])
>E       Failed: line 'test_*:* RERUN*' not found in output
>
>result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=45 len(stderr.lines)=0 duration=6.12s>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_verbose0')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:266: Failed
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3.9
>cachedir: .pytest_cache
>hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/.hypothesis/examples')
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_verbose0
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collecting ... collected 1 item
>
>test_verbose.py::test_pass FAILED [100%]
>
>================================================================ FAILURES =================================================================
>________________________________________________________________ test_pass ________________________________________________________________
>
>    def test_pass():
>
>        import py
>        path = py.path.local(__file__).dirpath().ensure('test.res')
>        count = path.read() or 1
>        if int(count) <= 1:
>            path.write(int(count) + 1)
>>           raise Exception('Failure: {0}'.format(count))
>E           Exception: Failure: 1
>
>test_verbose.py:8: Exception
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
	<class 'Exception'>
	Failure: 2
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
FAILED test_verbose.py::test_pass - Exception: Failure: 1
============================================================ 1 failed in 0.73s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
_______________________________________________ test_rerun_on_class_setup_error_with_reruns _______________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_class_setup_error_with_reruns0')>

    def test_rerun_on_class_setup_error_with_reruns(testdir):
        testdir.makepyfile(
            """
            class TestFoo(object):
                @classmethod
                def setup_class(cls):
                    assert False

            def test_pass():
                pass"""
        )
        result = testdir.runpytest("--reruns", "1")
>       assert_outcomes(result, passed=0, error=1, rerun=1)

result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=40 len(stderr.lines)=0 duration=4.70s>
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_class_setup_error_with_reruns0')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:297:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:65: in assert_outcomes
    check_outcome_field(outcomes, "rerun", rerun)
        error = 1
        failed = 0
        field = 'errors'
        outcomes = {'errors': 1}
        passed = 0
        rerun = 1
        result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=40 len(stderr.lines)=0 duration=4.70s>
        skipped = 0
        xfailed = 0
        xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'errors': 1}, field_name = 'rerun', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.rerun has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'rerun'
field_value = 0
outcomes = {'errors': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_class_setup_error_with_reruns0
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_rerun_on_class_setup_error_with_reruns.py E [100%]

================================================================= ERRORS ==================================================================
___________________________________________________ ERROR at setup of TestFoo.test_pass ___________________________________________________

cls = <class 'test_rerun_on_class_setup_error_with_reruns.TestFoo'>

    @classmethod
    def setup_class(cls):
>       assert False
E       assert False

test_rerun_on_class_setup_error_with_reruns.py:4: AssertionError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
	<class 'Exception'>
	Failure: 2
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
ERROR test_rerun_on_class_setup_error_with_reruns.py::TestFoo::test_pass - assert False
============================================================= 1 error in 0.61s =============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
_______________________________________________________ test_reruns_with_delay[-1] ________________________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay0')>
delay_time = -1

    @pytest.mark.parametrize("delay_time", [-1, 0, 0.0, 1, 2.5])
    def test_reruns_with_delay(testdir, delay_time):
        testdir.makepyfile(
            """
            def test_fail():
                assert False"""
        )

        time.sleep = mock.MagicMock()

        result = testdir.runpytest("--reruns", "3", "--reruns-delay", str(delay_time))

        if delay_time < 0:
>           result.stdout.fnmatch_lines(
                "*UserWarning: Delay time between re-runs cannot be < 0. "
                "Using default value: 0"
            )
E           Failed: nomatch: '*UserWarning: Delay time between re-runs cannot be < 0. Using default value: 0'
E               and: '=========================================================== test session starts ==========================================================='
E               and: 'platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1'
E               and: 'rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay0'
E               and: 'plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0'
E               and: 'collected 1 item'
E               and: ''
E               and: 'test_reruns_with_delay.py F [100%]'
E               and: ''
E               and: '================================================================ FAILURES ================================================================='
E               and: '________________________________________________________________ test_fail ________________________________________________________________'
E               and: ''
E               and: ' def test_fail():'
E               and: '> assert False'
E               and: 'E assert False'
E               and: ''
E               and: 'test_reruns_with_delay.py:2: AssertionError'
E               and: '===Flaky Test Report==='
E               and: ''
E               and: 'test_pass failed (1 runs remaining out of 2).'
E               and: "\t<class 'Exception'>"
E               and: '\tFailure: 1'
E               and: '\t[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]'
E               and: 'test_pass passed 1 out of the required 1 times. Success!'
E               and: 'test_pass failed (1 runs remaining out of 2).'
E               and: "\t<class 'Exception'>"
E               and: '\tFailure: 1'
E               and: '\t[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]'
E               and: 'test_pass failed; it passed 0 out of the required 1 times.'
E               and: "\t<class 'Exception'>"
E               and: '\tFailure: 2'
E               and: '\t[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]'
E               and: ''
E               and: '===End Flaky Test Report==='
E               and: '========================================================= short test summary info ========================================================='
E               and: 'FAILED test_reruns_with_delay.py::test_fail - assert False'
E               and: '============================================================ 1 failed in 0.42s ============================================================'
E               and: "pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'."
E           remains unmatched: '*UserWarning: Delay time between re-runs cannot be < 0. Using default value: 0'

delay_time = -1
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=37 len(stderr.lines)=0 duration=3.29s>
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay0')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:326: Failed
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay0
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_reruns_with_delay.py F [100%]

================================================================ FAILURES =================================================================
________________________________________________________________ test_fail ________________________________________________________________

    def test_fail():
>       assert False
E       assert False

test_reruns_with_delay.py:2: AssertionError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
	<class 'Exception'>
	Failure: 2
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
FAILED test_reruns_with_delay.py::test_fail - assert False
============================================================ 1 failed in 0.42s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
________________________________________________________ test_reruns_with_delay[0] ________________________________________________________

__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790>, args = (<MagicMock id='2200026232336'>, 0)
kwargs = {}, __tracebackhide__ = True, msg = 'expected call not found.\nExpected: mock(0)\nActual: not called.'
__mock_self = <MagicMock id='2200026232336'>

    def assert_wrapper(
        __wrapped_mock_method__: Callable[..., Any], *args: Any, **kwargs: Any
    ) -> None:
        __tracebackhide__ = True
        try:
>           __wrapped_mock_method__(*args, **kwargs)

__mock_self = <MagicMock id='2200026232336'>
__tracebackhide__ = True
__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790>
args = (<MagicMock id='2200026232336'>, 0)
kwargs = {}
msg = 'expected call not found.\nExpected: mock(0)\nActual: not called.'

/usr/lib/python3.9/site-packages/pytest_mock/plugin.py:414:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <MagicMock id='2200026232336'>, args = (0,), kwargs = {}, expected = 'mock(0)', actual = 'not called.'
error_message = 'expected call not found.\nExpected: mock(0)\nActual: not called.'

    def assert_called_with(self, /, *args, **kwargs):
        """assert that the last call was made with the specified arguments.

        Raises an AssertionError if the args and keyword args passed in are
        different to the last call to the mock."""
        if self.call_args is None:
            expected = self._format_mock_call_signature(args, kwargs)
            actual = 'not called.'
            error_message = ('expected call not found.\nExpected: %s\nActual: %s'
                             % (expected, actual))
>           raise AssertionError(error_message)
E           AssertionError: expected call not found.
E           Expected: mock(0)
E           Actual: not called.

actual = 'not called.'
args = (0,)
error_message = 'expected call not found.\nExpected: mock(0)\nActual: not called.'
expected = 'mock(0)'
kwargs = {}
self = <MagicMock id='2200026232336'>

/usr/lib/python3.9/unittest/mock.py:898: AssertionError

During handling of the above exception, another exception occurred:

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay1')>
delay_time = 0

    @pytest.mark.parametrize("delay_time", [-1, 0, 0.0, 1, 2.5])
    def test_reruns_with_delay(testdir, delay_time):
        testdir.makepyfile(
            """
            def test_fail():
                assert False"""
        )

        time.sleep = mock.MagicMock()

        result = testdir.runpytest("--reruns", "3", "--reruns-delay", str(delay_time))

        if delay_time < 0:
            result.stdout.fnmatch_lines(
                "*UserWarning: Delay time between re-runs cannot be < 0. "
                "Using default value: 0"
            )
            delay_time = 0

>       time.sleep.assert_called_with(delay_time)
E       AssertionError: expected call not found.
E       Expected: mock(0)
E       Actual: not called.

delay_time = 0
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=37 len(stderr.lines)=0 duration=3.17s>
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay1')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:332: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay1
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_reruns_with_delay.py F [100%]

================================================================ FAILURES =================================================================
________________________________________________________________ test_fail ________________________________________________________________

    def test_fail():
>       assert False
E       assert False

test_reruns_with_delay.py:2: AssertionError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
	<class 'Exception'>
	Failure: 2
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
FAILED test_reruns_with_delay.py::test_fail - assert False
============================================================ 1 failed in 0.53s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
_______________________________________________________ test_reruns_with_delay[0.0] _______________________________________________________

__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790>, args = (<MagicMock id='2200025713680'>, 0.0)
kwargs = {}, __tracebackhide__ = True, msg = 'expected call not found.\nExpected: mock(0.0)\nActual: not called.'
__mock_self = <MagicMock id='2200025713680'>

    def assert_wrapper(
        __wrapped_mock_method__: Callable[..., Any], *args: Any, **kwargs: Any
    ) -> None:
        __tracebackhide__ = True
        try:
>           __wrapped_mock_method__(*args, **kwargs)

__mock_self = <MagicMock id='2200025713680'>
__tracebackhide__ = True
__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790>
args = (<MagicMock id='2200025713680'>, 0.0)
kwargs = {}
msg = 'expected call not found.\nExpected: mock(0.0)\nActual: not called.'

/usr/lib/python3.9/site-packages/pytest_mock/plugin.py:414:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <MagicMock id='2200025713680'>, args = (0.0,), kwargs = {}, expected = 'mock(0.0)', actual = 'not called.'
error_message = 'expected call not found.\nExpected: mock(0.0)\nActual: not called.'

    def assert_called_with(self, /, *args, **kwargs):
        """assert that the last call was made with the specified arguments.

        Raises an AssertionError if the args and keyword args passed in are
        different to the last call to the mock."""
        if self.call_args is None:
            expected = self._format_mock_call_signature(args, kwargs)
            actual = 'not called.'
            error_message = ('expected call not found.\nExpected: %s\nActual: %s'
                             % (expected, actual))
>           raise AssertionError(error_message)
E           AssertionError: expected call not found.
E           Expected: mock(0.0)
E           Actual: not called.

actual = 'not called.'
args = (0.0,)
error_message = 'expected call not found.\nExpected: mock(0.0)\nActual: not called.'
expected = 'mock(0.0)'
kwargs = {}
self = <MagicMock id='2200025713680'>

/usr/lib/python3.9/unittest/mock.py:898: AssertionError

During handling of the above exception, another exception occurred:

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay2')>
delay_time = 0.0

    @pytest.mark.parametrize("delay_time", [-1, 0, 0.0, 1, 2.5])
    def test_reruns_with_delay(testdir, delay_time):
        testdir.makepyfile(
            """
            def test_fail():
                assert False"""
        )

        time.sleep = mock.MagicMock()

        result = testdir.runpytest("--reruns", "3", "--reruns-delay", str(delay_time))

        if delay_time < 0:
            result.stdout.fnmatch_lines(
                "*UserWarning: Delay time between re-runs cannot be < 0. "
                "Using default value: 0"
            )
            delay_time = 0

>       time.sleep.assert_called_with(delay_time)
E       AssertionError: expected call not found.
E       Expected: mock(0.0)
E       Actual: not called.

delay_time = 0.0
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=37 len(stderr.lines)=0 duration=3.74s>
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay2')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:332: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay2
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_reruns_with_delay.py F [100%]

================================================================ FAILURES =================================================================
________________________________________________________________ test_fail ________________________________________________________________

    def test_fail():
>       assert False
E       assert False

test_reruns_with_delay.py:2: AssertionError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
	<class 'Exception'>
	Failure: 2
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
FAILED test_reruns_with_delay.py::test_fail - assert False
============================================================ 1 failed in 0.59s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
________________________________________________________ test_reruns_with_delay[1] ________________________________________________________

__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790>, args = (<MagicMock id='2200359155408'>, 1)
kwargs = {}, __tracebackhide__ = True, msg = 'expected call not found.\nExpected: mock(1)\nActual: not called.'
__mock_self = <MagicMock id='2200359155408'>

    def assert_wrapper(
        __wrapped_mock_method__: Callable[..., Any], *args: Any, **kwargs: Any
    ) -> None:
        __tracebackhide__ = True
        try:
>           __wrapped_mock_method__(*args, **kwargs)

__mock_self = <MagicMock id='2200359155408'>
__tracebackhide__ = True
__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790>
args = (<MagicMock id='2200359155408'>, 1)
kwargs = {}
msg = 'expected call not found.\nExpected: mock(1)\nActual: not called.'

/usr/lib/python3.9/site-packages/pytest_mock/plugin.py:414:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <MagicMock id='2200359155408'>, args = (1,), kwargs = {}, expected = 'mock(1)', actual = 'not called.'
error_message = 'expected call not found.\nExpected: mock(1)\nActual: not called.'

    def assert_called_with(self, /, *args, **kwargs):
        """assert that the last call was made with the specified arguments.

        Raises an AssertionError if the args and keyword args passed in are
        different to the last call to the mock."""
        if self.call_args is None:
            expected = self._format_mock_call_signature(args, kwargs)
            actual = 'not called.'
            error_message = ('expected call not found.\nExpected: %s\nActual: %s'
                             % (expected, actual))
>           raise AssertionError(error_message)
E           AssertionError: expected call not found.
E           Expected: mock(1)
E           Actual: not called.

actual = 'not called.'
args = (1,)
error_message = 'expected call not found.\nExpected: mock(1)\nActual: not called.'
expected = 'mock(1)'
kwargs = {}
self = <MagicMock id='2200359155408'>

/usr/lib/python3.9/unittest/mock.py:898: AssertionError

During handling of the above exception, another exception occurred:

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay3')>
delay_time = 1

    @pytest.mark.parametrize("delay_time", [-1, 0, 0.0, 1, 2.5])
    def test_reruns_with_delay(testdir, delay_time):
        testdir.makepyfile(
            """
            def test_fail():
                assert False"""
        )

        time.sleep = mock.MagicMock()

        result = testdir.runpytest("--reruns", "3", "--reruns-delay", str(delay_time))

        if delay_time < 0:
            result.stdout.fnmatch_lines(
                "*UserWarning: Delay time between re-runs cannot be < 0.
[39;49;00m[33m"[39;49;00m > [33m"[39;49;00m[33mUsing default value: 0[39;49;00m[33m"[39;49;00m > ) > delay_time = [94m0[39;49;00m > >> time.sleep.assert_called_with(delay_time) >[1m[31mE AssertionError: expected call not found.[0m >[1m[31mE Expected: mock(1)[0m >[1m[31mE Actual: not called.[0m > >delay_time = 1 >result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=37 len(stderr.lines)=0 duration=3.77s> >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay3')> > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:332: AssertionError >---------------------------------------------------------- Captured stdout call ----------------------------------------------------------- >=========================================================== test session starts =========================================================== >platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 >rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay3 >plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0 >collected 1 item > >test_reruns_with_delay.py F [100%] > >================================================================ FAILURES ================================================================= >________________________________________________________________ test_fail ________________________________________________________________ > > def test_fail(): >> assert False >E assert False > >test_reruns_with_delay.py:2: AssertionError >===Flaky Test Report=== > >test_pass failed (1 runs 
remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>] >test_pass passed 1 out of the required 1 times. Success! >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] >test_pass failed; it passed 0 out of the required 1 times. > <class 'Exception'> > Failure: 2 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] > >===End Flaky Test Report=== >========================================================= short test summary info ========================================================= >FAILED test_reruns_with_delay.py::test_fail - assert False >============================================================ 1 failed in 0.48s ============================================================ >pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'. >[31m[1m_______________________________________________________ test_reruns_with_delay[2.5] _______________________________________________________[0m > >__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790>, args = (<MagicMock id='2200358985632'>, 2.5) >kwargs = {}, __tracebackhide__ = True, msg = 'expected call not found.\nExpected: mock(2.5)\nActual: not called.' 
>__mock_self = <MagicMock id='2200358985632'> > > [94mdef[39;49;00m [92massert_wrapper[39;49;00m( > __wrapped_mock_method__: Callable[..., Any], *args: Any, **kwargs: Any > ) -> [94mNone[39;49;00m: > __tracebackhide__ = [94mTrue[39;49;00m > [94mtry[39;49;00m: >> __wrapped_mock_method__(*args, **kwargs) > >__mock_self = <MagicMock id='2200358985632'> >__tracebackhide__ = True >__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790> >args = (<MagicMock id='2200358985632'>, 2.5) >kwargs = {} >msg = 'expected call not found.\nExpected: mock(2.5)\nActual: not called.' > >[1m[31m/usr/lib/python3.9/site-packages/pytest_mock/plugin.py[0m:414: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <MagicMock id='2200358985632'>, args = (2.5,), kwargs = {}, expected = 'mock(2.5)', actual = 'not called.' >error_message = 'expected call not found.\nExpected: mock(2.5)\nActual: not called.' 
> > [94mdef[39;49;00m [92massert_called_with[39;49;00m([96mself[39;49;00m, /, *args, **kwargs): > [33m"""assert that the last call was made with the specified arguments.[39;49;00m > [33m[39;49;00m > [33m Raises an AssertionError if the args and keyword args passed in are[39;49;00m > [33m different to the last call to the mock."""[39;49;00m > [94mif[39;49;00m [96mself[39;49;00m.call_args [95mis[39;49;00m [94mNone[39;49;00m: > expected = [96mself[39;49;00m._format_mock_call_signature(args, kwargs) > actual = [33m'[39;49;00m[33mnot called.[39;49;00m[33m'[39;49;00m > error_message = ([33m'[39;49;00m[33mexpected call not found.[39;49;00m[33m\n[39;49;00m[33mExpected: [39;49;00m[33m%s[39;49;00m[33m\n[39;49;00m[33mActual: [39;49;00m[33m%s[39;49;00m[33m'[39;49;00m > % (expected, actual)) >> [94mraise[39;49;00m [96mAssertionError[39;49;00m(error_message) >[1m[31mE AssertionError: expected call not found.[0m >[1m[31mE Expected: mock(2.5)[0m >[1m[31mE Actual: not called.[0m > >actual = 'not called.' >args = (2.5,) >error_message = 'expected call not found.\nExpected: mock(2.5)\nActual: not called.' 
>expected = 'mock(2.5)' >kwargs = {} >self = <MagicMock id='2200358985632'> > >[1m[31m/usr/lib/python3.9/unittest/mock.py[0m:898: AssertionError > >[33mDuring handling of the above exception, another exception occurred:[0m > >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay4')> >delay_time = 2.5 > > [37m@pytest[39;49;00m.mark.parametrize([33m"[39;49;00m[33mdelay_time[39;49;00m[33m"[39;49;00m, [-[94m1[39;49;00m, [94m0[39;49;00m, [94m0.0[39;49;00m, [94m1[39;49;00m, [94m2.5[39;49;00m]) > [94mdef[39;49;00m [92mtest_reruns_with_delay[39;49;00m(testdir, delay_time): > testdir.makepyfile( > [33m"""[39;49;00m > [33m def test_fail():[39;49;00m > [33m assert False"""[39;49;00m > ) > > time.sleep = mock.MagicMock() > > result = testdir.runpytest([33m"[39;49;00m[33m--reruns[39;49;00m[33m"[39;49;00m, [33m"[39;49;00m[33m3[39;49;00m[33m"[39;49;00m, [33m"[39;49;00m[33m--reruns-delay[39;49;00m[33m"[39;49;00m, [96mstr[39;49;00m(delay_time)) > > [94mif[39;49;00m delay_time < [94m0[39;49;00m: > result.stdout.fnmatch_lines( > [33m"[39;49;00m[33m*UserWarning: Delay time between re-runs cannot be < 0. 
[39;49;00m[33m"[39;49;00m > [33m"[39;49;00m[33mUsing default value: 0[39;49;00m[33m"[39;49;00m > ) > delay_time = [94m0[39;49;00m > >> time.sleep.assert_called_with(delay_time) >[1m[31mE AssertionError: expected call not found.[0m >[1m[31mE Expected: mock(2.5)[0m >[1m[31mE Actual: not called.[0m > >delay_time = 2.5 >result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=37 len(stderr.lines)=0 duration=6.83s> >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay4')> > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:332: AssertionError >---------------------------------------------------------- Captured stdout call ----------------------------------------------------------- >=========================================================== test session starts =========================================================== >platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 >rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay4 >plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0 >collected 1 item > >test_reruns_with_delay.py F [100%] > >================================================================ FAILURES ================================================================= >________________________________________________________________ test_fail ________________________________________________________________ > > def test_fail(): >> assert False >E assert False > >test_reruns_with_delay.py:2: AssertionError >===Flaky Test Report=== > >test_pass failed (1 runs 
remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>] >test_pass passed 1 out of the required 1 times. Success! >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] >test_pass failed; it passed 0 out of the required 1 times. > <class 'Exception'> > Failure: 2 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] > >===End Flaky Test Report=== >========================================================= short test summary info ========================================================= >FAILED test_reruns_with_delay.py::test_fail - assert False >============================================================ 1 failed in 0.58s ============================================================ >pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'. 
>[31m[1m____________________________________________________ test_reruns_with_delay_marker[-1] ____________________________________________________[0m > >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker0')> >delay_time = -1 > > [37m@pytest[39;49;00m.mark.parametrize([33m"[39;49;00m[33mdelay_time[39;49;00m[33m"[39;49;00m, [-[94m1[39;49;00m, [94m0[39;49;00m, [94m0.0[39;49;00m, [94m1[39;49;00m, [94m2.5[39;49;00m]) > [94mdef[39;49;00m [92mtest_reruns_with_delay_marker[39;49;00m(testdir, delay_time): > testdir.makepyfile( > [33mf[39;49;00m[33m"""[39;49;00m[33m[39;49;00m > [33m import pytest[39;49;00m[33m[39;49;00m > [33m[39;49;00m > [33m @pytest.mark.flaky(reruns=2, reruns_delay=[39;49;00m[33m{[39;49;00mdelay_time[33m}[39;49;00m[33m)[39;49;00m[33m[39;49;00m > [33m def test_fail_two():[39;49;00m[33m[39;49;00m > [33m assert False[39;49;00m[33m"""[39;49;00m > ) > > time.sleep = mock.MagicMock() > > result = testdir.runpytest() > > [94mif[39;49;00m delay_time < [94m0[39;49;00m: >> result.stdout.fnmatch_lines( > [33m"[39;49;00m[33m*UserWarning: Delay time between re-runs cannot be < 0. [39;49;00m[33m"[39;49;00m > [33m"[39;49;00m[33mUsing default value: 0[39;49;00m[33m"[39;49;00m > ) >[1m[31mE Failed: nomatch: '*UserWarning: Delay time between re-runs cannot be < 0. 
Using default value: 0'[0m >[1m[31mE and: '=========================================================== test session starts ==========================================================='[0m >[1m[31mE and: 'platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1'[0m >[1m[31mE and: 'rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker0'[0m >[1m[31mE and: 'plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0'[0m >[1m[31mE and: 'collected 1 item'[0m >[1m[31mE and: ''[0m >[1m[31mE and: 'test_reruns_with_delay_marker.py E [100%]'[0m >[1m[31mE and: ''[0m >[1m[31mE and: '================================================================= ERRORS =================================================================='[0m >[1m[31mE and: '_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________'[0m >[1m[31mE and: ''[0m >[1m[31mE and: 'self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two>'[0m >[1m[31mE and: ''[0m >[1m[31mE and: ' def pytest_runtest_setup(self, item):'[0m >[1m[31mE and: ' """'[0m >[1m[31mE and: " Pytest hook to modify the test before it's run."[0m >[1m[31mE and: ' '[0m >[1m[31mE and: ' :param item:'[0m >[1m[31mE and: ' The test item.'[0m >[1m[31mE and: ' """'[0m >[1m[31mE and: ' if not self._has_flaky_attributes(item):'[0m >[1m[31mE and: " if hasattr(item, 'iter_markers'):"[0m >[1m[31mE and: " for marker in item.iter_markers(name='flaky'):"[0m >[1m[31mE and: '> self._make_test_flaky(item, *marker.args, **marker.kwargs)'[0m >[1m[31mE and: "E TypeError: _make_test_flaky() got an unexpected 
keyword argument 'reruns'"[0m >[1m[31mE and: ''[0m >[1m[31mE and: '/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError'[0m >[1m[31mE and: '===Flaky Test Report==='[0m >[1m[31mE and: ''[0m >[1m[31mE and: 'test_pass failed (1 runs remaining out of 2).'[0m >[1m[31mE and: "\t<class 'Exception'>"[0m >[1m[31mE and: '\tFailure: 1'[0m >[1m[31mE and: '\t[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]'[0m >[1m[31mE and: 'test_pass passed 1 out of the required 1 times. Success!'[0m >[1m[31mE and: 'test_pass failed (1 runs remaining out of 2).'[0m >[1m[31mE and: "\t<class 'Exception'>"[0m >[1m[31mE and: '\tFailure: 1'[0m >[1m[31mE and: '\t[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]'[0m >[1m[31mE and: 'test_pass failed; it passed 0 out of the required 1 times.'[0m >[1m[31mE and: "\t<class 'Exception'>"[0m >[1m[31mE and: '\tFailure: 2'[0m >[1m[31mE and: '\t[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]'[0m >[1m[31mE and: ''[0m >[1m[31mE and: '===End Flaky Test Report==='[0m >[1m[31mE and: '========================================================= short test summary info ========================================================='[0m >[1m[31mE and: "ERROR test_reruns_with_delay_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'"[0m >[1m[31mE and: '============================================================ 1 error in 0.93s 
============================================================='[0m >[1m[31mE and: "pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'."[0m >[1m[31mE remains unmatched: '*UserWarning: Delay time between re-runs cannot be < 0. Using default value: 0'[0m > >delay_time = -1 >result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=4.88s> >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker0')> > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:353: Failed >---------------------------------------------------------- Captured stdout call ----------------------------------------------------------- >=========================================================== test session starts =========================================================== >platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 >rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker0 >plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0 >collected 1 item > >test_reruns_with_delay_marker.py E [100%] > >================================================================= ERRORS ================================================================== >_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________ > >self = 
<flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two> > > def pytest_runtest_setup(self, item): > """ > Pytest hook to modify the test before it's run. > > :param item: > The test item. > """ > if not self._has_flaky_attributes(item): > if hasattr(item, 'iter_markers'): > for marker in item.iter_markers(name='flaky'): >> self._make_test_flaky(item, *marker.args, **marker.kwargs) >E TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' > >/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError >===Flaky Test Report=== > >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>] >test_pass passed 1 out of the required 1 times. Success! >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] >test_pass failed; it passed 0 out of the required 1 times. 
> <class 'Exception'> > Failure: 2 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] > >===End Flaky Test Report=== >========================================================= short test summary info ========================================================= >ERROR test_reruns_with_delay_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' >============================================================ 1 error in 0.93s ============================================================= >pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'. >[31m[1m____________________________________________________ test_reruns_with_delay_marker[0] _____________________________________________________[0m > >__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790>, args = (<MagicMock id='2200029442208'>, 0) >kwargs = {}, __tracebackhide__ = True, msg = 'expected call not found.\nExpected: mock(0)\nActual: not called.' >__mock_self = <MagicMock id='2200029442208'> > > [94mdef[39;49;00m [92massert_wrapper[39;49;00m( > __wrapped_mock_method__: Callable[..., Any], *args: Any, **kwargs: Any > ) -> [94mNone[39;49;00m: > __tracebackhide__ = [94mTrue[39;49;00m > [94mtry[39;49;00m: >> __wrapped_mock_method__(*args, **kwargs) > >__mock_self = <MagicMock id='2200029442208'> >__tracebackhide__ = True >__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790> >args = (<MagicMock id='2200029442208'>, 0) >kwargs = {} >msg = 'expected call not found.\nExpected: mock(0)\nActual: not called.' 
> >[1m[31m/usr/lib/python3.9/site-packages/pytest_mock/plugin.py[0m:414: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <MagicMock id='2200029442208'>, args = (0,), kwargs = {}, expected = 'mock(0)', actual = 'not called.' >error_message = 'expected call not found.\nExpected: mock(0)\nActual: not called.' > > [94mdef[39;49;00m [92massert_called_with[39;49;00m([96mself[39;49;00m, /, *args, **kwargs): > [33m"""assert that the last call was made with the specified arguments.[39;49;00m > [33m[39;49;00m > [33m Raises an AssertionError if the args and keyword args passed in are[39;49;00m > [33m different to the last call to the mock."""[39;49;00m > [94mif[39;49;00m [96mself[39;49;00m.call_args [95mis[39;49;00m [94mNone[39;49;00m: > expected = [96mself[39;49;00m._format_mock_call_signature(args, kwargs) > actual = [33m'[39;49;00m[33mnot called.[39;49;00m[33m'[39;49;00m > error_message = ([33m'[39;49;00m[33mexpected call not found.[39;49;00m[33m\n[39;49;00m[33mExpected: [39;49;00m[33m%s[39;49;00m[33m\n[39;49;00m[33mActual: [39;49;00m[33m%s[39;49;00m[33m'[39;49;00m > % (expected, actual)) >> [94mraise[39;49;00m [96mAssertionError[39;49;00m(error_message) >[1m[31mE AssertionError: expected call not found.[0m >[1m[31mE Expected: mock(0)[0m >[1m[31mE Actual: not called.[0m > >actual = 'not called.' >args = (0,) >error_message = 'expected call not found.\nExpected: mock(0)\nActual: not called.' 
>expected = 'mock(0)' >kwargs = {} >self = <MagicMock id='2200029442208'> > >[1m[31m/usr/lib/python3.9/unittest/mock.py[0m:898: AssertionError > >[33mDuring handling of the above exception, another exception occurred:[0m > >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker1')> >delay_time = 0 > > [37m@pytest[39;49;00m.mark.parametrize([33m"[39;49;00m[33mdelay_time[39;49;00m[33m"[39;49;00m, [-[94m1[39;49;00m, [94m0[39;49;00m, [94m0.0[39;49;00m, [94m1[39;49;00m, [94m2.5[39;49;00m]) > [94mdef[39;49;00m [92mtest_reruns_with_delay_marker[39;49;00m(testdir, delay_time): > testdir.makepyfile( > [33mf[39;49;00m[33m"""[39;49;00m[33m[39;49;00m > [33m import pytest[39;49;00m[33m[39;49;00m > [33m[39;49;00m > [33m @pytest.mark.flaky(reruns=2, reruns_delay=[39;49;00m[33m{[39;49;00mdelay_time[33m}[39;49;00m[33m)[39;49;00m[33m[39;49;00m > [33m def test_fail_two():[39;49;00m[33m[39;49;00m > [33m assert False[39;49;00m[33m"""[39;49;00m > ) > > time.sleep = mock.MagicMock() > > result = testdir.runpytest() > > [94mif[39;49;00m delay_time < [94m0[39;49;00m: > result.stdout.fnmatch_lines( > [33m"[39;49;00m[33m*UserWarning: Delay time between re-runs cannot be < 0. 
[39;49;00m[33m"[39;49;00m > [33m"[39;49;00m[33mUsing default value: 0[39;49;00m[33m"[39;49;00m > ) > delay_time = [94m0[39;49;00m > >> time.sleep.assert_called_with(delay_time) >[1m[31mE AssertionError: expected call not found.[0m >[1m[31mE Expected: mock(0)[0m >[1m[31mE Actual: not called.[0m > >delay_time = 0 >result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=4.69s> >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker1')> > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:359: AssertionError >---------------------------------------------------------- Captured stdout call ----------------------------------------------------------- >=========================================================== test session starts =========================================================== >platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 >rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker1 >plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0 >collected 1 item > >test_reruns_with_delay_marker.py E [100%] > >================================================================= ERRORS ================================================================== >_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________ > >self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two> > > def 
pytest_runtest_setup(self, item): > """ > Pytest hook to modify the test before it's run. > > :param item: > The test item. > """ > if not self._has_flaky_attributes(item): > if hasattr(item, 'iter_markers'): > for marker in item.iter_markers(name='flaky'): >> self._make_test_flaky(item, *marker.args, **marker.kwargs) >E TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' > >/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError >===Flaky Test Report=== > >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>] >test_pass passed 1 out of the required 1 times. Success! >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] >test_pass failed; it passed 0 out of the required 1 times. 
> <class 'Exception'> > Failure: 2 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] > >===End Flaky Test Report=== >========================================================= short test summary info ========================================================= >ERROR test_reruns_with_delay_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' >============================================================ 1 error in 0.53s ============================================================= >pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'. >[31m[1m___________________________________________________ test_reruns_with_delay_marker[0.0] ____________________________________________________[0m > >__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790>, args = (<MagicMock id='2200027189792'>, 0.0) >kwargs = {}, __tracebackhide__ = True, msg = 'expected call not found.\nExpected: mock(0.0)\nActual: not called.' >__mock_self = <MagicMock id='2200027189792'> > > [94mdef[39;49;00m [92massert_wrapper[39;49;00m( > __wrapped_mock_method__: Callable[..., Any], *args: Any, **kwargs: Any > ) -> [94mNone[39;49;00m: > __tracebackhide__ = [94mTrue[39;49;00m > [94mtry[39;49;00m: >> __wrapped_mock_method__(*args, **kwargs) > >__mock_self = <MagicMock id='2200027189792'> >__tracebackhide__ = True >__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790> >args = (<MagicMock id='2200027189792'>, 0.0) >kwargs = {} >msg = 'expected call not found.\nExpected: mock(0.0)\nActual: not called.' 
> >[1m[31m/usr/lib/python3.9/site-packages/pytest_mock/plugin.py[0m:414: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <MagicMock id='2200027189792'>, args = (0.0,), kwargs = {}, expected = 'mock(0.0)', actual = 'not called.' >error_message = 'expected call not found.\nExpected: mock(0.0)\nActual: not called.' > > [94mdef[39;49;00m [92massert_called_with[39;49;00m([96mself[39;49;00m, /, *args, **kwargs): > [33m"""assert that the last call was made with the specified arguments.[39;49;00m > [33m[39;49;00m > [33m Raises an AssertionError if the args and keyword args passed in are[39;49;00m > [33m different to the last call to the mock."""[39;49;00m > [94mif[39;49;00m [96mself[39;49;00m.call_args [95mis[39;49;00m [94mNone[39;49;00m: > expected = [96mself[39;49;00m._format_mock_call_signature(args, kwargs) > actual = [33m'[39;49;00m[33mnot called.[39;49;00m[33m'[39;49;00m > error_message = ([33m'[39;49;00m[33mexpected call not found.[39;49;00m[33m\n[39;49;00m[33mExpected: [39;49;00m[33m%s[39;49;00m[33m\n[39;49;00m[33mActual: [39;49;00m[33m%s[39;49;00m[33m'[39;49;00m > % (expected, actual)) >> [94mraise[39;49;00m [96mAssertionError[39;49;00m(error_message) >[1m[31mE AssertionError: expected call not found.[0m >[1m[31mE Expected: mock(0.0)[0m >[1m[31mE Actual: not called.[0m > >actual = 'not called.' >args = (0.0,) >error_message = 'expected call not found.\nExpected: mock(0.0)\nActual: not called.' 
>expected = 'mock(0.0)' >kwargs = {} >self = <MagicMock id='2200027189792'> > >[1m[31m/usr/lib/python3.9/unittest/mock.py[0m:898: AssertionError > >[33mDuring handling of the above exception, another exception occurred:[0m > >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker2')> >delay_time = 0.0 > > [37m@pytest[39;49;00m.mark.parametrize([33m"[39;49;00m[33mdelay_time[39;49;00m[33m"[39;49;00m, [-[94m1[39;49;00m, [94m0[39;49;00m, [94m0.0[39;49;00m, [94m1[39;49;00m, [94m2.5[39;49;00m]) > [94mdef[39;49;00m [92mtest_reruns_with_delay_marker[39;49;00m(testdir, delay_time): > testdir.makepyfile( > [33mf[39;49;00m[33m"""[39;49;00m[33m[39;49;00m > [33m import pytest[39;49;00m[33m[39;49;00m > [33m[39;49;00m > [33m @pytest.mark.flaky(reruns=2, reruns_delay=[39;49;00m[33m{[39;49;00mdelay_time[33m}[39;49;00m[33m)[39;49;00m[33m[39;49;00m > [33m def test_fail_two():[39;49;00m[33m[39;49;00m > [33m assert False[39;49;00m[33m"""[39;49;00m > ) > > time.sleep = mock.MagicMock() > > result = testdir.runpytest() > > [94mif[39;49;00m delay_time < [94m0[39;49;00m: > result.stdout.fnmatch_lines( > [33m"[39;49;00m[33m*UserWarning: Delay time between re-runs cannot be < 0. 
[39;49;00m[33m"[39;49;00m > [33m"[39;49;00m[33mUsing default value: 0[39;49;00m[33m"[39;49;00m > ) > delay_time = [94m0[39;49;00m > >> time.sleep.assert_called_with(delay_time) >[1m[31mE AssertionError: expected call not found.[0m >[1m[31mE Expected: mock(0.0)[0m >[1m[31mE Actual: not called.[0m > >delay_time = 0.0 >result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=2.85s> >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker2')> > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:359: AssertionError >---------------------------------------------------------- Captured stdout call ----------------------------------------------------------- >=========================================================== test session starts =========================================================== >platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 >rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker2 >plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0 >collected 1 item > >test_reruns_with_delay_marker.py E [100%] > >================================================================= ERRORS ================================================================== >_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________ > >self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two> > > def 
pytest_runtest_setup(self, item): > """ > Pytest hook to modify the test before it's run. > > :param item: > The test item. > """ > if not self._has_flaky_attributes(item): > if hasattr(item, 'iter_markers'): > for marker in item.iter_markers(name='flaky'): >> self._make_test_flaky(item, *marker.args, **marker.kwargs) >E TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' > >/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError >===Flaky Test Report=== > >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>] >test_pass passed 1 out of the required 1 times. Success! >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] >test_pass failed; it passed 0 out of the required 1 times. 
> <class 'Exception'> > Failure: 2 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] > >===End Flaky Test Report=== >========================================================= short test summary info ========================================================= >ERROR test_reruns_with_delay_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' >============================================================ 1 error in 0.59s ============================================================= >pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'. >[31m[1m____________________________________________________ test_reruns_with_delay_marker[1] _____________________________________________________[0m > >__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790>, args = (<MagicMock id='2200027902160'>, 1) >kwargs = {}, __tracebackhide__ = True, msg = 'expected call not found.\nExpected: mock(1)\nActual: not called.' >__mock_self = <MagicMock id='2200027902160'> > > [94mdef[39;49;00m [92massert_wrapper[39;49;00m( > __wrapped_mock_method__: Callable[..., Any], *args: Any, **kwargs: Any > ) -> [94mNone[39;49;00m: > __tracebackhide__ = [94mTrue[39;49;00m > [94mtry[39;49;00m: >> __wrapped_mock_method__(*args, **kwargs) > >__mock_self = <MagicMock id='2200027902160'> >__tracebackhide__ = True >__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790> >args = (<MagicMock id='2200027902160'>, 1) >kwargs = {} >msg = 'expected call not found.\nExpected: mock(1)\nActual: not called.' 
> >[1m[31m/usr/lib/python3.9/site-packages/pytest_mock/plugin.py[0m:414: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <MagicMock id='2200027902160'>, args = (1,), kwargs = {}, expected = 'mock(1)', actual = 'not called.' >error_message = 'expected call not found.\nExpected: mock(1)\nActual: not called.' > > [94mdef[39;49;00m [92massert_called_with[39;49;00m([96mself[39;49;00m, /, *args, **kwargs): > [33m"""assert that the last call was made with the specified arguments.[39;49;00m > [33m[39;49;00m > [33m Raises an AssertionError if the args and keyword args passed in are[39;49;00m > [33m different to the last call to the mock."""[39;49;00m > [94mif[39;49;00m [96mself[39;49;00m.call_args [95mis[39;49;00m [94mNone[39;49;00m: > expected = [96mself[39;49;00m._format_mock_call_signature(args, kwargs) > actual = [33m'[39;49;00m[33mnot called.[39;49;00m[33m'[39;49;00m > error_message = ([33m'[39;49;00m[33mexpected call not found.[39;49;00m[33m\n[39;49;00m[33mExpected: [39;49;00m[33m%s[39;49;00m[33m\n[39;49;00m[33mActual: [39;49;00m[33m%s[39;49;00m[33m'[39;49;00m > % (expected, actual)) >> [94mraise[39;49;00m [96mAssertionError[39;49;00m(error_message) >[1m[31mE AssertionError: expected call not found.[0m >[1m[31mE Expected: mock(1)[0m >[1m[31mE Actual: not called.[0m > >actual = 'not called.' >args = (1,) >error_message = 'expected call not found.\nExpected: mock(1)\nActual: not called.' 
>expected = 'mock(1)' >kwargs = {} >self = <MagicMock id='2200027902160'> > >[1m[31m/usr/lib/python3.9/unittest/mock.py[0m:898: AssertionError > >[33mDuring handling of the above exception, another exception occurred:[0m > >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker3')> >delay_time = 1 > > [37m@pytest[39;49;00m.mark.parametrize([33m"[39;49;00m[33mdelay_time[39;49;00m[33m"[39;49;00m, [-[94m1[39;49;00m, [94m0[39;49;00m, [94m0.0[39;49;00m, [94m1[39;49;00m, [94m2.5[39;49;00m]) > [94mdef[39;49;00m [92mtest_reruns_with_delay_marker[39;49;00m(testdir, delay_time): > testdir.makepyfile( > [33mf[39;49;00m[33m"""[39;49;00m[33m[39;49;00m > [33m import pytest[39;49;00m[33m[39;49;00m > [33m[39;49;00m > [33m @pytest.mark.flaky(reruns=2, reruns_delay=[39;49;00m[33m{[39;49;00mdelay_time[33m}[39;49;00m[33m)[39;49;00m[33m[39;49;00m > [33m def test_fail_two():[39;49;00m[33m[39;49;00m > [33m assert False[39;49;00m[33m"""[39;49;00m > ) > > time.sleep = mock.MagicMock() > > result = testdir.runpytest() > > [94mif[39;49;00m delay_time < [94m0[39;49;00m: > result.stdout.fnmatch_lines( > [33m"[39;49;00m[33m*UserWarning: Delay time between re-runs cannot be < 0. 
[39;49;00m[33m"[39;49;00m > [33m"[39;49;00m[33mUsing default value: 0[39;49;00m[33m"[39;49;00m > ) > delay_time = [94m0[39;49;00m > >> time.sleep.assert_called_with(delay_time) >[1m[31mE AssertionError: expected call not found.[0m >[1m[31mE Expected: mock(1)[0m >[1m[31mE Actual: not called.[0m > >delay_time = 1 >result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.86s> >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker3')> > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:359: AssertionError >---------------------------------------------------------- Captured stdout call ----------------------------------------------------------- >=========================================================== test session starts =========================================================== >platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 >rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker3 >plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0 >collected 1 item > >test_reruns_with_delay_marker.py E [100%] > >================================================================= ERRORS ================================================================== >_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________ > >self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two> > > def 
pytest_runtest_setup(self, item): > """ > Pytest hook to modify the test before it's run. > > :param item: > The test item. > """ > if not self._has_flaky_attributes(item): > if hasattr(item, 'iter_markers'): > for marker in item.iter_markers(name='flaky'): >> self._make_test_flaky(item, *marker.args, **marker.kwargs) >E TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' > >/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError >===Flaky Test Report=== > >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>] >test_pass passed 1 out of the required 1 times. Success! >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] >test_pass failed; it passed 0 out of the required 1 times. 
> <class 'Exception'> > Failure: 2 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] > >===End Flaky Test Report=== >========================================================= short test summary info ========================================================= >ERROR test_reruns_with_delay_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' >============================================================ 1 error in 0.93s ============================================================= >pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'. >[31m[1m___________________________________________________ test_reruns_with_delay_marker[2.5] ____________________________________________________[0m > >__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790>, args = (<MagicMock id='2200408965904'>, 2.5) >kwargs = {}, __tracebackhide__ = True, msg = 'expected call not found.\nExpected: mock(2.5)\nActual: not called.' >__mock_self = <MagicMock id='2200408965904'> > > [94mdef[39;49;00m [92massert_wrapper[39;49;00m( > __wrapped_mock_method__: Callable[..., Any], *args: Any, **kwargs: Any > ) -> [94mNone[39;49;00m: > __tracebackhide__ = [94mTrue[39;49;00m > [94mtry[39;49;00m: >> __wrapped_mock_method__(*args, **kwargs) > >__mock_self = <MagicMock id='2200408965904'> >__tracebackhide__ = True >__wrapped_mock_method__ = <function NonCallableMock.assert_called_with at 0x20002a13790> >args = (<MagicMock id='2200408965904'>, 2.5) >kwargs = {} >msg = 'expected call not found.\nExpected: mock(2.5)\nActual: not called.' 
> >[1m[31m/usr/lib/python3.9/site-packages/pytest_mock/plugin.py[0m:414: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <MagicMock id='2200408965904'>, args = (2.5,), kwargs = {}, expected = 'mock(2.5)', actual = 'not called.' >error_message = 'expected call not found.\nExpected: mock(2.5)\nActual: not called.' > > [94mdef[39;49;00m [92massert_called_with[39;49;00m([96mself[39;49;00m, /, *args, **kwargs): > [33m"""assert that the last call was made with the specified arguments.[39;49;00m > [33m[39;49;00m > [33m Raises an AssertionError if the args and keyword args passed in are[39;49;00m > [33m different to the last call to the mock."""[39;49;00m > [94mif[39;49;00m [96mself[39;49;00m.call_args [95mis[39;49;00m [94mNone[39;49;00m: > expected = [96mself[39;49;00m._format_mock_call_signature(args, kwargs) > actual = [33m'[39;49;00m[33mnot called.[39;49;00m[33m'[39;49;00m > error_message = ([33m'[39;49;00m[33mexpected call not found.[39;49;00m[33m\n[39;49;00m[33mExpected: [39;49;00m[33m%s[39;49;00m[33m\n[39;49;00m[33mActual: [39;49;00m[33m%s[39;49;00m[33m'[39;49;00m > % (expected, actual)) >> [94mraise[39;49;00m [96mAssertionError[39;49;00m(error_message) >[1m[31mE AssertionError: expected call not found.[0m >[1m[31mE Expected: mock(2.5)[0m >[1m[31mE Actual: not called.[0m > >actual = 'not called.' >args = (2.5,) >error_message = 'expected call not found.\nExpected: mock(2.5)\nActual: not called.' 
>expected = 'mock(2.5)' >kwargs = {} >self = <MagicMock id='2200408965904'> > >[1m[31m/usr/lib/python3.9/unittest/mock.py[0m:898: AssertionError > >[33mDuring handling of the above exception, another exception occurred:[0m > >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker4')> >delay_time = 2.5 > > [37m@pytest[39;49;00m.mark.parametrize([33m"[39;49;00m[33mdelay_time[39;49;00m[33m"[39;49;00m, [-[94m1[39;49;00m, [94m0[39;49;00m, [94m0.0[39;49;00m, [94m1[39;49;00m, [94m2.5[39;49;00m]) > [94mdef[39;49;00m [92mtest_reruns_with_delay_marker[39;49;00m(testdir, delay_time): > testdir.makepyfile( > [33mf[39;49;00m[33m"""[39;49;00m[33m[39;49;00m > [33m import pytest[39;49;00m[33m[39;49;00m > [33m[39;49;00m > [33m @pytest.mark.flaky(reruns=2, reruns_delay=[39;49;00m[33m{[39;49;00mdelay_time[33m}[39;49;00m[33m)[39;49;00m[33m[39;49;00m > [33m def test_fail_two():[39;49;00m[33m[39;49;00m > [33m assert False[39;49;00m[33m"""[39;49;00m > ) > > time.sleep = mock.MagicMock() > > result = testdir.runpytest() > > [94mif[39;49;00m delay_time < [94m0[39;49;00m: > result.stdout.fnmatch_lines( > [33m"[39;49;00m[33m*UserWarning: Delay time between re-runs cannot be < 0. 
[39;49;00m[33m"[39;49;00m > [33m"[39;49;00m[33mUsing default value: 0[39;49;00m[33m"[39;49;00m > ) > delay_time = [94m0[39;49;00m > >> time.sleep.assert_called_with(delay_time) >[1m[31mE AssertionError: expected call not found.[0m >[1m[31mE Expected: mock(2.5)[0m >[1m[31mE Actual: not called.[0m > >delay_time = 2.5 >result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.57s> >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker4')> > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:359: AssertionError >---------------------------------------------------------- Captured stdout call ----------------------------------------------------------- >=========================================================== test session starts =========================================================== >platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 >rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_delay_marker4 >plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0 >collected 1 item > >test_reruns_with_delay_marker.py E [100%] > >================================================================= ERRORS ================================================================== >_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________ > >self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two> > > def 
pytest_runtest_setup(self, item): > """ > Pytest hook to modify the test before it's run. > > :param item: > The test item. > """ > if not self._has_flaky_attributes(item): > if hasattr(item, 'iter_markers'): > for marker in item.iter_markers(name='flaky'): >> self._make_test_flaky(item, *marker.args, **marker.kwargs) >E TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' > >/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError >===Flaky Test Report=== > >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>] >test_pass passed 1 out of the required 1 times. Success! >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] >test_pass failed; it passed 0 out of the required 1 times. 
> <class 'Exception'> > Failure: 2 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] > >===End Flaky Test Report=== >========================================================= short test summary info ========================================================= >ERROR test_reruns_with_delay_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' >============================================================ 1 error in 0.59s ============================================================= >pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'. >[31m[1m____________________________________________ test_rerun_on_setup_class_with_error_with_reruns _____________________________________________[0m > >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_setup_class_with_error_with_reruns0')> > > [94mdef[39;49;00m [92mtest_rerun_on_setup_class_with_error_with_reruns[39;49;00m(testdir): > [33m"""[39;49;00m > [33m Case: setup_class throwing error on the first execution for parametrized test[39;49;00m > [33m """[39;49;00m > testdir.makepyfile( > [33m"""[39;49;00m > [33m import pytest[39;49;00m > [33m[39;49;00m > [33m pass_fixture = False[39;49;00m > [33m[39;49;00m > [33m class TestFoo(object):[39;49;00m > [33m @classmethod[39;49;00m > [33m def setup_class(cls):[39;49;00m > [33m global pass_fixture[39;49;00m > [33m if not pass_fixture:[39;49;00m > [33m pass_fixture = True[39;49;00m > [33m assert False[39;49;00m > [33m assert True[39;49;00m > [33m @pytest.mark.parametrize('param', [1, 2, 3])[39;49;00m > [33m def test_pass(self, param):[39;49;00m > 
[33m assert param"""[39;49;00m > ) > result = testdir.runpytest([33m"[39;49;00m[33m--reruns[39;49;00m[33m"[39;49;00m, [33m"[39;49;00m[33m1[39;49;00m[33m"[39;49;00m) >> assert_outcomes(result, passed=[94m3[39;49;00m, rerun=[94m1[39;49;00m) > >result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=71 len(stderr.lines)=0 duration=3.66s> >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_setup_class_with_error_with_reruns0')> > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:387: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:58: in assert_outcomes > check_outcome_field(outcomes, [33m"[39;49;00m[33mpassed[39;49;00m[33m"[39;49;00m, passed) > error = 0 > failed = 0 > outcomes = {'errors': 3} > passed = 3 > rerun = 1 > result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=71 len(stderr.lines)=0 duration=3.66s> > skipped = 0 > xfailed = 0 > xpassed = 0 >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >outcomes = {'errors': 3}, field_name = 'passed', expected_value = 3 > > [94mdef[39;49;00m [92mcheck_outcome_field[39;49;00m(outcomes, field_name, expected_value): > field_value = outcomes.get(field_name, [94m0[39;49;00m) >> [94massert[39;49;00m field_value == expected_value, ( > [33mf[39;49;00m[33m"[39;49;00m[33moutcomes.[39;49;00m[33m{[39;49;00mfield_name[33m}[39;49;00m[33m has unexpected value. 
[39;49;00m[33m"[39;49;00m > [33mf[39;49;00m[33m"[39;49;00m[33mExpected [39;49;00m[33m'[39;49;00m[33m{[39;49;00mexpected_value[33m}[39;49;00m[33m'[39;49;00m[33m but got [39;49;00m[33m'[39;49;00m[33m{[39;49;00mfield_value[33m}[39;49;00m[33m'[39;49;00m[33m"[39;49;00m > ) >[1m[31mE AssertionError: outcomes.passed has unexpected value. Expected '3' but got '0'[0m >[1m[31mE assert 0 == 3[0m >[1m[31mE +0[0m >[1m[31mE -3[0m > >expected_value = 3 >field_name = 'passed' >field_value = 0 >outcomes = {'errors': 3} > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:41: AssertionError >---------------------------------------------------------- Captured stdout call ----------------------------------------------------------- >=========================================================== test session starts =========================================================== >platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 >rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_setup_class_with_error_with_reruns0 >plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0 >collected 3 items > >test_rerun_on_setup_class_with_error_with_reruns.py EEE [100%] > >================================================================= ERRORS ================================================================== >_________________________________________________ ERROR at setup of TestFoo.test_pass[1] __________________________________________________ > >cls = <class 'test_rerun_on_setup_class_with_error_with_reruns.TestFoo'> > > @classmethod > def setup_class(cls): > global pass_fixture > if not 
pass_fixture: > pass_fixture = True >> assert False >E assert False > >test_rerun_on_setup_class_with_error_with_reruns.py:11: AssertionError >_________________________________________________ ERROR at setup of TestFoo.test_pass[2] __________________________________________________ > >cls = <class 'test_rerun_on_setup_class_with_error_with_reruns.TestFoo'> > > @classmethod > def setup_class(cls): > global pass_fixture > if not pass_fixture: > pass_fixture = True >> assert False >E assert False > >test_rerun_on_setup_class_with_error_with_reruns.py:11: AssertionError >_________________________________________________ ERROR at setup of TestFoo.test_pass[3] __________________________________________________ > >cls = <class 'test_rerun_on_setup_class_with_error_with_reruns.TestFoo'> > > @classmethod > def setup_class(cls): > global pass_fixture > if not pass_fixture: > pass_fixture = True >> assert False >E assert False > >test_rerun_on_setup_class_with_error_with_reruns.py:11: AssertionError >===Flaky Test Report=== > >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>] >test_pass passed 1 out of the required 1 times. Success! >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] >test_pass failed; it passed 0 out of the required 1 times. 
> <class 'Exception'> > Failure: 2 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] > >===End Flaky Test Report=== >========================================================= short test summary info ========================================================= >ERROR test_rerun_on_setup_class_with_error_with_reruns.py::TestFoo::test_pass[1] - assert False >ERROR test_rerun_on_setup_class_with_error_with_reruns.py::TestFoo::test_pass[2] - assert False >ERROR test_rerun_on_setup_class_with_error_with_reruns.py::TestFoo::test_pass[3] - assert False >============================================================ 3 errors in 0.68s ============================================================ >pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'. 
>________________________________________ test_rerun_on_class_scope_fixture_with_error_with_reruns _________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_class_scope_fixture_with_error_with_reruns0')>
>
>    def test_rerun_on_class_scope_fixture_with_error_with_reruns(testdir):
>        """
>        Case: Class scope fixture throwing error on the first execution
>        for parametrized test
>        """
>        testdir.makepyfile(
>            """
>            import pytest
>
>            pass_fixture = False
>
>            class TestFoo(object):
>
>                @pytest.fixture(scope="class")
>                def setup_fixture(self):
>                    global pass_fixture
>                    if not pass_fixture:
>                        pass_fixture = True
>                        assert False
>                    assert True
>                @pytest.mark.parametrize('param', [1, 2, 3])
>                def test_pass(self, setup_fixture, param):
>                    assert param"""
>        )
>        result = testdir.runpytest("--reruns", "1")
>>       assert_outcomes(result, passed=3, rerun=1)
>
>result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=71 len(stderr.lines)=0 duration=6.65s>
>testdir    = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_class_scope_fixture_with_error_with_reruns0')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:415:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:58: in assert_outcomes
>    check_outcome_field(outcomes, "passed", passed)
>        error      = 0
>        failed     = 0
>        outcomes   = {'errors': 3}
>        passed     = 3
>        rerun      = 1
>        result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=71 len(stderr.lines)=0 duration=6.65s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'errors': 3}, field_name = 'passed', expected_value = 3
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.passed has unexpected value. Expected '3' but got '0'
>E       assert 0 == 3
>E         +0
>E         -3
>
>expected_value = 3
>field_name = 'passed'
>field_value = 0
>outcomes   = {'errors': 3}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_class_scope_fixture_with_error_with_reruns0
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 3 items
>
>test_rerun_on_class_scope_fixture_with_error_with_reruns.py EEE                                                                      [100%]
>
>================================================================= ERRORS ==================================================================
>_________________________________________________ ERROR at setup of TestFoo.test_pass[1] __________________________________________________
>
>self = <test_rerun_on_class_scope_fixture_with_error_with_reruns.TestFoo object at 0x200543222b0>
>
>    @pytest.fixture(scope="class")
>    def setup_fixture(self):
>        global pass_fixture
>        if not pass_fixture:
>            pass_fixture = True
>>           assert False
>E           assert False
>
>test_rerun_on_class_scope_fixture_with_error_with_reruns.py:12: AssertionError
>_________________________________________________ ERROR at setup of TestFoo.test_pass[2] __________________________________________________
>
>self = <test_rerun_on_class_scope_fixture_with_error_with_reruns.TestFoo object at 0x200543222b0>
>
>    @pytest.fixture(scope="class")
>    def setup_fixture(self):
>        global pass_fixture
>        if not pass_fixture:
>            pass_fixture = True
>>           assert False
>E           assert False
>
>test_rerun_on_class_scope_fixture_with_error_with_reruns.py:12: AssertionError
>_________________________________________________ ERROR at setup of TestFoo.test_pass[3] __________________________________________________
>
>self = <test_rerun_on_class_scope_fixture_with_error_with_reruns.TestFoo object at 0x200543222b0>
>
>    @pytest.fixture(scope="class")
>    def setup_fixture(self):
>        global pass_fixture
>        if not pass_fixture:
>            pass_fixture = True
>>           assert False
>E           assert False
>
>test_rerun_on_class_scope_fixture_with_error_with_reruns.py:12: AssertionError
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
>    <class 'Exception'>
>    Failure: 2
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>========================================================= short test summary info =========================================================
>ERROR test_rerun_on_class_scope_fixture_with_error_with_reruns.py::TestFoo::test_pass[1] - assert False
>ERROR test_rerun_on_class_scope_fixture_with_error_with_reruns.py::TestFoo::test_pass[2] - assert False
>ERROR test_rerun_on_class_scope_fixture_with_error_with_reruns.py::TestFoo::test_pass[3] - assert False
>============================================================ 3 errors in 0.48s ============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>________________________________________________ test_rerun_on_module_fixture_with_reruns _________________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_module_fixture_with_reruns0')>
>
>    def test_rerun_on_module_fixture_with_reruns(testdir):
>        """
>        Case: Module scope fixture is not re-executed when class scope fixture throwing
>        error on the first execution for parametrized test
>        """
>        testdir.makepyfile(
>            """
>            import pytest
>
>            pass_fixture = False
>
>            @pytest.fixture(scope='module')
>            def module_fixture():
>                assert not pass_fixture
>
>            class TestFoo(object):
>                @pytest.fixture(scope="class")
>                def setup_fixture(self):
>                    global pass_fixture
>                    if not pass_fixture:
>                        pass_fixture = True
>                        assert False
>                    assert True
>                def test_pass_1(self, module_fixture, setup_fixture):
>                    assert True
>
>                def test_pass_2(self, module_fixture, setup_fixture):
>                    assert True"""
>        )
>        result = testdir.runpytest("--reruns", "1")
>>       assert_outcomes(result, passed=2, rerun=1)
>
>result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=57 len(stderr.lines)=0 duration=3.30s>
>testdir    = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_module_fixture_with_reruns0')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:448:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:58: in assert_outcomes
>    check_outcome_field(outcomes, "passed", passed)
>        error      = 0
>        failed     = 0
>        outcomes   = {'errors': 2}
>        passed     = 2
>        rerun      = 1
>        result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=57 len(stderr.lines)=0 duration=3.30s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'errors': 2}, field_name = 'passed', expected_value = 2
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.passed has unexpected value. Expected '2' but got '0'
>E       assert 0 == 2
>E         +0
>E         -2
>
>expected_value = 2
>field_name = 'passed'
>field_value = 0
>outcomes   = {'errors': 2}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_module_fixture_with_reruns0
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 2 items
>
>test_rerun_on_module_fixture_with_reruns.py EE                                                                                       [100%]
>
>================================================================= ERRORS ==================================================================
>__________________________________________________ ERROR at setup of TestFoo.test_pass_1 __________________________________________________
>
>self = <test_rerun_on_module_fixture_with_reruns.TestFoo object at 0x200542f4700>
>
>    @pytest.fixture(scope="class")
>    def setup_fixture(self):
>        global pass_fixture
>        if not pass_fixture:
>            pass_fixture = True
>>           assert False
>E           assert False
>
>test_rerun_on_module_fixture_with_reruns.py:15: AssertionError
>__________________________________________________ ERROR at setup of TestFoo.test_pass_2 __________________________________________________
>
>self = <test_rerun_on_module_fixture_with_reruns.TestFoo object at 0x200542f4700>
>
>    @pytest.fixture(scope="class")
>    def setup_fixture(self):
>        global pass_fixture
>        if not pass_fixture:
>            pass_fixture = True
>>           assert False
>E           assert False
>
>test_rerun_on_module_fixture_with_reruns.py:15: AssertionError
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
>    <class 'Exception'>
>    Failure: 2
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>========================================================= short test summary info =========================================================
>ERROR test_rerun_on_module_fixture_with_reruns.py::TestFoo::test_pass_1 - assert False
>ERROR test_rerun_on_module_fixture_with_reruns.py::TestFoo::test_pass_2 - assert False
>============================================================ 2 errors in 0.57s ============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>________________________________________________ test_rerun_on_session_fixture_with_reruns ________________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_session_fixture_with_reruns0')>
>
>    def test_rerun_on_session_fixture_with_reruns(testdir):
>        """
>        Case: Module scope fixture is not re-executed when class scope fixture
>        throwing error on the first execution for parametrized test
>        """
>        testdir.makepyfile(
>            """
>            import pytest
>
>            pass_fixture = False
>
>            @pytest.fixture(scope='session')
>            def session_fixture():
>                assert not pass_fixture
>
>            class TestFoo(object):
>                @pytest.fixture(scope="class")
>                def setup_fixture(self):
>                    global pass_fixture
>                    if not pass_fixture:
>                        pass_fixture = True
>                        assert False
>                    assert True
>
>                def test_pass_1(self, session_fixture, setup_fixture):
>                    assert True
>                def test_pass_2(self, session_fixture, setup_fixture):
>                    assert True"""
>        )
>        result = testdir.runpytest("--reruns", "1")
>>       assert_outcomes(result, passed=2, rerun=1)
>
>result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=57 len(stderr.lines)=0 duration=5.44s>
>testdir    = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_session_fixture_with_reruns0')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:481:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:58: in assert_outcomes
>    check_outcome_field(outcomes, "passed", passed)
>        error      = 0
>        failed     = 0
>        outcomes   = {'errors': 2}
>        passed     = 2
>        rerun      = 1
>        result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=57 len(stderr.lines)=0 duration=5.44s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'errors': 2}, field_name = 'passed', expected_value = 2
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.passed has unexpected value. Expected '2' but got '0'
>E       assert 0 == 2
>E         +0
>E         -2
>
>expected_value = 2
>field_name = 'passed'
>field_value = 0
>outcomes   = {'errors': 2}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_on_session_fixture_with_reruns0
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 2 items
>
>test_rerun_on_session_fixture_with_reruns.py EE                                                                                      [100%]
>
>================================================================= ERRORS ==================================================================
>__________________________________________________ ERROR at setup of TestFoo.test_pass_1 __________________________________________________
>
>self = <test_rerun_on_session_fixture_with_reruns.TestFoo object at 0x2000584ffa0>
>
>    @pytest.fixture(scope="class")
>    def setup_fixture(self):
>        global pass_fixture
>        if not pass_fixture:
>            pass_fixture = True
>>           assert False
>E           assert False
>
>test_rerun_on_session_fixture_with_reruns.py:15: AssertionError
>__________________________________________________ ERROR at setup of TestFoo.test_pass_2 __________________________________________________
>
>self = <test_rerun_on_session_fixture_with_reruns.TestFoo object at 0x2000584ffa0>
>
>    @pytest.fixture(scope="class")
>    def setup_fixture(self):
>        global pass_fixture
>        if not pass_fixture:
>            pass_fixture = True
>>           assert False
>E           assert False
>
>test_rerun_on_session_fixture_with_reruns.py:15: AssertionError
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
>    <class 'Exception'>
>    Failure: 2
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>========================================================= short test summary info =========================================================
>ERROR test_rerun_on_session_fixture_with_reruns.py::TestFoo::test_pass_1 - assert False
>ERROR test_rerun_on_session_fixture_with_reruns.py::TestFoo::test_pass_2 - assert False
>============================================================ 2 errors in 0.89s ============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>______________________________________________________ test_execution_count_exposed _______________________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_execution_count_exposed0')>
>
>    def test_execution_count_exposed(testdir):
>        testdir.makepyfile("def test_pass(): assert True")
>        testdir.makeconftest(
>            """
>            def pytest_runtest_teardown(item):
>                assert item.execution_count == 3"""
>        )
>        result = testdir.runpytest("--reruns", "2")
>>       assert_outcomes(result, passed=3, rerun=2)
>
>result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=39 len(stderr.lines)=0 duration=4.35s>
>testdir    = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_execution_count_exposed0')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:492:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:58: in assert_outcomes
>    check_outcome_field(outcomes, "passed", passed)
>        error      = 0
>        failed     = 0
>        outcomes   = {'errors': 1, 'passed': 1}
>        passed     = 3
>        rerun      = 2
>        result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=39 len(stderr.lines)=0 duration=4.35s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'errors': 1, 'passed': 1}, field_name = 'passed', expected_value = 3
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.passed has unexpected value. Expected '3' but got '1'
>E       assert 1 == 3
>E         +1
>E         -3
>
>expected_value = 3
>field_name = 'passed'
>field_value = 1
>outcomes   = {'errors': 1, 'passed': 1}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_execution_count_exposed0
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 1 item
>
>test_execution_count_exposed.py .E                                                                                                   [100%]
>
>================================================================= ERRORS ==================================================================
>_____________________________________________________ ERROR at teardown of test_pass ______________________________________________________
>
>item = <Function test_pass>
>
>    def pytest_runtest_teardown(item):
>>       assert item.execution_count == 3
>E       AttributeError: 'Function' object has no attribute 'execution_count'
>
>conftest.py:2: AttributeError
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
>    <class 'Exception'>
>    Failure: 1
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
>    <class 'Exception'>
>    Failure: 2
>    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>========================================================= short test summary info =========================================================
>ERROR test_execution_count_exposed.py::test_pass - AttributeError: 'Function' object has no attribute 'execution_count'
>======================================================= 1 passed, 1 error in 0.80s ========================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>[31m[1m____________________________________________________________ test_rerun_report ____________________________________________________________[0m > >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_report0')> > > [94mdef[39;49;00m [92mtest_rerun_report[39;49;00m(testdir): > testdir.makepyfile([33m"[39;49;00m[33mdef test_pass(): assert False[39;49;00m[33m"[39;49;00m) > testdir.makeconftest( > [33m"""[39;49;00m > [33m def pytest_runtest_logreport(report):[39;49;00m > [33m assert hasattr(report, 'rerun')[39;49;00m > [33m assert isinstance(report.rerun, int)[39;49;00m > [33m assert report.rerun <= 2[39;49;00m > [33m """[39;49;00m > ) > result = testdir.runpytest([33m"[39;49;00m[33m--reruns[39;49;00m[33m"[39;49;00m, [33m"[39;49;00m[33m2[39;49;00m[33m"[39;49;00m) >> assert_outcomes(result, failed=[94m1[39;49;00m, rerun=[94m2[39;49;00m, passed=[94m0[39;49;00m) > >result = <RunResult ret=ExitCode.INTERNAL_ERROR len(stdout.lines)=89 len(stderr.lines)=0 duration=3.56s> >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_report0')> > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:506: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:60: in assert_outcomes > check_outcome_field(outcomes, [33m"[39;49;00m[33mfailed[39;49;00m[33m"[39;49;00m, failed) > error = 0 > failed = 1 > outcomes = {} > passed = 0 > rerun = 2 > result = <RunResult ret=ExitCode.INTERNAL_ERROR len(stdout.lines)=89 len(stderr.lines)=0 duration=3.56s> > skipped = 0 > xfailed = 0 > xpassed = 0 >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >outcomes = {}, field_name = 'failed', expected_value = 1 > > [94mdef[39;49;00m [92mcheck_outcome_field[39;49;00m(outcomes, field_name, expected_value): > field_value = outcomes.get(field_name, [94m0[39;49;00m) >> [94massert[39;49;00m field_value == expected_value, ( > [33mf[39;49;00m[33m"[39;49;00m[33moutcomes.[39;49;00m[33m{[39;49;00mfield_name[33m}[39;49;00m[33m has unexpected value. [39;49;00m[33m"[39;49;00m > [33mf[39;49;00m[33m"[39;49;00m[33mExpected [39;49;00m[33m'[39;49;00m[33m{[39;49;00mexpected_value[33m}[39;49;00m[33m'[39;49;00m[33m but got [39;49;00m[33m'[39;49;00m[33m{[39;49;00mfield_value[33m}[39;49;00m[33m'[39;49;00m[33m"[39;49;00m > ) >[1m[31mE AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'[0m >[1m[31mE assert 0 == 1[0m >[1m[31mE +0[0m >[1m[31mE -1[0m > >expected_value = 1 >field_name = 'failed' >field_value = 0 >outcomes = {} > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:41: AssertionError >---------------------------------------------------------- Captured stdout call ----------------------------------------------------------- >=========================================================== test session starts =========================================================== >platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 >rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_report0 >plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0 >collected 1 item > >test_rerun_report.py >INTERNALERROR> Traceback (most recent call last): >INTERNALERROR> File 
"/usr/lib/python3.9/site-packages/_pytest/main.py", line 269, in wrap_session >INTERNALERROR> session.exitstatus = doit(config, session) or 0 >INTERNALERROR> File "/usr/lib/python3.9/site-packages/_pytest/main.py", line 323, in _main >INTERNALERROR> config.hook.pytest_runtestloop(session=session) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__ >INTERNALERROR> return self._hookexec(self, self.get_hookimpls(), kwargs) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec >INTERNALERROR> return self._inner_hookexec(hook, methods, kwargs) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/manager.py", line 337, in traced_hookexec >INTERNALERROR> return outcome.get_result() >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 80, in get_result >INTERNALERROR> raise ex[1].with_traceback(ex[2]) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 52, in from_call >INTERNALERROR> result = func() >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/manager.py", line 335, in <lambda> >INTERNALERROR> outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs)) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda> >INTERNALERROR> self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall( >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 208, in _multicall >INTERNALERROR> return outcome.get_result() >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 80, in get_result >INTERNALERROR> raise ex[1].with_traceback(ex[2]) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall >INTERNALERROR> res = hook_impl.function(*args) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/_pytest/main.py", line 348, in pytest_runtestloop >INTERNALERROR> 
item.config.hook.pytest_runtest_protocol(item=item, nextitem=nextitem) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__ >INTERNALERROR> return self._hookexec(self, self.get_hookimpls(), kwargs) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec >INTERNALERROR> return self._inner_hookexec(hook, methods, kwargs) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/manager.py", line 337, in traced_hookexec >INTERNALERROR> return outcome.get_result() >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 80, in get_result >INTERNALERROR> raise ex[1].with_traceback(ex[2]) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 52, in from_call >INTERNALERROR> result = func() >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/manager.py", line 335, in <lambda> >INTERNALERROR> outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs)) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda> >INTERNALERROR> self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall( >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 208, in _multicall >INTERNALERROR> return outcome.get_result() >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 80, in get_result >INTERNALERROR> raise ex[1].with_traceback(ex[2]) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall >INTERNALERROR> res = hook_impl.function(*args) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py", line 94, in pytest_runtest_protocol >INTERNALERROR> self.runner.pytest_runtest_protocol(item, nextitem) >INTERNALERROR> File "/usr/lib/python3.9/site-packages/_pytest/runner.py", line 109, in pytest_runtest_protocol >INTERNALERROR> runtestprotocol(item, nextitem=nextitem) 
INTERNALERROR>   File "/usr/lib/python3.9/site-packages/_pytest/runner.py", line 120, in runtestprotocol
INTERNALERROR>     rep = call_and_report(item, "setup", log)
INTERNALERROR>   File "/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py", line 154, in call_and_report
INTERNALERROR>     hook.pytest_runtest_logreport(report=report)
INTERNALERROR>   File "/usr/lib/python3.9/site-packages/pluggy/hooks.py", line 286, in __call__
INTERNALERROR>     return self._hookexec(self, self.get_hookimpls(), kwargs)
INTERNALERROR>   File "/usr/lib/python3.9/site-packages/pluggy/manager.py", line 93, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/usr/lib/python3.9/site-packages/pluggy/manager.py", line 337, in traced_hookexec
INTERNALERROR>     return outcome.get_result()
INTERNALERROR>   File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 80, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 52, in from_call
INTERNALERROR>     result = func()
INTERNALERROR>   File "/usr/lib/python3.9/site-packages/pluggy/manager.py", line 335, in <lambda>
INTERNALERROR>     outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
INTERNALERROR>   File "/usr/lib/python3.9/site-packages/pluggy/manager.py", line 84, in <lambda>
INTERNALERROR>     self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
INTERNALERROR>   File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 208, in _multicall
INTERNALERROR>     return outcome.get_result()
INTERNALERROR>   File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 80, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "/usr/lib/python3.9/site-packages/pluggy/callers.py", line 187, in _multicall
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File
"/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_rerun_report0/conftest.py", line 2, in pytest_runtest_logreport
INTERNALERROR>     assert hasattr(report, 'rerun')
INTERNALERROR> AssertionError: assert False
INTERNALERROR>  +  where False = hasattr(<TestReport 'test_rerun_report.py::test_pass' when='setup' outcome='passed'>, 'rerun')

========================================================== no tests ran in 0.49s ==========================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
______________________________________________ test_only_rerun_flag[only_rerun_texts0-True] _______________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag0')>
only_rerun_texts = ['AssertionError'], should_rerun = True

    @pytest.mark.parametrize(
        "only_rerun_texts, should_rerun",
        [
            (["AssertionError"], True),
            (["Assertion*"], True),
            (["Assertion"], True),
            (["ValueError"], False),
            ([""], True),
            (["AssertionError: "], True),
            (["AssertionError: ERR"], True),
            (["ERR"], True),
            (["AssertionError,ValueError"], False),
            (["AssertionError ValueError"], False),
            (["AssertionError", "ValueError"], True),
        ],
    )
    def test_only_rerun_flag(testdir, only_rerun_texts, should_rerun):
        testdir.makepyfile('def test_only_rerun(): raise AssertionError("ERR")')

        num_failed = 1
        num_passed = 0
        num_reruns = 1
        num_reruns_actual = num_reruns if should_rerun else 0

        pytest_args = ["--reruns", str(num_reruns)]
        for only_rerun_text in only_rerun_texts:
            pytest_args.extend(["--only-rerun", only_rerun_text])
        result = testdir.runpytest(*pytest_args)
>       assert_outcomes(
            result, passed=num_passed, failed=num_failed, rerun=num_reruns_actual
        )

num_failed = 1
num_passed = 0
num_reruns = 1
num_reruns_actual = 1
only_rerun_text = 'AssertionError'
only_rerun_texts = ['AssertionError']
pytest_args = ['--reruns', '1', '--only-rerun', 'AssertionError']
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=5.49s>
should_rerun = True
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag0')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:550:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:65: in assert_outcomes
    check_outcome_field(outcomes, "rerun", rerun)
 error = 0
 failed = 1
 field = 'errors'
 outcomes = {'failed': 1}
 passed = 0
 rerun = 1
 result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=5.49s>
 skipped = 0
 xfailed = 0
 xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'failed': 1}, field_name = 'rerun', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.rerun has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'rerun'
field_value = 0
outcomes = {'failed': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag0
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_only_rerun_flag.py F                                                                                                           [100%]

================================================================ FAILURES =================================================================
_____________________________________________________________ test_only_rerun _____________________________________________________________

>   def test_only_rerun(): raise AssertionError("ERR")
E   AssertionError: ERR

test_only_rerun_flag.py:1: AssertionError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
 <class 'Exception'>
 Failure: 2
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
FAILED test_only_rerun_flag.py::test_only_rerun - AssertionError: ERR
============================================================ 1 failed in 0.55s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
______________________________________________ test_only_rerun_flag[only_rerun_texts1-True] _______________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag1')>
only_rerun_texts = ['Assertion*'], should_rerun = True

    @pytest.mark.parametrize(
        "only_rerun_texts, should_rerun",
        [
            (["AssertionError"], True),
            (["Assertion*"], True),
            (["Assertion"], True),
            (["ValueError"], False),
            ([""], True),
            (["AssertionError: "], True),
            (["AssertionError: ERR"], True),
            (["ERR"], True),
            (["AssertionError,ValueError"], False),
            (["AssertionError ValueError"], False),
            (["AssertionError", "ValueError"], True),
        ],
    )
    def test_only_rerun_flag(testdir, only_rerun_texts, should_rerun):
        testdir.makepyfile('def test_only_rerun(): raise AssertionError("ERR")')

        num_failed = 1
        num_passed = 0
        num_reruns = 1
        num_reruns_actual = num_reruns if should_rerun else 0

        pytest_args = ["--reruns", str(num_reruns)]
        for only_rerun_text in only_rerun_texts:
            pytest_args.extend(["--only-rerun", only_rerun_text])
        result = testdir.runpytest(*pytest_args)
>       assert_outcomes(
            result, passed=num_passed, failed=num_failed, rerun=num_reruns_actual
        )

num_failed = 1
num_passed = 0
num_reruns = 1
num_reruns_actual = 1
only_rerun_text = 'Assertion*'
only_rerun_texts = ['Assertion*']
pytest_args = ['--reruns', '1', '--only-rerun', 'Assertion*']
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=6.10s>
should_rerun = True
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag1')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:550:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:65: in assert_outcomes
    check_outcome_field(outcomes, "rerun", rerun)
 error = 0
 failed = 1
 field = 'errors'
 outcomes = {'failed': 1}
 passed = 0
 rerun = 1
 result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=6.10s>
 skipped = 0
 xfailed = 0
 xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'failed': 1}, field_name = 'rerun', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.rerun has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'rerun'
field_value = 0
outcomes = {'failed': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag1
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_only_rerun_flag.py F                                                                                                           [100%]

================================================================ FAILURES =================================================================
_____________________________________________________________ test_only_rerun _____________________________________________________________

>   def test_only_rerun(): raise AssertionError("ERR")
E   AssertionError: ERR

test_only_rerun_flag.py:1: AssertionError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
 <class 'Exception'>
 Failure: 2
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
FAILED test_only_rerun_flag.py::test_only_rerun - AssertionError: ERR
============================================================ 1 failed in 0.58s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
______________________________________________ test_only_rerun_flag[only_rerun_texts2-True] _______________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag2')>
only_rerun_texts = ['Assertion'], should_rerun = True

    @pytest.mark.parametrize(
        "only_rerun_texts, should_rerun",
        [
            (["AssertionError"], True),
            (["Assertion*"], True),
            (["Assertion"], True),
            (["ValueError"], False),
            ([""], True),
            (["AssertionError: "], True),
            (["AssertionError: ERR"], True),
            (["ERR"], True),
            (["AssertionError,ValueError"], False),
            (["AssertionError ValueError"], False),
            (["AssertionError", "ValueError"], True),
        ],
    )
    def test_only_rerun_flag(testdir, only_rerun_texts, should_rerun):
        testdir.makepyfile('def test_only_rerun(): raise AssertionError("ERR")')

        num_failed = 1
        num_passed = 0
        num_reruns = 1
        num_reruns_actual = num_reruns if should_rerun else 0

        pytest_args = ["--reruns", str(num_reruns)]
        for only_rerun_text in only_rerun_texts:
            pytest_args.extend(["--only-rerun", only_rerun_text])
        result = testdir.runpytest(*pytest_args)
>       assert_outcomes(
            result, passed=num_passed, failed=num_failed, rerun=num_reruns_actual
        )

num_failed = 1
num_passed = 0
num_reruns = 1
num_reruns_actual = 1
only_rerun_text = 'Assertion'
only_rerun_texts = ['Assertion']
pytest_args = ['--reruns', '1', '--only-rerun', 'Assertion']
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=3.82s>
should_rerun = True
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag2')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:550:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:65: in assert_outcomes
    check_outcome_field(outcomes, "rerun", rerun)
 error = 0
 failed = 1
 field = 'errors'
 outcomes = {'failed': 1}
 passed = 0
 rerun = 1
 result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=3.82s>
 skipped = 0
 xfailed = 0
 xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'failed': 1}, field_name = 'rerun', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.rerun has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'rerun'
field_value = 0
outcomes = {'failed': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag2
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_only_rerun_flag.py F                                                                                                           [100%]

================================================================ FAILURES =================================================================
_____________________________________________________________ test_only_rerun _____________________________________________________________

>   def test_only_rerun(): raise AssertionError("ERR")
E   AssertionError: ERR

test_only_rerun_flag.py:1: AssertionError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
 <class 'Exception'>
 Failure: 2
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
FAILED test_only_rerun_flag.py::test_only_rerun - AssertionError: ERR
============================================================ 1 failed in 0.56s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
______________________________________________ test_only_rerun_flag[only_rerun_texts4-True] _______________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag4')>
only_rerun_texts = [''], should_rerun = True

    @pytest.mark.parametrize(
        "only_rerun_texts, should_rerun",
        [
            (["AssertionError"], True),
            (["Assertion*"], True),
            (["Assertion"], True),
            (["ValueError"], False),
            ([""], True),
            (["AssertionError: "], True),
            (["AssertionError: ERR"], True),
            (["ERR"], True),
            (["AssertionError,ValueError"], False),
            (["AssertionError ValueError"], False),
            (["AssertionError", "ValueError"], True),
        ],
    )
    def test_only_rerun_flag(testdir, only_rerun_texts, should_rerun):
        testdir.makepyfile('def test_only_rerun(): raise AssertionError("ERR")')

        num_failed = 1
        num_passed = 0
        num_reruns = 1
        num_reruns_actual = num_reruns if should_rerun else 0

        pytest_args = ["--reruns", str(num_reruns)]
        for only_rerun_text in only_rerun_texts:
            pytest_args.extend(["--only-rerun", only_rerun_text])
        result = testdir.runpytest(*pytest_args)
>       assert_outcomes(
            result, passed=num_passed, failed=num_failed, rerun=num_reruns_actual
        )

num_failed = 1
num_passed = 0
num_reruns = 1
num_reruns_actual = 1
only_rerun_text = ''
only_rerun_texts = ['']
pytest_args = ['--reruns', '1', '--only-rerun', '']
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=3.79s>
should_rerun = True
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag4')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:550:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:65: in assert_outcomes
    check_outcome_field(outcomes, "rerun", rerun)
 error = 0
 failed = 1
 field = 'errors'
 outcomes = {'failed': 1}
 passed = 0
 rerun = 1
 result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=3.79s>
 skipped = 0
 xfailed = 0
 xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'failed': 1}, field_name = 'rerun', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.rerun has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'rerun'
field_value = 0
outcomes = {'failed': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag4
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_only_rerun_flag.py F                                                                                                           [100%]

================================================================ FAILURES =================================================================
_____________________________________________________________ test_only_rerun _____________________________________________________________

>   def test_only_rerun(): raise AssertionError("ERR")
E   AssertionError: ERR

test_only_rerun_flag.py:1: AssertionError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
 <class 'Exception'>
 Failure: 2
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
FAILED test_only_rerun_flag.py::test_only_rerun - AssertionError: ERR
============================================================ 1 failed in 0.57s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
______________________________________________ test_only_rerun_flag[only_rerun_texts5-True] _______________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag5')>
only_rerun_texts = ['AssertionError: '], should_rerun = True

    @pytest.mark.parametrize(
        "only_rerun_texts, should_rerun",
        [
            (["AssertionError"], True),
            (["Assertion*"], True),
            (["Assertion"], True),
            (["ValueError"], False),
            ([""], True),
            (["AssertionError: "], True),
            (["AssertionError: ERR"], True),
            (["ERR"], True),
            (["AssertionError,ValueError"], False),
            (["AssertionError ValueError"], False),
            (["AssertionError", "ValueError"], True),
        ],
    )
    def test_only_rerun_flag(testdir, only_rerun_texts, should_rerun):
        testdir.makepyfile('def test_only_rerun(): raise AssertionError("ERR")')

        num_failed = 1
        num_passed = 0
        num_reruns = 1
        num_reruns_actual = num_reruns if should_rerun else 0

        pytest_args = ["--reruns", str(num_reruns)]
        for only_rerun_text in only_rerun_texts:
            pytest_args.extend(["--only-rerun", only_rerun_text])
        result = testdir.runpytest(*pytest_args)
>       assert_outcomes(
            result, passed=num_passed, failed=num_failed, rerun=num_reruns_actual
        )

num_failed = 1
num_passed = 0
num_reruns = 1
num_reruns_actual = 1
only_rerun_text = 'AssertionError: '
only_rerun_texts = ['AssertionError: ']
pytest_args = ['--reruns', '1', '--only-rerun', 'AssertionError: ']
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=3.79s>
should_rerun = True
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag5')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:550:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:65: in assert_outcomes
    check_outcome_field(outcomes, "rerun", rerun)
        error = 0
        failed = 1
        field = 'errors'
        outcomes = {'failed': 1}
        passed = 0
        rerun = 1
        result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=3.79s>
        skipped = 0
        xfailed = 0
        xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'failed': 1}, field_name = 'rerun', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.rerun has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'rerun'
field_value = 0
outcomes = {'failed': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag5
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_only_rerun_flag.py F [100%]

================================================================ FAILURES =================================================================
_____________________________________________________________ test_only_rerun _____________________________________________________________

>   def test_only_rerun(): raise AssertionError("ERR")
E   AssertionError: ERR

test_only_rerun_flag.py:1: AssertionError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
	<class 'Exception'>
	Failure: 2
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
FAILED test_only_rerun_flag.py::test_only_rerun - AssertionError: ERR
============================================================ 1 failed in 0.58s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
______________________________________________ test_only_rerun_flag[only_rerun_texts6-True] _______________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag6')>
only_rerun_texts = ['AssertionError: ERR'], should_rerun = True

    @pytest.mark.parametrize(
        "only_rerun_texts, should_rerun",
        [
            (["AssertionError"], True),
            (["Assertion*"], True),
            (["Assertion"], True),
            (["ValueError"], False),
            ([""], True),
            (["AssertionError: "], True),
            (["AssertionError: ERR"], True),
            (["ERR"], True),
            (["AssertionError,ValueError"], False),
            (["AssertionError ValueError"], False),
            (["AssertionError", "ValueError"], True),
        ],
    )
    def test_only_rerun_flag(testdir, only_rerun_texts, should_rerun):
        testdir.makepyfile('def test_only_rerun(): raise AssertionError("ERR")')

        num_failed = 1
        num_passed = 0
        num_reruns = 1
        num_reruns_actual = num_reruns if should_rerun else 0

        pytest_args = ["--reruns", str(num_reruns)]
        for only_rerun_text in only_rerun_texts:
            pytest_args.extend(["--only-rerun", only_rerun_text])
        result = testdir.runpytest(*pytest_args)
>       assert_outcomes(
            result, passed=num_passed, failed=num_failed, rerun=num_reruns_actual
        )

num_failed = 1
num_passed = 0
num_reruns = 1
num_reruns_actual = 1
only_rerun_text = 'AssertionError: ERR'
only_rerun_texts = ['AssertionError: ERR']
pytest_args = ['--reruns', '1', '--only-rerun', 'AssertionError: ERR']
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=3.84s>
should_rerun = True
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag6')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:550:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:65: in assert_outcomes
    check_outcome_field(outcomes, "rerun", rerun)
        error = 0
        failed = 1
        field = 'errors'
        outcomes = {'failed': 1}
        passed = 0
        rerun = 1
        result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=3.84s>
        skipped = 0
        xfailed = 0
        xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'failed': 1}, field_name = 'rerun', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.rerun has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'rerun'
field_value = 0
outcomes = {'failed': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag6
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_only_rerun_flag.py F [100%]

================================================================ FAILURES =================================================================
_____________________________________________________________ test_only_rerun _____________________________________________________________

>   def test_only_rerun(): raise AssertionError("ERR")
E   AssertionError: ERR

test_only_rerun_flag.py:1: AssertionError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
	<class 'Exception'>
	Failure: 2
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
FAILED test_only_rerun_flag.py::test_only_rerun - AssertionError: ERR
============================================================ 1 failed in 0.57s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
______________________________________________ test_only_rerun_flag[only_rerun_texts7-True] _______________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag7')>
only_rerun_texts = ['ERR'], should_rerun = True

    @pytest.mark.parametrize(
        "only_rerun_texts, should_rerun",
        [
            (["AssertionError"], True),
            (["Assertion*"], True),
            (["Assertion"], True),
            (["ValueError"], False),
            ([""], True),
            (["AssertionError: "], True),
            (["AssertionError: ERR"], True),
            (["ERR"], True),
            (["AssertionError,ValueError"], False),
            (["AssertionError ValueError"], False),
            (["AssertionError", "ValueError"], True),
        ],
    )
    def test_only_rerun_flag(testdir, only_rerun_texts, should_rerun):
        testdir.makepyfile('def test_only_rerun(): raise AssertionError("ERR")')

        num_failed = 1
        num_passed = 0
        num_reruns = 1
        num_reruns_actual = num_reruns if should_rerun else 0

        pytest_args = ["--reruns", str(num_reruns)]
        for only_rerun_text in only_rerun_texts:
            pytest_args.extend(["--only-rerun", only_rerun_text])
        result = testdir.runpytest(*pytest_args)
>       assert_outcomes(
            result, passed=num_passed, failed=num_failed, rerun=num_reruns_actual
        )

num_failed = 1
num_passed = 0
num_reruns = 1
num_reruns_actual = 1
only_rerun_text = 'ERR'
only_rerun_texts = ['ERR']
pytest_args = ['--reruns', '1', '--only-rerun', 'ERR']
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=3.72s>
should_rerun = True
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag7')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:550:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:65: in assert_outcomes
    check_outcome_field(outcomes, "rerun", rerun)
        error = 0
        failed = 1
        field = 'errors'
        outcomes = {'failed': 1}
        passed = 0
        rerun = 1
        result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=3.72s>
        skipped = 0
        xfailed = 0
        xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'failed': 1}, field_name = 'rerun', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.rerun has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'rerun'
field_value = 0
outcomes = {'failed': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag7
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_only_rerun_flag.py F [100%]

================================================================ FAILURES =================================================================
_____________________________________________________________ test_only_rerun _____________________________________________________________

>   def test_only_rerun(): raise AssertionError("ERR")
E   AssertionError: ERR

test_only_rerun_flag.py:1: AssertionError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
	<class 'Exception'>
	Failure: 2
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
FAILED test_only_rerun_flag.py::test_only_rerun - AssertionError: ERR
============================================================ 1 failed in 0.69s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
______________________________________________ test_only_rerun_flag[only_rerun_texts10-True] ______________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag10')>
only_rerun_texts = ['AssertionError', 'ValueError'], should_rerun = True

    @pytest.mark.parametrize(
        "only_rerun_texts, should_rerun",
        [
            (["AssertionError"], True),
            (["Assertion*"], True),
            (["Assertion"], True),
            (["ValueError"], False),
            ([""], True),
            (["AssertionError: "], True),
            (["AssertionError: ERR"], True),
            (["ERR"], True),
            (["AssertionError,ValueError"], False),
            (["AssertionError ValueError"], False),
            (["AssertionError", "ValueError"], True),
        ],
    )
    def test_only_rerun_flag(testdir, only_rerun_texts, should_rerun):
        testdir.makepyfile('def test_only_rerun(): raise AssertionError("ERR")')

        num_failed = 1
        num_passed = 0
        num_reruns = 1
        num_reruns_actual = num_reruns if should_rerun else 0

        pytest_args = ["--reruns", str(num_reruns)]
        for only_rerun_text in only_rerun_texts:
            pytest_args.extend(["--only-rerun", only_rerun_text])
        result = testdir.runpytest(*pytest_args)
>       assert_outcomes(
            result, passed=num_passed, failed=num_failed, rerun=num_reruns_actual
        )

num_failed = 1
num_passed = 0
num_reruns = 1
num_reruns_actual = 1
only_rerun_text = 'ValueError'
only_rerun_texts = ['AssertionError', 'ValueError']
pytest_args = ['--reruns',
 '1',
 '--only-rerun',
 'AssertionError',
 '--only-rerun',
 'ValueError']
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=3.46s>
should_rerun = True
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag10')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:550:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:65: in assert_outcomes
    check_outcome_field(outcomes, "rerun", rerun)
        error = 0
        failed = 1
        field = 'errors'
        outcomes = {'failed': 1}
        passed = 0
        rerun = 1
        result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=36 len(stderr.lines)=0 duration=3.46s>
        skipped = 0
        xfailed = 0
        xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'failed': 1}, field_name = 'rerun', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.rerun has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'rerun'
field_value = 0
outcomes = {'failed': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_only_rerun_flag10
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_only_rerun_flag.py F [100%]

================================================================ FAILURES =================================================================
_____________________________________________________________ test_only_rerun _____________________________________________________________

>   def test_only_rerun(): raise AssertionError("ERR")
E   AssertionError: ERR

test_only_rerun_flag.py:1: AssertionError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
	<class 'Exception'>
	Failure: 1
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
	<class 'Exception'>
	Failure: 2
	[<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
FAILED test_only_rerun_flag.py::test_only_rerun - AssertionError: ERR
============================================================ 1 failed in 0.50s ============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>_______________________________________________ test_reruns_with_condition_marker[True-20] ________________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker0')>
>condition = True, expected_reruns = 2
>
>    @pytest.mark.parametrize(
>        "condition, expected_reruns",
>        [
>            (1 == 1, 2),
>            (1 == 2, 0),
>            (True, 2),
>            (False, 0),
>            (1, 2),
>            (0, 0),
>            (["list"], 2),
>            ([], 0),
>            ({"dict": 1}, 2),
>            ({}, 0),
>            (None, 0),
>        ],
>    )
>    def test_reruns_with_condition_marker(testdir, condition, expected_reruns):
>        testdir.makepyfile(
>            f"""
>            import pytest
>
>            @pytest.mark.flaky(reruns=2, condition={condition})
>            def test_fail_two():
>                assert False"""
>        )
>
>        result = testdir.runpytest()
>>       assert_outcomes(result, passed=0, failed=1, rerun=expected_reruns)
>
>condition = True
>expected_reruns = 2
>result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.58s>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker0')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:582:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:60: in assert_outcomes
>    check_outcome_field(outcomes, "failed", failed)
>        error      = 0
>        failed     = 1
>        outcomes   = {'errors': 1}
>        passed     = 0
>        rerun      = 2
>        result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.58s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'
>E       assert 0 == 1
>E         +0
>E         -1
>
>expected_value = 1
>field_name = 'failed'
>field_value = 0
>outcomes = {'errors': 1}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker0
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 1 item
>
>test_reruns_with_condition_marker.py E [100%]
>
>================================================================= ERRORS ==================================================================
>_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________
>
>self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two>
>
>    def pytest_runtest_setup(self, item):
>        """
>        Pytest hook to modify the test before it's run.
>
>        :param item:
>            The test item.
>        """
>        if not self._has_flaky_attributes(item):
>            if hasattr(item, 'iter_markers'):
>                for marker in item.iter_markers(name='flaky'):
>>                  self._make_test_flaky(item, *marker.args, **marker.kwargs)
>E       TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
>
>/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
> <class 'Exception'>
> Failure: 2
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>========================================================= short test summary info =========================================================
>ERROR test_reruns_with_condition_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
>============================================================ 1 error in 0.59s =============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>_______________________________________________ test_reruns_with_condition_marker[False-00] _______________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker1')>
>condition = False, expected_reruns = 0
>
>    @pytest.mark.parametrize(
>        "condition, expected_reruns",
>        [
>            (1 == 1, 2),
>            (1 == 2, 0),
>            (True, 2),
>            (False, 0),
>            (1, 2),
>            (0, 0),
>            (["list"], 2),
>            ([], 0),
>            ({"dict": 1}, 2),
>            ({}, 0),
>            (None, 0),
>        ],
>    )
>    def test_reruns_with_condition_marker(testdir, condition, expected_reruns):
>        testdir.makepyfile(
>            f"""
>            import pytest
>
>            @pytest.mark.flaky(reruns=2, condition={condition})
>            def test_fail_two():
>                assert False"""
>        )
>
>        result = testdir.runpytest()
>>       assert_outcomes(result, passed=0, failed=1, rerun=expected_reruns)
>
>condition = False
>expected_reruns = 0
>result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.56s>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker1')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:582:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:60: in assert_outcomes
>    check_outcome_field(outcomes, "failed", failed)
>        error      = 0
>        failed     = 1
>        outcomes   = {'errors': 1}
>        passed     = 0
>        rerun      = 0
>        result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.56s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'
>E       assert 0 == 1
>E         +0
>E         -1
>
>expected_value = 1
>field_name = 'failed'
>field_value = 0
>outcomes = {'errors': 1}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker1
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 1 item
>
>test_reruns_with_condition_marker.py E [100%]
>
>================================================================= ERRORS ==================================================================
>_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________
>
>self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two>
>
>    def pytest_runtest_setup(self, item):
>        """
>        Pytest hook to modify the test before it's run.
>
>        :param item:
>            The test item.
>        """
>        if not self._has_flaky_attributes(item):
>            if hasattr(item, 'iter_markers'):
>                for marker in item.iter_markers(name='flaky'):
>>                  self._make_test_flaky(item, *marker.args, **marker.kwargs)
>E       TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
>
>/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
> <class 'Exception'>
> Failure: 2
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>========================================================= short test summary info =========================================================
>ERROR test_reruns_with_condition_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
>============================================================ 1 error in 0.61s =============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>_______________________________________________ test_reruns_with_condition_marker[True-21] ________________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker2')>
>condition = True, expected_reruns = 2
>
>    @pytest.mark.parametrize(
>        "condition, expected_reruns",
>        [
>            (1 == 1, 2),
>            (1 == 2, 0),
>            (True, 2),
>            (False, 0),
>            (1, 2),
>            (0, 0),
>            (["list"], 2),
>            ([], 0),
>            ({"dict": 1}, 2),
>            ({}, 0),
>            (None, 0),
>        ],
>    )
>    def test_reruns_with_condition_marker(testdir, condition, expected_reruns):
>        testdir.makepyfile(
>            f"""
>            import pytest
>
>            @pytest.mark.flaky(reruns=2, condition={condition})
>            def test_fail_two():
>                assert False"""
>        )
>
>        result = testdir.runpytest()
>>       assert_outcomes(result, passed=0, failed=1, rerun=expected_reruns)
>
>condition = True
>expected_reruns = 2
>result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.57s>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker2')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:582:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:60: in assert_outcomes
>    check_outcome_field(outcomes, "failed", failed)
>        error      = 0
>        failed     = 1
>        outcomes   = {'errors': 1}
>        passed     = 0
>        rerun      = 2
>        result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.57s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'
>E       assert 0 == 1
>E         +0
>E         -1
>
>expected_value = 1
>field_name = 'failed'
>field_value = 0
>outcomes = {'errors': 1}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker2
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 1 item
>
>test_reruns_with_condition_marker.py E [100%]
>
>================================================================= ERRORS ==================================================================
>_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________
>
>self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two>
>
>    def pytest_runtest_setup(self, item):
>        """
>        Pytest hook to modify the test before it's run.
>
>        :param item:
>            The test item.
>        """
>        if not self._has_flaky_attributes(item):
>            if hasattr(item, 'iter_markers'):
>                for marker in item.iter_markers(name='flaky'):
>>                  self._make_test_flaky(item, *marker.args, **marker.kwargs)
>E       TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
>
>/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
> <class 'Exception'>
> Failure: 2
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>========================================================= short test summary info =========================================================
>ERROR test_reruns_with_condition_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
>============================================================ 1 error in 0.59s =============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>_______________________________________________ test_reruns_with_condition_marker[False-01] _______________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker3')>
>condition = False, expected_reruns = 0
>
>    @pytest.mark.parametrize(
>        "condition, expected_reruns",
>        [
>            (1 == 1, 2),
>            (1 == 2, 0),
>            (True, 2),
>            (False, 0),
>            (1, 2),
>            (0, 0),
>            (["list"], 2),
>            ([], 0),
>            ({"dict": 1}, 2),
>            ({}, 0),
>            (None, 0),
>        ],
>    )
>    def test_reruns_with_condition_marker(testdir, condition, expected_reruns):
>        testdir.makepyfile(
>            f"""
>            import pytest
>
>            @pytest.mark.flaky(reruns=2, condition={condition})
>            def test_fail_two():
>                assert False"""
>        )
>
>        result = testdir.runpytest()
>>       assert_outcomes(result, passed=0, failed=1, rerun=expected_reruns)
>
>condition = False
>expected_reruns = 0
>result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=4.60s>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker3')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:582:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:60: in assert_outcomes
>    check_outcome_field(outcomes, "failed", failed)
>        error      = 0
>        failed     = 1
>        outcomes   = {'errors': 1}
>        passed     = 0
>        rerun      = 0
>        result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=4.60s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'
>E       assert 0 == 1
>E         +0
>E         -1
>
>expected_value = 1
>field_name = 'failed'
>field_value = 0
>outcomes = {'errors': 1}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker3
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 1 item
>
>test_reruns_with_condition_marker.py E [100%]
>
>================================================================= ERRORS ==================================================================
>_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________
>
>self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two>
>
>    def pytest_runtest_setup(self, item):
>        """
>        Pytest hook to modify the test before it's run.
>
>        :param item:
>            The test item.
>        """
>        if not self._has_flaky_attributes(item):
>            if hasattr(item, 'iter_markers'):
>                for marker in item.iter_markers(name='flaky'):
>>                  self._make_test_flaky(item, *marker.args, **marker.kwargs)
>E       TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
>
>/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
> <class 'Exception'>
> Failure: 2
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>========================================================= short test summary info =========================================================
>ERROR test_reruns_with_condition_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
>============================================================ 1 error in 0.87s =============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>_________________________________________________ test_reruns_with_condition_marker[1-2] __________________________________________________
>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker4')>
>condition = 1, expected_reruns = 2
>
>    @pytest.mark.parametrize(
>        "condition, expected_reruns",
>        [
>            (1 == 1, 2),
>            (1 == 2, 0),
>            (True, 2),
>            (False, 0),
>            (1, 2),
>            (0, 0),
>            (["list"], 2),
>            ([], 0),
>            ({"dict": 1}, 2),
>            ({}, 0),
>            (None, 0),
>        ],
>    )
>    def test_reruns_with_condition_marker(testdir, condition, expected_reruns):
>        testdir.makepyfile(
>            f"""
>            import pytest
>
>            @pytest.mark.flaky(reruns=2, condition={condition})
>            def test_fail_two():
>                assert False"""
>        )
>
>        result = testdir.runpytest()
>>       assert_outcomes(result, passed=0, failed=1, rerun=expected_reruns)
>
>condition = 1
>expected_reruns = 2
>result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=4.69s>
>testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker4')>
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:582:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:60: in assert_outcomes
>    check_outcome_field(outcomes, "failed", failed)
>        error      = 0
>        failed     = 1
>        outcomes   = {'errors': 1}
>        passed     = 0
>        rerun      = 2
>        result     = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=4.69s>
>        skipped    = 0
>        xfailed    = 0
>        xpassed    = 0
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1
>
>    def check_outcome_field(outcomes, field_name, expected_value):
>        field_value = outcomes.get(field_name, 0)
>>       assert field_value == expected_value, (
>            f"outcomes.{field_name} has unexpected value. "
>            f"Expected '{expected_value}' but got '{field_value}'"
>        )
>E       AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'
>E       assert 0 == 1
>E         +0
>E         -1
>
>expected_value = 1
>field_name = 'failed'
>field_value = 0
>outcomes = {'errors': 1}
>
>/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
>---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
>=========================================================== test session starts ===========================================================
>platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
>rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker4
>plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
>collected 1 item
>
>test_reruns_with_condition_marker.py E [100%]
>
>================================================================= ERRORS ==================================================================
>_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________
>
>self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two>
>
>    def pytest_runtest_setup(self, item):
>        """
>        Pytest hook to modify the test before it's run.
>
>        :param item:
>            The test item.
>        """
>        if not self._has_flaky_attributes(item):
>            if hasattr(item, 'iter_markers'):
>                for marker in item.iter_markers(name='flaky'):
>>                  self._make_test_flaky(item, *marker.args, **marker.kwargs)
>E       TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
>
>/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
> <class 'Exception'> > Failure: 2 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] > >===End Flaky Test Report=== >========================================================= short test summary info ========================================================= >ERROR test_reruns_with_condition_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' >============================================================ 1 error in 0.60s ============================================================= >pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'. >[31m[1m_________________________________________________ test_reruns_with_condition_marker[0-0] __________________________________________________[0m > >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker5')> >condition = 0, expected_reruns = 0 > > [37m@pytest[39;49;00m.mark.parametrize( > [33m"[39;49;00m[33mcondition, expected_reruns[39;49;00m[33m"[39;49;00m, > [ > ([94m1[39;49;00m == [94m1[39;49;00m, [94m2[39;49;00m), > ([94m1[39;49;00m == [94m2[39;49;00m, [94m0[39;49;00m), > ([94mTrue[39;49;00m, [94m2[39;49;00m), > ([94mFalse[39;49;00m, [94m0[39;49;00m), > ([94m1[39;49;00m, [94m2[39;49;00m), > ([94m0[39;49;00m, [94m0[39;49;00m), > ([[33m"[39;49;00m[33mlist[39;49;00m[33m"[39;49;00m], [94m2[39;49;00m), > ([], [94m0[39;49;00m), > ({[33m"[39;49;00m[33mdict[39;49;00m[33m"[39;49;00m: [94m1[39;49;00m}, [94m2[39;49;00m), > ({}, [94m0[39;49;00m), > ([94mNone[39;49;00m, [94m0[39;49;00m), > ], > ) > [94mdef[39;49;00m [92mtest_reruns_with_condition_marker[39;49;00m(testdir, 
condition, expected_reruns): > testdir.makepyfile( > [33mf[39;49;00m[33m"""[39;49;00m[33m[39;49;00m > [33m import pytest[39;49;00m[33m[39;49;00m > [33m[39;49;00m > [33m @pytest.mark.flaky(reruns=2, condition=[39;49;00m[33m{[39;49;00mcondition[33m}[39;49;00m[33m)[39;49;00m[33m[39;49;00m > [33m def test_fail_two():[39;49;00m[33m[39;49;00m > [33m assert False[39;49;00m[33m"""[39;49;00m > ) > > result = testdir.runpytest() >> assert_outcomes(result, passed=[94m0[39;49;00m, failed=[94m1[39;49;00m, rerun=expected_reruns) > >condition = 0 >expected_reruns = 0 >result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.58s> >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker5')> > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:582: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:60: in assert_outcomes > check_outcome_field(outcomes, [33m"[39;49;00m[33mfailed[39;49;00m[33m"[39;49;00m, failed) > error = 0 > failed = 1 > outcomes = {'errors': 1} > passed = 0 > rerun = 0 > result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.58s> > skipped = 0 > xfailed = 0 > xpassed = 0 >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1 > > [94mdef[39;49;00m [92mcheck_outcome_field[39;49;00m(outcomes, field_name, expected_value): > field_value = outcomes.get(field_name, [94m0[39;49;00m) >> [94massert[39;49;00m field_value == expected_value, ( > 
[33mf[39;49;00m[33m"[39;49;00m[33moutcomes.[39;49;00m[33m{[39;49;00mfield_name[33m}[39;49;00m[33m has unexpected value. [39;49;00m[33m"[39;49;00m > [33mf[39;49;00m[33m"[39;49;00m[33mExpected [39;49;00m[33m'[39;49;00m[33m{[39;49;00mexpected_value[33m}[39;49;00m[33m'[39;49;00m[33m but got [39;49;00m[33m'[39;49;00m[33m{[39;49;00mfield_value[33m}[39;49;00m[33m'[39;49;00m[33m"[39;49;00m > ) >[1m[31mE AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'[0m >[1m[31mE assert 0 == 1[0m >[1m[31mE +0[0m >[1m[31mE -1[0m > >expected_value = 1 >field_name = 'failed' >field_value = 0 >outcomes = {'errors': 1} > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:41: AssertionError >---------------------------------------------------------- Captured stdout call ----------------------------------------------------------- >=========================================================== test session starts =========================================================== >platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 >rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker5 >plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0 >collected 1 item > >test_reruns_with_condition_marker.py E [100%] > >================================================================= ERRORS ================================================================== >_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________ > >self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 
0x20003435c70>, item = <Function test_fail_two> > > def pytest_runtest_setup(self, item): > """ > Pytest hook to modify the test before it's run. > > :param item: > The test item. > """ > if not self._has_flaky_attributes(item): > if hasattr(item, 'iter_markers'): > for marker in item.iter_markers(name='flaky'): >> self._make_test_flaky(item, *marker.args, **marker.kwargs) >E TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' > >/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError >===Flaky Test Report=== > >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>] >test_pass passed 1 out of the required 1 times. Success! >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] >test_pass failed; it passed 0 out of the required 1 times. 
> <class 'Exception'> > Failure: 2 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] > >===End Flaky Test Report=== >========================================================= short test summary info ========================================================= >ERROR test_reruns_with_condition_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' >============================================================ 1 error in 0.61s ============================================================= >pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'. >[31m[1m_____________________________________________ test_reruns_with_condition_marker[condition6-2] _____________________________________________[0m > >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker6')> >condition = ['list'], expected_reruns = 2 > > [37m@pytest[39;49;00m.mark.parametrize( > [33m"[39;49;00m[33mcondition, expected_reruns[39;49;00m[33m"[39;49;00m, > [ > ([94m1[39;49;00m == [94m1[39;49;00m, [94m2[39;49;00m), > ([94m1[39;49;00m == [94m2[39;49;00m, [94m0[39;49;00m), > ([94mTrue[39;49;00m, [94m2[39;49;00m), > ([94mFalse[39;49;00m, [94m0[39;49;00m), > ([94m1[39;49;00m, [94m2[39;49;00m), > ([94m0[39;49;00m, [94m0[39;49;00m), > ([[33m"[39;49;00m[33mlist[39;49;00m[33m"[39;49;00m], [94m2[39;49;00m), > ([], [94m0[39;49;00m), > ({[33m"[39;49;00m[33mdict[39;49;00m[33m"[39;49;00m: [94m1[39;49;00m}, [94m2[39;49;00m), > ({}, [94m0[39;49;00m), > ([94mNone[39;49;00m, [94m0[39;49;00m), > ], > ) > [94mdef[39;49;00m 
test_reruns_with_condition_marker(testdir, condition, expected_reruns):
        testdir.makepyfile(
            f"""
            import pytest

            @pytest.mark.flaky(reruns=2, condition={condition})
            def test_fail_two():
                assert False"""
        )

        result = testdir.runpytest()
>       assert_outcomes(result, passed=0, failed=1, rerun=expected_reruns)

condition       = ['list']
expected_reruns = 2
result          = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=5.95s>
testdir         = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker6')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:582:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:60: in assert_outcomes
    check_outcome_field(outcomes, "failed", failed)
        error    = 0
        failed   = 1
        outcomes = {'errors': 1}
        passed   = 0
        rerun    = 2
        result   = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=5.95s>
        skipped  = 0
        xfailed  = 0
        xpassed  = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name     = 'failed'
field_value    = 0
outcomes       = {'errors': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------- Captured stdout call -----------------------------
============================ test session starts ==============================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker6
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_reruns_with_condition_marker.py E                                   [100%]

=================================== ERRORS ====================================
_______________________ ERROR at setup of test_fail_two _______________________

self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two>

    def pytest_runtest_setup(self, item):
        """
        Pytest hook to modify the test before it's run.

        :param item:
            The test item.
        """
        if not self._has_flaky_attributes(item):
            if hasattr(item, 'iter_markers'):
                for marker in item.iter_markers(name='flaky'):
>                   self._make_test_flaky(item, *marker.args, **marker.kwargs)
E                   TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'

/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
    <class 'Exception'>
    Failure: 1
    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
    <class 'Exception'>
    Failure: 1
    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
    <class 'Exception'>
    Failure: 2
    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
=========================== short test summary info ===========================
ERROR test_reruns_with_condition_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
============================== 1 error in 0.59s ===============================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
______________ test_reruns_with_condition_marker[condition7-0] ________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker7')>
condition = [], expected_reruns = 0

    @pytest.mark.parametrize(
        "condition, expected_reruns",
        [
            (1 == 1, 2),
            (1 == 2, 0),
            (True, 2),
            (False, 0),
            (1, 2),
            (0, 0),
            (["list"], 2),
            ([], 0),
            ({"dict": 1}, 2),
            ({}, 0),
            (None, 0),
        ],
    )
    def test_reruns_with_condition_marker(testdir,
condition, expected_reruns):
        testdir.makepyfile(
            f"""
            import pytest

            @pytest.mark.flaky(reruns=2, condition={condition})
            def test_fail_two():
                assert False"""
        )

        result = testdir.runpytest()
>       assert_outcomes(result, passed=0, failed=1, rerun=expected_reruns)

condition       = []
expected_reruns = 0
result          = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.52s>
testdir         = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker7')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:582:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:60: in assert_outcomes
    check_outcome_field(outcomes, "failed", failed)
        error    = 0
        failed   = 1
        outcomes = {'errors': 1}
        passed   = 0
        rerun    = 0
        result   = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.52s>
        skipped  = 0
        xfailed  = 0
        xpassed  = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name     = 'failed'
field_value    = 0
outcomes       = {'errors': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------- Captured stdout call -----------------------------
============================ test session starts ==============================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker7
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_reruns_with_condition_marker.py E                                   [100%]

=================================== ERRORS ====================================
_______________________ ERROR at setup of test_fail_two _______________________

self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two>

    def pytest_runtest_setup(self, item):
        """
        Pytest hook to modify the test before it's run.

        :param item:
            The test item.
        """
        if not self._has_flaky_attributes(item):
            if hasattr(item, 'iter_markers'):
                for marker in item.iter_markers(name='flaky'):
>                   self._make_test_flaky(item, *marker.args, **marker.kwargs)
E                   TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'

/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
    <class 'Exception'>
    Failure: 1
    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
    <class 'Exception'>
    Failure: 1
    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
    <class 'Exception'>
    Failure: 2
    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
=========================== short test summary info ===========================
ERROR test_reruns_with_condition_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
============================== 1 error in 0.60s ===============================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
______________ test_reruns_with_condition_marker[condition8-2] ________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker8')>
condition = {'dict': 1}, expected_reruns = 2

    @pytest.mark.parametrize(
        "condition, expected_reruns",
        [
            (1 == 1, 2),
            (1 == 2, 0),
            (True, 2),
            (False, 0),
            (1, 2),
            (0, 0),
            (["list"], 2),
            ([], 0),
            ({"dict": 1}, 2),
            ({}, 0),
            (None, 0),
        ],
    )
    def
test_reruns_with_condition_marker(testdir, condition, expected_reruns):
        testdir.makepyfile(
            f"""
            import pytest

            @pytest.mark.flaky(reruns=2, condition={condition})
            def test_fail_two():
                assert False"""
        )

        result = testdir.runpytest()
>       assert_outcomes(result, passed=0, failed=1, rerun=expected_reruns)

condition       = {'dict': 1}
expected_reruns = 2
result          = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.52s>
testdir         = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker8')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:582:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:60: in assert_outcomes
    check_outcome_field(outcomes, "failed", failed)
        error    = 0
        failed   = 1
        outcomes = {'errors': 1}
        passed   = 0
        rerun    = 2
        result   = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.52s>
        skipped  = 0
        xfailed  = 0
        xpassed  = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name     = 'failed'
field_value    = 0
outcomes       = {'errors': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------- Captured stdout call -----------------------------
============================ test session starts ==============================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker8
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_reruns_with_condition_marker.py E                                   [100%]

=================================== ERRORS ====================================
_______________________ ERROR at setup of test_fail_two _______________________

self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two>

    def pytest_runtest_setup(self, item):
        """
        Pytest hook to modify the test before it's run.

        :param item:
            The test item.
        """
        if not self._has_flaky_attributes(item):
            if hasattr(item, 'iter_markers'):
                for marker in item.iter_markers(name='flaky'):
>                   self._make_test_flaky(item, *marker.args, **marker.kwargs)
E                   TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'

/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
    <class 'Exception'>
    Failure: 1
    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
    <class 'Exception'>
    Failure: 1
    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
    <class 'Exception'>
    Failure: 2
    [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
=========================== short test summary info ===========================
ERROR test_reruns_with_condition_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
============================== 1 error in 0.59s ===============================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
______________ test_reruns_with_condition_marker[condition9-0] ________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker9')>
condition = {}, expected_reruns = 0

    @pytest.mark.parametrize(
        "condition, expected_reruns",
        [
            (1 == 1, 2),
            (1 == 2, 0),
            (True, 2),
            (False, 0),
            (1, 2),
            (0, 0),
            (["list"], 2),
            ([], 0),
            ({"dict": 1}, 2),
            ({}, 0),
            (None, 0),
        ],
    )
    def test_reruns_with_condition_marker(testdir,
condition, expected_reruns): > testdir.makepyfile( > [33mf[39;49;00m[33m"""[39;49;00m[33m[39;49;00m > [33m import pytest[39;49;00m[33m[39;49;00m > [33m[39;49;00m > [33m @pytest.mark.flaky(reruns=2, condition=[39;49;00m[33m{[39;49;00mcondition[33m}[39;49;00m[33m)[39;49;00m[33m[39;49;00m > [33m def test_fail_two():[39;49;00m[33m[39;49;00m > [33m assert False[39;49;00m[33m"""[39;49;00m > ) > > result = testdir.runpytest() >> assert_outcomes(result, passed=[94m0[39;49;00m, failed=[94m1[39;49;00m, rerun=expected_reruns) > >condition = {} >expected_reruns = 0 >result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.53s> >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker9')> > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:582: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:60: in assert_outcomes > check_outcome_field(outcomes, [33m"[39;49;00m[33mfailed[39;49;00m[33m"[39;49;00m, failed) > error = 0 > failed = 1 > outcomes = {'errors': 1} > passed = 0 > rerun = 0 > result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.53s> > skipped = 0 > xfailed = 0 > xpassed = 0 >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1 > > [94mdef[39;49;00m [92mcheck_outcome_field[39;49;00m(outcomes, field_name, expected_value): > field_value = outcomes.get(field_name, [94m0[39;49;00m) >> [94massert[39;49;00m field_value == expected_value, ( > 
[33mf[39;49;00m[33m"[39;49;00m[33moutcomes.[39;49;00m[33m{[39;49;00mfield_name[33m}[39;49;00m[33m has unexpected value. [39;49;00m[33m"[39;49;00m > [33mf[39;49;00m[33m"[39;49;00m[33mExpected [39;49;00m[33m'[39;49;00m[33m{[39;49;00mexpected_value[33m}[39;49;00m[33m'[39;49;00m[33m but got [39;49;00m[33m'[39;49;00m[33m{[39;49;00mfield_value[33m}[39;49;00m[33m'[39;49;00m[33m"[39;49;00m > ) >[1m[31mE AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'[0m >[1m[31mE assert 0 == 1[0m >[1m[31mE +0[0m >[1m[31mE -1[0m > >expected_value = 1 >field_name = 'failed' >field_value = 0 >outcomes = {'errors': 1} > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:41: AssertionError >---------------------------------------------------------- Captured stdout call ----------------------------------------------------------- >=========================================================== test session starts =========================================================== >platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 >rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker9 >plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0 >collected 1 item > >test_reruns_with_condition_marker.py E [100%] > >================================================================= ERRORS ================================================================== >_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________ > >self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 
0x20003435c70>, item = <Function test_fail_two> > > def pytest_runtest_setup(self, item): > """ > Pytest hook to modify the test before it's run. > > :param item: > The test item. > """ > if not self._has_flaky_attributes(item): > if hasattr(item, 'iter_markers'): > for marker in item.iter_markers(name='flaky'): >> self._make_test_flaky(item, *marker.args, **marker.kwargs) >E TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' > >/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError >===Flaky Test Report=== > >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>] >test_pass passed 1 out of the required 1 times. Success! >test_pass failed (1 runs remaining out of 2). > <class 'Exception'> > Failure: 1 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] >test_pass failed; it passed 0 out of the required 1 times. 
 <class 'Exception'>
 Failure: 2
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
ERROR test_reruns_with_condition_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
============================================================ 1 error in 0.58s =============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
________________________________________________ test_reruns_with_condition_marker[None-0] ________________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker10')>
condition = None, expected_reruns = 0

    @pytest.mark.parametrize(
        "condition, expected_reruns",
        [
            (1 == 1, 2),
            (1 == 2, 0),
            (True, 2),
            (False, 0),
            (1, 2),
            (0, 0),
            (["list"], 2),
            ([], 0),
            ({"dict": 1}, 2),
            ({}, 0),
            (None, 0),
        ],
    )
    def test_reruns_with_condition_marker(testdir, condition, expected_reruns):
        testdir.makepyfile(
            f"""
            import pytest

            @pytest.mark.flaky(reruns=2, condition={condition})
            def test_fail_two():
                assert False"""
        )

        result = testdir.runpytest()
>       assert_outcomes(result, passed=0, failed=1, rerun=expected_reruns)

condition = None
expected_reruns = 0
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.53s>
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker10')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:582:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:60: in assert_outcomes
    check_outcome_field(outcomes, "failed", failed)
        error = 0
        failed = 1
        outcomes = {'errors': 1}
        passed = 0
        rerun = 0
        result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.53s>
        skipped = 0
        xfailed = 0
        xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'failed'
field_value = 0
outcomes = {'errors': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_condition_marker10
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_reruns_with_condition_marker.py E [100%]

================================================================= ERRORS ==================================================================
_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________

self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two>

    def pytest_runtest_setup(self, item):
        """
        Pytest hook to modify the test before it's run.

        :param item:
            The test item.
        """
        if not self._has_flaky_attributes(item):
            if hasattr(item, 'iter_markers'):
                for marker in item.iter_markers(name='flaky'):
>                   self._make_test_flaky(item, *marker.args, **marker.kwargs)
E       TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'

/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
 <class 'Exception'>
 Failure: 2
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
ERROR test_reruns_with_condition_marker.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
============================================================ 1 error in 0.61s =============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
___________________________ test_reruns_with_string_condition[sys.platform.startswith("non-exists") == False-2] ___________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_string_condition0')>
condition = 'sys.platform.startswith("non-exists") == False', expected_reruns = 2

    @pytest.mark.parametrize(
        "condition, expected_reruns",
        [('sys.platform.startswith("non-exists") == False', 2), ("os.getpid() != -1", 2)],
    )
    # before evaluating the condition expression, sys&os&platform package has been imported
    def test_reruns_with_string_condition(testdir, condition, expected_reruns):
        testdir.makepyfile(
            f"""
            import pytest

            @pytest.mark.flaky(reruns=2, condition='{condition}')
            def test_fail_two():
                assert False"""
        )
        result = testdir.runpytest()
>       assert_outcomes(result, passed=0, failed=1, rerun=2)

condition = 'sys.platform.startswith("non-exists") == False'
expected_reruns = 2
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.56s>
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_string_condition0')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:600:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:60: in assert_outcomes
    check_outcome_field(outcomes, "failed", failed)
        error = 0
        failed = 1
        outcomes = {'errors': 1}
        passed = 0
        rerun = 2
        result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.56s>
        skipped = 0
        xfailed = 0
        xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'failed'
field_value = 0
outcomes = {'errors': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_string_condition0
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_reruns_with_string_condition.py E [100%]

================================================================= ERRORS ==================================================================
_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________

self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two>

    def pytest_runtest_setup(self, item):
        """
        Pytest hook to modify the test before it's run.

        :param item:
            The test item.
        """
        if not self._has_flaky_attributes(item):
            if hasattr(item, 'iter_markers'):
                for marker in item.iter_markers(name='flaky'):
>                   self._make_test_flaky(item, *marker.args, **marker.kwargs)
E       TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'

/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
 <class 'Exception'>
 Failure: 2
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]

===End Flaky Test Report===
========================================================= short test summary info =========================================================
ERROR test_reruns_with_string_condition.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
============================================================ 1 error in 0.58s =============================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
_________________________________________ test_reruns_with_string_condition[os.getpid() != -1-2] __________________________________________

testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_string_condition1')>
condition = 'os.getpid() != -1', expected_reruns = 2

    @pytest.mark.parametrize(
        "condition, expected_reruns",
        [('sys.platform.startswith("non-exists") == False', 2), ("os.getpid() != -1", 2)],
    )
    # before evaluating the condition expression, sys&os&platform package has been imported
    def test_reruns_with_string_condition(testdir, condition, expected_reruns):
        testdir.makepyfile(
            f"""
            import pytest

            @pytest.mark.flaky(reruns=2, condition='{condition}')
            def test_fail_two():
                assert False"""
        )
        result = testdir.runpytest()
>       assert_outcomes(result, passed=0, failed=1, rerun=2)

condition = 'os.getpid() != -1'
expected_reruns = 2
result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.54s>
testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_string_condition1')>

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:600:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:60: in assert_outcomes
    check_outcome_field(outcomes, "failed", failed)
        error = 0
        failed = 1
        outcomes = {'errors': 1}
        passed = 0
        rerun = 2
        result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.54s>
        skipped = 0
        xfailed = 0
        xpassed = 0
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1

    def check_outcome_field(outcomes, field_name, expected_value):
        field_value = outcomes.get(field_name, 0)
>       assert field_value == expected_value, (
            f"outcomes.{field_name} has unexpected value. "
            f"Expected '{expected_value}' but got '{field_value}'"
        )
E       AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'
E       assert 0 == 1
E         +0
E         -1

expected_value = 1
field_name = 'failed'
field_value = 0
outcomes = {'errors': 1}

/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py:41: AssertionError
---------------------------------------------------------- Captured stdout call -----------------------------------------------------------
=========================================================== test session starts ===========================================================
platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_string_condition1
plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0
collected 1 item

test_reruns_with_string_condition.py E [100%]

================================================================= ERRORS ==================================================================
_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________

self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two>

    def pytest_runtest_setup(self, item):
        """
        Pytest hook to modify the test before it's run.

        :param item:
            The test item.
        """
        if not self._has_flaky_attributes(item):
            if hasattr(item, 'iter_markers'):
                for marker in item.iter_markers(name='flaky'):
>                   self._make_test_flaky(item, *marker.args, **marker.kwargs)
E       TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'

/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
===Flaky Test Report===

test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
test_pass passed 1 out of the required 1 times. Success!
test_pass failed (1 runs remaining out of 2).
 <class 'Exception'>
 Failure: 1
 [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
test_pass failed; it passed 0 out of the required 1 times.
> <class 'Exception'> > Failure: 2 > [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>] > >===End Flaky Test Report=== >========================================================= short test summary info ========================================================= >ERROR test_reruns_with_string_condition.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns' >============================================================ 1 error in 0.59s ============================================================= >pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'. >[31m[1m____________________________________________ test_reruns_with_string_condition_with_global_var ____________________________________________[0m > >testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_string_condition_with_global_var0')> > > [94mdef[39;49;00m [92mtest_reruns_with_string_condition_with_global_var[39;49;00m(testdir): > testdir.makepyfile( > [33m"""[39;49;00m > [33m import pytest[39;49;00m > [33m[39;49;00m > [33m rerunBool = False[39;49;00m > [33m @pytest.mark.flaky(reruns=2, condition='rerunBool')[39;49;00m > [33m def test_fail_two():[39;49;00m > [33m global rerunBool[39;49;00m > [33m rerunBool = True[39;49;00m > [33m assert False"""[39;49;00m > ) > result = testdir.runpytest() >> assert_outcomes(result, passed=[94m0[39;49;00m, failed=[94m1[39;49;00m, rerun=[94m2[39;49;00m) > >result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.57s> >testdir = <Testdir 
local('/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_string_condition_with_global_var0')> > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:616: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:60: in assert_outcomes > check_outcome_field(outcomes, [33m"[39;49;00m[33mfailed[39;49;00m[33m"[39;49;00m, failed) > error = 0 > failed = 1 > outcomes = {'errors': 1} > passed = 0 > rerun = 2 > result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=48 len(stderr.lines)=0 duration=3.57s> > skipped = 0 > xfailed = 0 > xpassed = 0 >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >outcomes = {'errors': 1}, field_name = 'failed', expected_value = 1 > > [94mdef[39;49;00m [92mcheck_outcome_field[39;49;00m(outcomes, field_name, expected_value): > field_value = outcomes.get(field_name, [94m0[39;49;00m) >> [94massert[39;49;00m field_value == expected_value, ( > [33mf[39;49;00m[33m"[39;49;00m[33moutcomes.[39;49;00m[33m{[39;49;00mfield_name[33m}[39;49;00m[33m has unexpected value. [39;49;00m[33m"[39;49;00m > [33mf[39;49;00m[33m"[39;49;00m[33mExpected [39;49;00m[33m'[39;49;00m[33m{[39;49;00mexpected_value[33m}[39;49;00m[33m'[39;49;00m[33m but got [39;49;00m[33m'[39;49;00m[33m{[39;49;00mfield_value[33m}[39;49;00m[33m'[39;49;00m[33m"[39;49;00m > ) >[1m[31mE AssertionError: outcomes.failed has unexpected value. 
Expected '1' but got '0'[0m >[1m[31mE assert 0 == 1[0m >[1m[31mE +0[0m >[1m[31mE -1[0m > >expected_value = 1 >field_name = 'failed' >field_value = 0 >outcomes = {'errors': 1} > >[1m[31m/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2/test_pytest_rerunfailures.py[0m:41: AssertionError >---------------------------------------------------------- Captured stdout call ----------------------------------------------------------- >=========================================================== test session starts =========================================================== >platform linux -- Python 3.9.9, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 >rootdir: /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_with_string_condition_with_global_var0 >plugins: rerunfailures-10.2, datadir-1.3.1, forked-1.3.0, mock-3.6.1, xdist-2.3.0, xprocess-0.18.1, tornado-0.8.1, timeout-1.4.2, hypothesis-6.14.3, flaky-3.7.0, httpbin-1.0.0, lazy-fixture-0.6.3, asyncio-0.16.0, freezegun-0.4.2, trio-0.7.0, regressions-2.2.0, virtualenv-1.7.0, pkgcore-0.12.8, pylama-8.3.6, shutil-1.7.0 >collected 1 item > >test_reruns_with_string_condition_with_global_var.py E [100%] > >================================================================= ERRORS ================================================================== >_____________________________________________________ ERROR at setup of test_fail_two _____________________________________________________ > >self = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x20003435c70>, item = <Function test_fail_two> > > def pytest_runtest_setup(self, item): > """ > Pytest hook to modify the test before it's run. > > :param item: > The test item. 
> """
> if not self._has_flaky_attributes(item):
> if hasattr(item, 'iter_markers'):
> for marker in item.iter_markers(name='flaky'):
>> self._make_test_flaky(item, *marker.args, **marker.kwargs)
>E TypeError: _make_test_flaky() got an unexpected keyword argument 'reruns'
>
>/usr/lib/python3.9/site-packages/flaky/flaky_pytest_plugin.py:244: TypeError
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
> <class 'Exception'>
> Failure: 2
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>========================================================= short test summary info =========================================================
>ERROR test_reruns_with_string_condition_with_global_var.py::test_fail_two - TypeError: _make_test_flaky() got an unexpected keyword argu...
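[Editorial note on the TypeError above: the co-installed flaky-3.7.0 plugin also registers a handler for the `flaky` marker. pytest-rerunfailures' own tests apply `@pytest.mark.flaky(reruns=...)`, and flaky's hook forwards every marker keyword to its `_make_test_flaky()` handler, which has no `reruns` parameter. A minimal sketch of that mechanism, using a simplified stand-in function (the name and signature below are illustrative, not flaky's actual code):]

```python
# Hypothetical stand-in for flaky's marker handler; like the real
# _make_test_flaky(), it does not accept a 'reruns' keyword.
def make_test_flaky(item, max_runs=None, min_passes=None):
    return {"item": item, "max_runs": max_runs, "min_passes": min_passes}

# pytest-rerunfailures' tests set @pytest.mark.flaky(reruns=2), so the
# marker kwargs that flaky blindly forwards contain 'reruns':
marker_kwargs = {"reruns": 2}

try:
    make_test_flaky("test_fail_two", **marker_kwargs)
except TypeError as exc:
    # e.g. "make_test_flaky() got an unexpected keyword argument 'reruns'"
    print(exc)
```

[Since `-p no:NAME` is pytest's standard way to block a plugin by name, running the suite with `-p no:flaky` would be one plausible way to keep the two plugins from fighting over the same marker.]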
>============================================================ 1 error in 0.60s =============================================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
>===Flaky Test Report===
>
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_without_options0/test_reruns_if_flaky_mark_is_called_without_options.py:10>]
>test_pass passed 1 out of the required 1 times. Success!
>test_pass failed (1 runs remaining out of 2).
> <class 'Exception'>
> Failure: 1
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>test_pass failed; it passed 0 out of the required 1 times.
> <class 'Exception'>
> Failure: 2
> [<TracebackEntry /var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/pytest-of-portage/pytest-0/test_reruns_if_flaky_mark_is_called_with_positional_argument0/test_reruns_if_flaky_mark_is_called_with_positional_argument.py:10>]
>
>===End Flaky Test Report===
>========================================================= short test summary info =========================================================
>SKIPPED [1] test_pytest_rerunfailures.py:300: --result-log removed in pytest>=6.1
>FAILED test_pytest_rerunfailures.py::test_error_when_run_with_pdb - Failed: line 'ERROR: --reruns incompatible with --pdb' not found in ...
>FAILED test_pytest_rerunfailures.py::test_rerun_fails_after_consistent_setup_failure - AssertionError: outcomes.rerun has unexpected val...
>FAILED test_pytest_rerunfailures.py::test_rerun_passes_after_temporary_setup_failure - AssertionError: outcomes.passed has unexpected va...
>FAILED test_pytest_rerunfailures.py::test_rerun_fails_after_consistent_test_failure - AssertionError: outcomes.rerun has unexpected valu...
>FAILED test_pytest_rerunfailures.py::test_rerun_passes_after_temporary_test_failure - AssertionError: outcomes.passed has unexpected val...
>FAILED test_pytest_rerunfailures.py::test_rerun_passes_after_temporary_test_crash - AssertionError: outcomes.passed has unexpected value...
>FAILED test_pytest_rerunfailures.py::test_rerun_passes_after_temporary_test_failure_with_flaky_mark - AssertionError: outcomes.passed ha...
>FAILED test_pytest_rerunfailures.py::test_reruns_if_flaky_mark_is_called_without_options - AssertionError: outcomes.rerun has unexpected...
>FAILED test_pytest_rerunfailures.py::test_reruns_if_flaky_mark_is_called_with_positional_argument - AssertionError: outcomes.passed has ...
>FAILED test_pytest_rerunfailures.py::test_no_extra_test_summary_for_reruns_by_default - assert '1 rerun' in "===========================...
>FAILED test_pytest_rerunfailures.py::test_extra_test_summary_for_reruns - Failed: line 'RERUN test_*:*' not found in output
>FAILED test_pytest_rerunfailures.py::test_verbose - Failed: line 'test_*:* RERUN*' not found in output
>FAILED test_pytest_rerunfailures.py::test_rerun_on_class_setup_error_with_reruns - AssertionError: outcomes.rerun has unexpected value. ...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_delay[-1] - Failed: nomatch: '*UserWarning: Delay time between re-runs cannot be <...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_delay[0] - AssertionError: expected call not found.
>FAILED test_pytest_rerunfailures.py::test_reruns_with_delay[0.0] - AssertionError: expected call not found.
>FAILED test_pytest_rerunfailures.py::test_reruns_with_delay[1] - AssertionError: expected call not found.
>FAILED test_pytest_rerunfailures.py::test_reruns_with_delay[2.5] - AssertionError: expected call not found.
>FAILED test_pytest_rerunfailures.py::test_reruns_with_delay_marker[-1] - Failed: nomatch: '*UserWarning: Delay time between re-runs cann...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_delay_marker[0] - AssertionError: expected call not found.
>FAILED test_pytest_rerunfailures.py::test_reruns_with_delay_marker[0.0] - AssertionError: expected call not found.
>FAILED test_pytest_rerunfailures.py::test_reruns_with_delay_marker[1] - AssertionError: expected call not found.
>FAILED test_pytest_rerunfailures.py::test_reruns_with_delay_marker[2.5] - AssertionError: expected call not found.
>FAILED test_pytest_rerunfailures.py::test_rerun_on_setup_class_with_error_with_reruns - AssertionError: outcomes.passed has unexpected v...
>FAILED test_pytest_rerunfailures.py::test_rerun_on_class_scope_fixture_with_error_with_reruns - AssertionError: outcomes.passed has unex...
>FAILED test_pytest_rerunfailures.py::test_rerun_on_module_fixture_with_reruns - AssertionError: outcomes.passed has unexpected value. Ex...
>FAILED test_pytest_rerunfailures.py::test_rerun_on_session_fixture_with_reruns - AssertionError: outcomes.passed has unexpected value. E...
>FAILED test_pytest_rerunfailures.py::test_execution_count_exposed - AssertionError: outcomes.passed has unexpected value. Expected '3' b...
>FAILED test_pytest_rerunfailures.py::test_rerun_report - AssertionError: outcomes.failed has unexpected value. Expected '1' but got '0'
>FAILED test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts0-True] - AssertionError: outcomes.rerun has unexpected value....
>FAILED test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts1-True] - AssertionError: outcomes.rerun has unexpected value....
>FAILED test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts2-True] - AssertionError: outcomes.rerun has unexpected value....
>FAILED test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts4-True] - AssertionError: outcomes.rerun has unexpected value....
>FAILED test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts5-True] - AssertionError: outcomes.rerun has unexpected value....
>FAILED test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts6-True] - AssertionError: outcomes.rerun has unexpected value....
>FAILED test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts7-True] - AssertionError: outcomes.rerun has unexpected value....
>FAILED test_pytest_rerunfailures.py::test_only_rerun_flag[only_rerun_texts10-True] - AssertionError: outcomes.rerun has unexpected value...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_condition_marker[True-20] - AssertionError: outcomes.failed has unexpected value. ...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_condition_marker[False-00] - AssertionError: outcomes.failed has unexpected value....
>FAILED test_pytest_rerunfailures.py::test_reruns_with_condition_marker[True-21] - AssertionError: outcomes.failed has unexpected value. ...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_condition_marker[False-01] - AssertionError: outcomes.failed has unexpected value....
>FAILED test_pytest_rerunfailures.py::test_reruns_with_condition_marker[1-2] - AssertionError: outcomes.failed has unexpected value. Expe...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_condition_marker[0-0] - AssertionError: outcomes.failed has unexpected value. Expe...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_condition_marker[condition6-2] - AssertionError: outcomes.failed has unexpected va...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_condition_marker[condition7-0] - AssertionError: outcomes.failed has unexpected va...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_condition_marker[condition8-2] - AssertionError: outcomes.failed has unexpected va...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_condition_marker[condition9-0] - AssertionError: outcomes.failed has unexpected va...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_condition_marker[None-0] - AssertionError: outcomes.failed has unexpected value. E...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_string_condition[sys.platform.startswith("non-exists") == False-2] - AssertionErro...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_string_condition[os.getpid() != -1-2] - AssertionError: outcomes.failed has unexpe...
>FAILED test_pytest_rerunfailures.py::test_reruns_with_string_condition_with_global_var - AssertionError: outcomes.failed has unexpected ...
>========================================== 51 failed, 11 passed, 1 skipped in 292.85s (0:04:52) ===========================================
>pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
> * ERROR: dev-python/pytest-rerunfailures-10.2::gentoo failed (test phase):
> * pytest failed with python3.9
> *
> * Call stack:
> * ebuild.sh, line 127: Called src_test
> * environment, line 2856: Called distutils-r1_src_test
> * environment, line 1199: Called _distutils-r1_run_foreach_impl 'python_test'
> * environment, line 452: Called python_foreach_impl 'distutils-r1_run_phase' 'python_test'
> * environment, line 2519: Called multibuild_foreach_variant '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test'
> * environment, line 2035: Called _multibuild_run '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test'
> * environment, line 2033: Called _python_multibuild_wrapper 'distutils-r1_run_phase' 'python_test'
> * environment, line 753: Called distutils-r1_run_phase 'python_test'
> * environment, line 1138: Called python_test
> * environment, line 2815: Called distutils-r1_python_test
> * environment, line 1095: Called epytest
> * environment, line 1551: Called die
> * The specific snippet of code:
> * "${@}" || die -n "pytest failed with ${EPYTHON}";
> *
> * If you need support, post the output of `emerge --info '=dev-python/pytest-rerunfailures-10.2::gentoo'`,
> * the complete build log and the output of `emerge -pqv '=dev-python/pytest-rerunfailures-10.2::gentoo'`.
> * The complete build log is located at '/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/build.log'.
> * The ebuild environment file is located at '/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/temp/environment'.
> * Working directory: '/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2'
> * S: '/var/tmp/portage/dev-python/pytest-rerunfailures-10.2/work/pytest-rerunfailures-10.2'