Gentoo's Bugzilla – Attachment 760148: Details for Bug 829862 – dev-python/pyudev-0.22.0-r1: test failures
build.log

Description: build.log
Filename:    build.log
MIME Type:   text/plain
Creator:     Rolf Eike Beer
Created:     2021-12-23 10:59:36 UTC
Size:        107.65 KB
 * Package:    dev-python/pyudev-0.22.0-r1
 * Repository: gentoo
 * Maintainer: python@gentoo.org
 * USE:        elibc_glibc kernel_linux python_targets_python3_9 sparc test userland_GNU
 * FEATURES:   network-sandbox preserve-libs sandbox test userpriv usersandbox
>>> Unpacking source...
>>> Unpacking pyudev-0.22.0.tar.gz to /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work
>>> Source unpacked in /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work
>>> Preparing source in /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0 ...
 * If your PORTAGE_TMPDIR is longer in length than '/var/tmp/',
 * change it to /var/tmp to ensure tests will pass.
 * Applying pyudev-0.22-fix-hypothesis.patch ...                        [ ok ]
>>> Source prepared.
>>> Configuring source in /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0 ...
>>> Source configured.
>>> Compiling source in /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0 ...
 * python3_9: running distutils-r1_run_phase distutils-r1_python_compile
python3.9 setup.py build -j 20
running build
running build_py
creating /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/glib.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/core.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/__init__.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/_compat.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/pyside.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/wx.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/discover.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/pyqt5.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/_qt_base.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/pyqt4.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/_errors.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/monitor.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/version.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
copying src/pyudev/_util.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev
creating /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/device
copying src/pyudev/device/__init__.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/device
copying src/pyudev/device/_device.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/device
creating /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/_os
copying src/pyudev/_os/__init__.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/_os
copying src/pyudev/_os/pipe.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/_os
copying src/pyudev/_os/poll.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/_os
creating /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/_ctypeslib
copying src/pyudev/_ctypeslib/utils.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/_ctypeslib
copying src/pyudev/_ctypeslib/__init__.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/_ctypeslib
copying src/pyudev/_ctypeslib/libc.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/_ctypeslib
copying src/pyudev/_ctypeslib/_errorcheckers.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/_ctypeslib
copying src/pyudev/_ctypeslib/libudev.py -> /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/_ctypeslib
warning: build_py: byte-compiling is disabled, skipping.

>>> Source compiled.
>>> Test phase: dev-python/pyudev-0.22.0-r1
 * python3_9: running distutils-r1_run_phase python_test
python3.9 -m pytest -vv -ra -l -Wdefault
============================== test session starts ==============================
platform linux -- Python 3.9.9, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3.9
cachedir: .pytest_cache
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/.hypothesis/examples')
rootdir: /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0, configfile: pytest.ini
plugins: virtualenv-1.7.0, forked-1.3.0, cov-3.0.0, mock-3.6.1, xdist-2.4.0, shutil-1.7.0, pkgcore-0.12.8, hypothesis-6.24.2
collecting ...
 * Unable to trace static ELF: /sbin/ldconfig: /sbin/ldconfig -p
   [the identical ldconfig warning is emitted many more times during collection
   and between individual test results; the repeats are omitted below]
collected 169 items

tests/test_core.py::test_udev_version PASSED [ 0%]
tests/test_core.py::TestContext::test_sys_path PASSED [ 1%]
tests/test_core.py::TestContext::test_device_path PASSED [ 1%]
tests/test_core.py::TestContext::test_run_path PASSED [ 2%]
tests/test_core.py::TestContext::test_log_priority_get PASSED [ 2%]
tests/test_core.py::TestContext::test_log_priority_get_mock PASSED [ 3%]
tests/test_core.py::TestContext::test_log_priority_set_mock PASSED [ 4%]
tests/test_core.py::TestContext::test_log_priority_roundtrip PASSED [ 4%]
tests/test_device.py::TestAttributes::test_getitem <- tests/_device_tests/_attributes_tests.py PASSED [ 5%]
tests/test_device.py::TestAttributes::test_getitem_nonexisting <- tests/_device_tests/_attributes_tests.py PASSED [ 5%]
tests/test_device.py::TestAttributes::test_non_iterable <- tests/_device_tests/_attributes_tests.py PASSED [ 6%]
tests/test_device.py::TestAttributes::test_asstring <- tests/_device_tests/_attributes_tests.py PASSED [ 7%]
tests/test_device.py::TestAttributes::test_asint <- tests/_device_tests/_attributes_tests.py PASSED [ 7%]
tests/test_device.py::TestAttributes::test_asbool <- tests/_device_tests/_attributes_tests.py PASSED [ 8%]
tests/test_device.py::TestAttributes::test_available_attributes <- tests/_device_tests/_attributes_tests.py FAILED [ 8%]
tests/test_device.py::TestDevice::test_parent <- tests/_device_tests/_device_tests.py PASSED [ 9%]
tests/test_device.py::TestDevice::test_child_of_parent <- tests/_device_tests/_device_tests.py FAILED [ 10%]
tests/test_device.py::TestDevice::test_children <- tests/_device_tests/_device_tests.py FAILED [ 10%]
tests/test_device.py::TestDevice::test_ancestors <- tests/_device_tests/_device_tests.py PASSED [ 11%]
tests/test_device.py::TestDevice::test_find_parent <- tests/_device_tests/_device_tests.py PASSED [ 11%]
tests/test_device.py::TestDevice::test_find_parent_no_devtype_mock <- tests/_device_tests/_device_tests.py PASSED [ 12%]
tests/test_device.py::TestDevice::test_find_parent_with_devtype_mock <- tests/_device_tests/_device_tests.py PASSED [ 13%]
tests/test_device.py::TestDevice::test_traverse <- tests/_device_tests/_device_tests.py PASSED [ 13%]
tests/test_device.py::TestDevice::test_sys_path <- tests/_device_tests/_device_tests.py PASSED [ 14%]
tests/test_device.py::TestDevice::test_device_path <- tests/_device_tests/_device_tests.py PASSED [ 14%]
tests/test_device.py::TestDevice::test_subsystem <- tests/_device_tests/_device_tests.py PASSED [ 15%]
tests/test_device.py::TestDevice::test_device_sys_name <- tests/_device_tests/_device_tests.py PASSED [ 15%]
tests/test_device.py::TestDevice::test_sys_number <- tests/_device_tests/_device_tests.py PASSED [ 16%]
tests/test_device.py::TestDevice::test_type <- tests/_device_tests/_device_tests.py PASSED [ 17%]
tests/test_device.py::TestDevice::test_driver <- tests/_device_tests/_device_tests.py PASSED [ 17%]
tests/test_device.py::TestDevice::test_device_node <- tests/_device_tests/_device_tests.py PASSED [ 18%]
tests/test_device.py::TestDevice::test_device_number <- tests/_device_tests/_device_tests.py PASSED [ 18%]
tests/test_device.py::TestDevice::test_is_initialized <- tests/_device_tests/_device_tests.py PASSED [ 19%]
tests/test_device.py::TestDevice::test_is_initialized_mock <- tests/_device_tests/_device_tests.py PASSED [ 20%]
tests/test_device.py::TestDevice::test_time_since_initialized <- tests/_device_tests/_device_tests.py PASSED [ 20%]
tests/test_device.py::TestDevice::test_time_since_initialized_mock <- tests/_device_tests/_device_tests.py PASSED [ 21%]
tests/test_device.py::TestDevice::test_links <- tests/_device_tests/_device_tests.py PASSED [ 21%]
tests/test_device.py::TestDevice::test_action <- tests/_device_tests/_device_tests.py PASSED [ 22%]
tests/test_device.py::TestDevice::test_action_mock <- tests/_device_tests/_device_tests.py PASSED [ 23%]
tests/test_device.py::TestDevice::test_sequence_number <- tests/_device_tests/_device_tests.py PASSED [ 23%]
tests/test_device.py::TestDevice::test_attributes <- tests/_device_tests/_device_tests.py PASSED [ 24%]
tests/test_device.py::TestDevice::test_no_leak <- tests/_device_tests/_device_tests.py PASSED [ 24%]
tests/test_device.py::TestDevice::test_tags <- tests/_device_tests/_device_tests.py PASSED [ 25%]
tests/test_device.py::TestDevice::test_iteration <- tests/_device_tests/_device_tests.py PASSED [ 26%]
tests/test_device.py::TestDevice::test_length <- tests/_device_tests/_device_tests.py PASSED [ 26%]
tests/test_device.py::TestDevice::test_key_subset <- tests/_device_tests/_device_tests.py PASSED [ 27%]
tests/test_device.py::TestDevice::test_getitem <- tests/_device_tests/_device_tests.py PASSED [ 27%]
tests/test_device.py::TestDevice::test_getitem_devname <- tests/_device_tests/_device_tests.py PASSED [ 28%]
tests/test_device.py::TestDevice::test_getitem_nonexisting <- tests/_device_tests/_device_tests.py PASSED [ 28%]
tests/test_device.py::TestDevice::test_asint <- tests/_device_tests/_device_tests.py PASSED [ 29%]
tests/test_device.py::TestDevice::test_asbool <- tests/_device_tests/_device_tests.py PASSED [ 30%]
tests/test_device.py::TestDevice::test_hash <- tests/_device_tests/_device_tests.py PASSED [ 30%]
tests/test_device.py::TestDevice::test_equality <- tests/_device_tests/_device_tests.py PASSED [ 31%]
tests/test_device.py::TestDevice::test_inequality <- tests/_device_tests/_device_tests.py PASSED [ 31%]
tests/test_device.py::TestDevice::test_device_ordering <- tests/_device_tests/_device_tests.py PASSED [ 32%]
tests/test_device.py::TestDevice::test_id_wwn_with_extension <- tests/_device_tests/_device_tests.py SKIPPED (unsafe to check ID_WWN_WITH_EXTENSION) [ 33%]
tests/test_device.py::TestDevices::test_from_path <- tests/_device_tests/_devices_tests.py PASSED [ 33%]
tests/test_device.py::TestDevices::test_from_path_strips_leading_slash <- tests/_device_tests/_devices_tests.py PASSED [ 34%]
tests/test_device.py::TestDevices::test_from_sys_path <- tests/_device_tests/_devices_tests.py PASSED [ 34%]
tests/test_device.py::TestDevices::test_from_sys_path_device_not_found <- tests/_device_tests/_devices_tests.py PASSED [ 35%]
tests/test_device.py::TestDevices::test_from_name <- tests/_device_tests/_devices_tests.py PASSED [ 36%]
tests/test_device.py::TestDevices::test_from_name_no_device_in_existing_subsystem <- tests/_device_tests/_devices_tests.py PASSED [ 36%]
tests/test_device.py::TestDevices::test_from_name_nonexisting_subsystem <- tests/_device_tests/_devices_tests.py PASSED [ 37%]
tests/test_device.py::TestDevices::test_from_device_number <- tests/_device_tests/_devices_tests.py PASSED [ 37%]
tests/test_device.py::TestDevices::test_from_device_number_wrong_type <- tests/_device_tests/_devices_tests.py PASSED [ 38%]
tests/test_device.py::TestDevices::test_from_device_number_invalid_type <- tests/_device_tests/_devices_tests.py PASSED [ 39%]
tests/test_device.py::TestDevices::test_from_device_file <- tests/_device_tests/_devices_tests.py PASSED [ 39%]
tests/test_device.py::TestDevices::test_from_device_file_no_device_file <- tests/_device_tests/_devices_tests.py PASSED [ 40%]
tests/test_device.py::TestDevices::test_from_device_file_non_existing <- tests/_device_tests/_devices_tests.py PASSED [ 40%]
tests/test_device.py::TestDevices::test_from_environment <- tests/_device_tests/_devices_tests.py PASSED [ 41%]
tests/test_device.py::TestTags::test_iteration_and_contains <- tests/_device_tests/_tags_tests.py SKIPPED (no device with tags) [ 42%]
tests/test_device.py::TestTags::test_iteration_mock <- tests/_device_tests/_tags_tests.py PASSED [ 42%]
tests/test_device.py::TestTags::test_contains_mock <- tests/_device_tests/_tags_tests.py PASSED [ 43%]
tests/test_device.py::test_garbage PASSED [ 43%]
tests/test_discover.py::TestDiscovery::test_device_number PASSED [ 44%]
tests/test_discover.py::TestDiscovery::test_path PASSED [ 44%]
tests/test_discover.py::TestDiscovery::test_name PASSED [ 45%]
tests/test_discover.py::TestDiscovery::test_anything FAILED [ 46%]
tests/test_enumerate.py::TestEnumerator::test_match_subsystem FAILED [ 46%]
tests/test_enumerate.py::TestEnumerator::test_match_subsystem_nomatch FAILED [ 47%]
tests/test_enumerate.py::TestEnumerator::test_match_subsystem_nomatch_unfulfillable PASSED [ 47%]
tests/test_enumerate.py::TestEnumerator::test_match_subsystem_nomatch_complete FAILED [ 48%]
tests/test_enumerate.py::TestEnumerator::test_match_sys_name FAILED [ 49%]
tests/test_enumerate.py::TestEnumerator::test_match_property_string FAILED [ 49%]
tests/test_enumerate.py::TestEnumerator::test_match_property_int FAILED [ 50%]
tests/test_enumerate.py::TestEnumerator::test_match_property_bool FAILED [ 50%]
tests/test_enumerate.py::TestEnumerator::test_match_tag SKIPPED (failed health check for test_match_tag() (/var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/tests/test_enumerate.py: 199)) [ 51%]
tests/test_enumerate.py::TestEnumerator::test_match_parent FAILED [ 52%]
tests/test_enumerate.py::TestEnumeratorMatchCombinations::test_combined_property_matches FAILED [ 52%]
tests/test_enumerate.py::TestEnumeratorMatchCombinations::test_match FAILED [ 53%]
tests/test_enumerate.py::TestEnumeratorMatchMethod::test_match_passthrough_subsystem PASSED [ 53%]
tests/test_enumerate.py::TestEnumeratorMatchMethod::test_match_passthrough_sys_name PASSED [ 54%]
tests/test_enumerate.py::TestEnumeratorMatchMethod::test_match_passthrough_tag PASSED [ 55%]
tests/test_enumerate.py::TestEnumeratorMatchMethod::test_match_passthrough_parent PASSED [ 55%]
tests/test_enumerate.py::TestEnumeratorMatchMethod::test_match_passthrough_property PASSED [ 56%]
tests/test_monitor.py::TestMonitor::test_from_netlink_invalid_source PASSED [ 56%]
tests/test_monitor.py::TestMonitor::test_from_netlink_source_udev PASSED [ 57%]
tests/test_monitor.py::TestMonitor::test_from_netlink_source_udev_mock
Exception ignored in: <function Monitor.__del__ at 0xf5566b20>
Traceback (most recent call last):
  File "/var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/monitor.py", line 90, in __del__
    self._libudev.udev_monitor_unref(self)
ctypes.ArgumentError: argument 1: <class 'TypeError'>: expected LP_udev_monitor instance instead of _SentinelObject
PASSED [ 57%]
tests/test_monitor.py::TestMonitor::test_from_netlink_source_kernel PASSED [ 58%]
tests/test_monitor.py::TestMonitor::test_from_netlink_source_kernel_mock PASSED [ 59%]
tests/test_monitor.py::TestMonitor::test_fileno PASSED [ 59%]
tests/test_monitor.py::TestMonitor::test_fileno_mock PASSED [ 60%]
tests/test_monitor.py::TestMonitor::test_filter_by_no_subsystem PASSED [ 60%]
tests/test_monitor.py::TestMonitor::test_filter_by_subsystem_no_dev_type PASSED [ 61%]
tests/test_monitor.py::TestMonitor::test_filter_by_subsystem_no_dev_type_mock PASSED [ 62%]
tests/test_monitor.py::TestMonitor::test_filter_by_subsystem_dev_type PASSED [ 62%]
tests/test_monitor.py::TestMonitor::test_filter_by_subsystem_dev_type_mock PASSED [ 63%]
tests/test_monitor.py::TestMonitor::test_filter_by_tag PASSED [ 63%]
tests/test_monitor.py::TestMonitor::test_filter_by_tag_mock PASSED [ 64%]
tests/test_monitor.py::TestMonitor::test_remove_filter PASSED [ 65%]
tests/test_monitor.py::TestMonitor::test_remove_filter_mock PASSED [ 65%]
tests/test_monitor.py::TestMonitor::test_start_netlink_kernel_source PASSED [ 66%]
tests/test_monitor.py::TestMonitor::test_start_mock PASSED [ 66%]
tests/test_monitor.py::TestMonitor::test_enable_receiving PASSED [ 67%]
tests/test_monitor.py::TestMonitor::test_set_receive_buffer_size_mock PASSED [ 68%]
tests/test_monitor.py::TestMonitor::test_poll_timeout PASSED [ 68%]
tests/test_monitor.py::TestMonitor::test_receive_device PASSED [ 69%]
tests/test_monitor.py::TestMonitorObserver::test_deprecated_handler PASSED [ 69%]
tests/test_monitor.py::TestMonitorObserver::test_fake PASSED [ 70%]
tests/test_observer.py::test_fake_monitor PASSED [ 71%]
tests/test_observer.py::TestPysideObserver::test_monitor SKIPPED (could not import 'PySide.QtCore': No module named 'PySide') [ 71%]
tests/test_observer.py::TestPysideObserver::test_events_fake_monitor SKIPPED (could not import 'PySide.QtCore': No module named 'PySide') [ 72%]
tests/test_observer.py::TestPyQt4Observer::test_monitor SKIPPED (could not import 'PyQt4.QtCore': No module named 'PyQt4') [ 72%]
tests/test_observer.py::TestPyQt4Observer::test_events_fake_monitor SKIPPED (could not import 'PyQt4.QtCore': No module named 'PyQt4') [ 73%]
tests/test_observer.py::TestPyQt5Observer::test_monitor SKIPPED (could not import 'PyQt5.QtCore': No module named 'PyQt5') [ 73%]
tests/test_observer.py::TestPyQt5Observer::test_events_fake_monitor SKIPPED (could not import 'PyQt5.QtCore': No module named 'PyQt5') [ 74%]
tests/test_observer.py::TestGlibObserver::test_monitor SKIPPED (could not import 'glib': No module named 'glib') [ 75%]
tests/test_observer.py::TestGlibObserver::test_events_fake_monitor SKIPPED (could not import 'glib': No module named 'glib') [ 75%]
tests/test_observer.py::TestWxObserver::test_monitor SKIPPED (Display required for wxPython) [ 76%]
tests/test_observer.py::TestWxObserver::test_events_fake_monitor SKIPPED (Display required for wxPython) [ 76%]
tests/test_observer_deprecated.py::TestDeprecatedPysideObserver::test_monitor SKIPPED (could not import 'PySide.QtCore': No module named 'PySide') [ 77%]
tests/test_observer_deprecated.py::TestDeprecatedPysideObserver::test_events_fake_monitor[add] SKIPPED (could not import 'PySide.QtCore': No module named 'PySide') [ 78%]
tests/test_observer_deprecated.py::TestDeprecatedPysideObserver::test_events_fake_monitor[remove] SKIPPED (could not import 'PySide.QtCore': No module named 'PySide') [ 78%]
tests/test_observer_deprecated.py::TestDeprecatedPysideObserver::test_events_fake_monitor[change] SKIPPED (could not import 'PySide.QtCore': No module named 'PySide') [ 79%]
tests/test_observer_deprecated.py::TestDeprecatedPysideObserver::test_events_fake_monitor[move] SKIPPED (could not import 'PySide.QtCore': No module named 'PySide') [ 79%]
tests/test_observer_deprecated.py::TestDeprecatedPyQt4Observer::test_monitor SKIPPED (could not import 'PyQt4.QtCore': No module named 'PyQt4') [ 80%]
tests/test_observer_deprecated.py::TestDeprecatedPyQt4Observer::test_events_fake_monitor[add] SKIPPED (could not import 'PyQt4.QtCore': No module named 'PyQt4') [ 81%]
tests/test_observer_deprecated.py::TestDeprecatedPyQt4Observer::test_events_fake_monitor[remove] SKIPPED (could not import 'PyQt4.QtCore': No module named 'PyQt4') [ 81%]
tests/test_observer_deprecated.py::TestDeprecatedPyQt4Observer::test_events_fake_monitor[change] SKIPPED (could not import 'PyQt4.QtCore': No module named 'PyQt4') [ 82%]
tests/test_observer_deprecated.py::TestDeprecatedPyQt4Observer::test_events_fake_monitor[move] SKIPPED (could not import 'PyQt4.QtCore': No module named 'PyQt4') [ 82%]
tests/test_observer_deprecated.py::TestDeprecatedGlibObserver::test_monitor SKIPPED (could not import 'glib': No module named 'glib') [ 83%]
tests/test_observer_deprecated.py::TestDeprecatedGlibObserver::test_events_fake_monitor[add] SKIPPED (could not import 'glib': No module named 'glib') [ 84%]
tests/test_observer_deprecated.py::TestDeprecatedGlibObserver::test_events_fake_monitor[remove] SKIPPED (could not import 'glib': No module named 'glib') [ 84%]
tests/test_observer_deprecated.py::TestDeprecatedGlibObserver::test_events_fake_monitor[change] SKIPPED (could not import 'glib': No module named 'glib') [ 85%]
tests/test_observer_deprecated.py::TestDeprecatedGlibObserver::test_events_fake_monitor[move] SKIPPED (could not import 'glib': No module named 'glib') [ 85%]
tests/test_observer_deprecated.py::TestDeprecatedWxObserver::test_monitor SKIPPED (Display required for wxPython) [ 86%]
tests/test_observer_deprecated.py::TestDeprecatedWxObserver::test_events_fake_monitor[add] SKIPPED (Display required for wxPython) [ 86%]
tests/test_observer_deprecated.py::TestDeprecatedWxObserver::test_events_fake_monitor[remove] SKIPPED (Display required for wxPython) [ 87%]
tests/test_observer_deprecated.py::TestDeprecatedWxObserver::test_events_fake_monitor[change] SKIPPED (Display required for wxPython) [ 88%]
tests/test_observer_deprecated.py::TestDeprecatedWxObserver::test_events_fake_monitor[move] SKIPPED (Display required for wxPython) [ 88%]
tests/test_pypi.py::test_manifest_complete SKIPPED (Not in git clone) [ 89%]
tests/test_pypi.py::test_description_rendering SKIPPED (condition: sys.version_info[0] > 2) [ 89%]
tests/test_util.py::test_ensure_byte_string PASSED [ 90%]
tests/test_util.py::test_ensure_byte_string_none PASSED [ 91%]
tests/test_util.py::test_ensure_unicode_string PASSED [ 91%]
tests/test_util.py::test_ensure_unicode_string_none PASSED [ 92%]
tests/test_util.py::test_property_value_to_bytes_string PASSED [ 92%]
tests/test_util.py::test_property_value_to_bytes_int PASSED [ 93%]
tests/test_util.py::test_property_value_to_bytes_bool PASSED [ 94%]
tests/test_util.py::test_string_to_bool_true PASSED [ 94%]
tests/test_util.py::test_string_to_bool_false PASSED [ 95%]
tests/test_util.py::test_string_to_bool_invalid_value PASSED [ 95%]
tests/test_util.py::test_udev_list_iterate_no_entry PASSED [ 96%]
tests/test_util.py::test_udev_list_iterate_mock PASSED [ 97%]
tests/test_util.py::test_get_device_type_character_device PASSED [ 97%]
tests/test_util.py::test_get_device_type_block_device PASSED [ 98%]
tests/test_util.py::test_get_device_type_no_device_file PASSED [ 98%]
tests/test_util.py::test_get_device_type_not_existing PASSED [ 99%]
tests/test_util.py::test_eintr_retry_call PASSED [100%]

=================================== FAILURES ===================================
___________________ TestAttributes.test_available_attributes ___________________

self = <tests._device_tests._attributes_tests.TestAttributes object at 0xf5032028>

>   ???

f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf51ce850>
self = <tests._device_tests._attributes_tests.TestAttributes object at 0xf5032028>

tests/_device_tests/_attributes_tests.py:128:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <tests._device_tests._attributes_tests.TestAttributes object at 0xf5032028>, a_device = Device('/sys/devices/system/cpu/cpu16')

    @_UDEV_TEST(167, "test_available_attributes")
    @given(strategies.sampled_from(_DEVICES))
    @settings(max_examples=5)
    def test_available_attributes(self, a_device):
        """
        Test that the available attributes are exactly the names of files
        in the sysfs directory that are regular files or softlinks.
        """
        available_attributes = sorted(a_device.attributes.available_attributes)

        attribute_filenames = []
        sys_path = a_device.sys_path
        for filename in sorted(os.listdir(sys_path)):
            filepath = os.path.join(sys_path, filename)
            status = os.lstat(filepath)
            mode = status.st_mode
            if not stat.S_ISLNK(mode) and not stat.S_ISREG(mode):
                continue
            if not stat.S_IRUSR & mode:
                continue
            attribute_filenames.append(filename)

>       assert available_attributes == attribute_filenames
E       AssertionError

a_device = Device('/sys/devices/system/cpu/cpu16')
attribute_filenames = ['clock_tick', 'l1_dcache_line_size', 'l1_dcache_size',
                       'l1_icache_line_size', 'l1_icache_size',
                       'l2_cache_line_size', 'l2_cache_size', 'node0',
                       'subsystem', 'uevent']
available_attributes = ['clock_tick', 'l1_dcache_line_size', 'l1_dcache_size',
                        'l1_icache_line_size', 'l1_icache_size',
                        'l2_cache_line_size', 'l2_cache_size', 'node0',
                        'power/autosuspend_delay_ms', 'power/control',
                        'power/pm_qos_resume_latency_us',
                        'power/runtime_active_time', 'power/runtime_status',
                        'power/runtime_suspended_time', 'subsystem',
                        'topology/core_cpus', 'topology/core_cpus_list',
                        'topology/core_id', 'topology/core_siblings',
                        'topology/core_siblings_list', 'topology/die_cpus',
                        'topology/die_cpus_list', 'topology/die_id',
                        'topology/package_cpus', 'topology/package_cpus_list',
                        'topology/physical_package_id',
                        'topology/thread_siblings',
                        'topology/thread_siblings_list', 'uevent']
filename = 'uevent'
filepath = '/sys/devices/system/cpu/cpu16/uevent'
mode = 33188
self = <tests._device_tests._attributes_tests.TestAttributes object at 0xf5032028>
status = os.stat_result(st_mode=33188, st_ino=3934, st_dev=23, st_nlink=1, st_uid=0, st_gid=0, st_size=8192, st_atime=1639040536, st_mtime=1639040536, st_ctime=1639040536)
sys_path = '/sys/devices/system/cpu/cpu16'

tests/_device_tests/_attributes_tests.py:149: AssertionError
--------------------------------- Hypothesis -----------------------------------
Falsifying example: test_available_attributes(
    a_device=Device('/sys/devices/system/cpu/cpu16'),
    self=<tests._device_tests._attributes_tests.TestAttributes at 0xf5032028>,
)
_______________________ TestDevice.test_child_of_parent ________________________

self = <tests._device_tests._device_tests.TestDevice object at 0xf50856d0>

    @pytest.mark.skipif(len(_devices) == 0, reason='no device with a parent')
>   @_UDEV_TEST(172, "test_child_of_parents")
    @given(strategies.sampled_from(_devices))
    @settings(max_examples=5)

f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf523ac88>
self = <tests._device_tests._device_tests.TestDevice object at 0xf50856d0>

tests/_device_tests/_device_tests.py:69:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<tests._device_tests._device_tests.TestDevice object at 0xf50856d0>, Device('/sys/devices/system/cpu/cpu16')), kwargs = {}, initial_draws = 1, start = 933404.696982965, result = None, finish = 933405.09206032, internal_draw_time = 0
runtime = datetime.timedelta(microseconds=395077), current_deadline = timedelta(milliseconds=200)

    @proxies(self.test)
    def test(*args, **kwargs):
        self.__test_runtime = None
        initial_draws = len(data.draw_times)
        start = time.perf_counter()
        result = self.test(*args, **kwargs)
        finish = time.perf_counter()
        internal_draw_time = sum(data.draw_times[initial_draws:])
        runtime = datetime.timedelta(
            seconds=finish - start - internal_draw_time
        )
        self.__test_runtime = runtime
        current_deadline = self.settings.deadline
        if not is_final:
            current_deadline = (current_deadline // 4) * 5
        if runtime >= current_deadline:
>           raise DeadlineExceeded(runtime, self.settings.deadline)
E           hypothesis.errors.DeadlineExceeded: Test took 395.08ms, which exceeds the deadline of 200.00ms

args = (<tests._device_tests._device_tests.TestDevice object at 0xf50856d0>,
        Device('/sys/devices/system/cpu/cpu16'))
current_deadline = timedelta(milliseconds=200)
data = ConjectureData(VALID, 2 bytes, frozen)
finish = 933405.09206032
initial_draws = 1
internal_draw_time = 0
is_final = True
kwargs = {}
result = None
runtime = datetime.timedelta(microseconds=395077)
self = <hypothesis.core.StateForActualGivenExecution object at 0xf507e640>
start = 933404.696982965

/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded
--------------------------------- Hypothesis -----------------------------------
Falsifying example: test_child_of_parent(
    a_device=Device('/sys/devices/system/cpu/cpu16'),
    self=<tests._device_tests._device_tests.TestDevice at 0xf50856d0>,
)
___________________________ TestDevice.test_children ___________________________

self = <tests._device_tests._device_tests.TestDevice object at 0xf505d418>

    @pytest.mark.skipif(len(_devices) == 0, reason='no device with a child')
>   @_UDEV_TEST(172, "test_children")
    @given(strategies.sampled_from(_devices))
    @settings(max_examples=5)

f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf523ae38>
self = <tests._device_tests._device_tests.TestDevice object at 0xf505d418>

tests/_device_tests/_device_tests.py:78:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<tests._device_tests._device_tests.TestDevice object at 0xf505d418>, Device('/sys/devices/root/f028c59c/pci0000:02/0000:02:00.0/0000:03:02.0/0000:0a:00.0/host0/port-0:2/end_device-0:2/target0:0:2')), kwargs = {}, initial_draws = 1, start = 933411.805348003
result = None, finish = 933412.693193947, internal_draw_time = 0, runtime = datetime.timedelta(microseconds=887846), current_deadline = timedelta(milliseconds=200)

    [hypothesis timing wrapper at hypothesis/core.py:588, identical to the
    source shown under TestDevice.test_child_of_parent; repeats omitted]
E           hypothesis.errors.DeadlineExceeded: Test took 887.85ms, which exceeds the deadline of 200.00ms

args = (<tests._device_tests._device_tests.TestDevice object at 0xf505d418>,
        Device('/sys/devices/root/f028c59c/pci0000:02/0000:02:00.0/0000:03:02.0/0000:0a:00.0/host0/port-0:2/end_device-0:2/target0:0:2'))
current_deadline = timedelta(milliseconds=200)
data = ConjectureData(VALID, 1 bytes, frozen)
finish = 933412.693193947
initial_draws = 1
internal_draw_time = 0
is_final = True
kwargs = {}
result = None
runtime = datetime.timedelta(microseconds=887846)
self = <hypothesis.core.StateForActualGivenExecution object at 0xf507e748>
start = 933411.805348003

/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded
--------------------------------- Hypothesis -----------------------------------
Falsifying example: test_children(
    a_device=Device('/sys/devices/root/f028c59c/pci0000:02/0000:02:00.0/0000:03:02.0/0000:0a:00.0/host0/port-0:2/end_device-0:2/target0:0:2'),
    self=<tests._device_tests._device_tests.TestDevice at 0xf505d418>,
)
_________________________ TestDiscovery.test_anything __________________________

self = <tests.test_discover.TestDiscovery object at 0xf50c6b38>

    @given(
>       strategies.sampled_from(_DEVICES),
        strategies.text(":, -/+=").filter(lambda x: x))
    @settings(max_examples=NUM_TESTS)

f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf5199c88>
self = <tests.test_discover.TestDiscovery object at 0xf50c6b38>

tests/test_discover.py:157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<tests.test_discover.TestDiscovery object at 0xf50c6b38>, Device('/sys/devices/channel-devices'), ':'), kwargs = {}, initial_draws = 1, start = 933448.088995204, result = None, finish = 933448.504258266, internal_draw_time = 0
runtime = datetime.timedelta(microseconds=415263), current_deadline = timedelta(milliseconds=200)

    [hypothesis timing wrapper at hypothesis/core.py:588, identical to the
    source shown under TestDevice.test_child_of_parent; repeats omitted]
E           hypothesis.errors.DeadlineExceeded: Test took 415.26ms, which exceeds the deadline of 200.00ms

args = (<tests.test_discover.TestDiscovery object at 0xf50c6b38>,
        Device('/sys/devices/channel-devices'),
        ':')
current_deadline = timedelta(milliseconds=200)
data = ConjectureData(VALID, 5 bytes, frozen)
finish = 933448.504258266
initial_draws = 1
internal_draw_time = 0
is_final = True
kwargs = {}
result = None
runtime = datetime.timedelta(microseconds=415263)
self = <hypothesis.core.StateForActualGivenExecution object at 0xf4f1efe8>
start = 933448.088995204

/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded
--------------------------------- Hypothesis -----------------------------------
Falsifying example: test_anything(
    a_device=Device('/sys/devices/channel-devices'),
    a_string=':',
    self=<tests.test_discover.TestDiscovery at 0xf50c6b38>,
)
______________________ TestEnumerator.test_match_subsystem ______________________

self = <tests.test_enumerate.TestEnumerator object at 0xf4f45598>

    @failed_health_check_wrapper
>   @given(_CONTEXT_STRATEGY, _SUBSYSTEM_STRATEGY)
    @settings(max_examples=10)

f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf51709b8>
self = <tests.test_enumerate.TestEnumerator object at 0xf4f45598>

tests/test_enumerate.py:88:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<tests.test_enumerate.TestEnumerator object at 0xf4f45598>, <pyudev.core.Context object at 0xf53faef8>, 'vio'), kwargs = {}, initial_draws = 1, start = 933497.96980397, result = None, finish = 933502.21512678, internal_draw_time = 0
runtime = datetime.timedelta(seconds=4, microseconds=245323), current_deadline = timedelta(milliseconds=200)

    [hypothesis timing wrapper at hypothesis/core.py:588, identical to the
    source shown under TestDevice.test_child_of_parent; repeats omitted]
E           hypothesis.errors.DeadlineExceeded: Test took 4245.32ms, which exceeds the deadline of 200.00ms

args = (<tests.test_enumerate.TestEnumerator object at 0xf4f45598>,
        <pyudev.core.Context object at 0xf53faef8>,
        'vio')
current_deadline = timedelta(milliseconds=200)
data = ConjectureData(VALID, 2 bytes, frozen)
finish = 933502.21512678
initial_draws = 1
internal_draw_time = 0
is_final = True
kwargs = {}
result = None
runtime = datetime.timedelta(seconds=4, microseconds=245323)
self = <hypothesis.core.StateForActualGivenExecution object at 0xf507e448>
start = 933497.96980397

/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded
--------------------------------- Hypothesis -----------------------------------
Falsifying example: test_match_subsystem(
    context=<pyudev.core.Context at 0xf53faef8>,
    subsystem='vio',
    self=<tests.test_enumerate.TestEnumerator at 0xf4f45598>,
)
__________________ TestEnumerator.test_match_subsystem_nomatch __________________

self = <tests.test_enumerate.TestEnumerator object at 0xf4fb59e8>

>   ???

f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf5170808>
self = <tests.test_enumerate.TestEnumerator object at 0xf4fb59e8>

tests/test_enumerate.py:100:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<tests.test_enumerate.TestEnumerator object at 0xf4fb59e8>, <pyudev.core.Context object at 0xf53faef8>, 'vio'), kwargs = {}, initial_draws = 1, start = 933585.260215873, result = None, finish = 933593.018568049, internal_draw_time = 0
runtime = datetime.timedelta(seconds=7, microseconds=758352), current_deadline = timedelta(milliseconds=200)

    [hypothesis timing wrapper at hypothesis/core.py:588, identical to the
    source shown under TestDevice.test_child_of_parent; repeats omitted]
E           hypothesis.errors.DeadlineExceeded: Test took 7758.35ms, which exceeds the deadline of 200.00ms

args = (<tests.test_enumerate.TestEnumerator object at 0xf4fb59e8>,
        <pyudev.core.Context object at 0xf53faef8>,
        'vio')
current_deadline = timedelta(milliseconds=200)
data = ConjectureData(VALID, 2 bytes, frozen)
finish = 933593.018568049
initial_draws = 1
internal_draw_time = 0
is_final = True
kwargs = {}
result = None
runtime = datetime.timedelta(seconds=7, microseconds=758352)
self = <hypothesis.core.StateForActualGivenExecution object at 0xf4fb51f0>
start = 933585.260215873

/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded
--------------------------------- Hypothesis -----------------------------------
Falsifying example: test_match_subsystem_nomatch(
    context=<pyudev.core.Context at 0xf53faef8>,
    subsystem='vio',
    self=<tests.test_enumerate.TestEnumerator at 0xf4fb59e8>,
)
______________ TestEnumerator.test_match_subsystem_nomatch_complete _____________

self = <tests.test_enumerate.TestEnumerator object at 0xf507e298>

>   ???

f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf5170460>
self = <tests.test_enumerate.TestEnumerator object at 0xf507e298>

tests/test_enumerate.py:124:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<tests.test_enumerate.TestEnumerator object at 0xf507e298>, <pyudev.core.Context object at 0xf53faef8>, 'vio'), kwargs = {}, initial_draws = 1, start = 933681.756010684, result = None, finish = 933689.544009835, internal_draw_time = 0
runtime = datetime.timedelta(seconds=7, microseconds=787999), current_deadline = timedelta(milliseconds=200)

    [hypothesis timing wrapper at hypothesis/core.py:588, identical to the
    source shown under TestDevice.test_child_of_parent; repeats omitted]
E           hypothesis.errors.DeadlineExceeded: Test took 7788.00ms, which exceeds the deadline of 200.00ms

args = (<tests.test_enumerate.TestEnumerator object at 0xf507e298>,
        <pyudev.core.Context object at 0xf53faef8>,
        'vio')
current_deadline = timedelta(milliseconds=200)
data = ConjectureData(VALID, 2 bytes, frozen)
finish = 933689.544009835
initial_draws = 1
internal_draw_time = 0
is_final = True
kwargs = {}
result = None
runtime = datetime.timedelta(seconds=7, microseconds=787999)
self = <hypothesis.core.StateForActualGivenExecution object at 0xf4f91c40>
start = 933681.756010684

/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded
--------------------------------- Hypothesis -----------------------------------
Falsifying example: test_match_subsystem_nomatch_complete(
    context=<pyudev.core.Context at 0xf53faef8>,
    subsystem='vio',
    self=<tests.test_enumerate.TestEnumerator at 0xf507e298>,
)
_______________________ TestEnumerator.test_match_sys_name ______________________

self = <tests.test_enumerate.TestEnumerator object at 0xf4fb5418>

>   ???

f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf51702b0>
self = <tests.test_enumerate.TestEnumerator object at 0xf4fb5418>

tests/test_enumerate.py:142:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<tests.test_enumerate.TestEnumerator object at 0xf4fb5418>, <pyudev.core.Context object at 0xf53faef8>, 'channel-devices'), kwargs = {}, initial_draws = 1, start = 933733.600395179, result = None, finish = 933737.538079137, internal_draw_time = 0
runtime = datetime.timedelta(seconds=3, microseconds=937684), current_deadline = timedelta(milliseconds=200)

    [hypothesis timing wrapper at hypothesis/core.py:588, identical to the
    source shown under TestDevice.test_child_of_parent; repeats omitted]
E           hypothesis.errors.DeadlineExceeded: Test took 3937.68ms, which exceeds the deadline of 200.00ms

args = (<tests.test_enumerate.TestEnumerator object at 0xf4fb5418>,
        <pyudev.core.Context object at 0xf53faef8>,
        'channel-devices')
current_deadline = timedelta(milliseconds=200)
data = ConjectureData(VALID, 2 bytes, frozen)
finish = 933737.538079137
initial_draws = 1
internal_draw_time = 0
is_final = True
kwargs = {}
result = None
runtime = datetime.timedelta(seconds=3, microseconds=937684)
self = <hypothesis.core.StateForActualGivenExecution object at 0xf50c67d8>
start = 933733.600395179

/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded
--------------------------------- Hypothesis -----------------------------------
Falsifying example: test_match_sys_name(
    context=<pyudev.core.Context at 0xf53faef8>,
    sysname='channel-devices',
    self=<tests.test_enumerate.TestEnumerator at 0xf4fb5418>,
)
___________________ TestEnumerator.test_match_property_string ___________________

self = <tests.test_enumerate.TestEnumerator object at 0xf4f6f3a0>

>   ???

f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf5170028>
self = <tests.test_enumerate.TestEnumerator object at 0xf4f6f3a0>

tests/test_enumerate.py:153:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<tests.test_enumerate.TestEnumerator object at 0xf4f6f3a0>, <pyudev.core.Context object at 0xf53faef8>, ('DEVPATH', '/devices/channel-devices')), kwargs = {}, initial_draws = 1, start = 933822.637602301, result = None, finish = 933840.926380008
internal_draw_time = 0, runtime = datetime.timedelta(seconds=18, microseconds=288778), current_deadline = timedelta(milliseconds=200)

    [hypothesis timing wrapper at hypothesis/core.py:588, identical to the
    source shown under TestDevice.test_child_of_parent; repeats omitted]
E           hypothesis.errors.DeadlineExceeded: Test took 18288.78ms, which exceeds the deadline of 200.00ms

args = (<tests.test_enumerate.TestEnumerator object at 0xf4f6f3a0>,
        <pyudev.core.Context object at 0xf53faef8>,
        ('DEVPATH', '/devices/channel-devices'))
current_deadline = timedelta(milliseconds=200)
data = ConjectureData(VALID, 3 bytes, frozen)
finish = 933840.926380008
initial_draws = 1
internal_draw_time = 0
is_final = True
kwargs = {}
result = None
runtime = datetime.timedelta(seconds=18, microseconds=288778)
self = <hypothesis.core.StateForActualGivenExecution object at 0xf4f6f7a8>
start = 933822.637602301

/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded
--------------------------------- Hypothesis -----------------------------------
Falsifying example: test_match_property_string(
    context=<pyudev.core.Context at 0xf53faef8>,
    pair=('DEVPATH', '/devices/channel-devices'),
    self=<tests.test_enumerate.TestEnumerator at 0xf4f6f3a0>,
)
____________________ TestEnumerator.test_match_property_int _____________________

self = <tests.test_enumerate.TestEnumerator object at 0xf50323d0>

>   ???
> >f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf5170e38> >self = <tests.test_enumerate.TestEnumerator object at 0xf50323d0> > >tests/test_enumerate.py:166: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >args = (<tests.test_enumerate.TestEnumerator object at 0xf50323d0>, <pyudev.core.Context object at 0xf53faef8>, ('MAJOR', '10')), kwargs = {}, initial_draws = 1, start = 933922.183200935, result = None, finish = 933925.307041691, internal_draw_time = 0 >runtime = datetime.timedelta(seconds=3, microseconds=123841), current_deadline = timedelta(milliseconds=200) > > @proxies(self.test) > def test(*args, **kwargs): > self.__test_runtime = None > initial_draws = len(data.draw_times) > start = time.perf_counter() > result = self.test(*args, **kwargs) > finish = time.perf_counter() > internal_draw_time = sum(data.draw_times[initial_draws:]) > runtime = datetime.timedelta( > seconds=finish - start - internal_draw_time > ) > self.__test_runtime = runtime > current_deadline = self.settings.deadline > if not is_final: > current_deadline = (current_deadline // 4) * 5 > if runtime >= current_deadline: >> raise DeadlineExceeded(runtime, self.settings.deadline) >E hypothesis.errors.DeadlineExceeded: Test took 3123.84ms, which exceeds the deadline of 200.00ms > >args = (<tests.test_enumerate.TestEnumerator object at 0xf50323d0>, > <pyudev.core.Context object at 0xf53faef8>, > ('MAJOR', '10')) >current_deadline = timedelta(milliseconds=200) >data = ConjectureData(VALID, 3 bytes, frozen) >finish = 933925.307041691 >initial_draws = 1 >internal_draw_time = 0 >is_final = True >kwargs = {} >result = None >runtime = datetime.timedelta(seconds=3, microseconds=123841) >self = <hypothesis.core.StateForActualGivenExecution object at 0xf4f1eb98> >start = 933922.183200935 > >/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded >------------------------------------------------------------------------------------------------------------------------------ Hypothesis ------------------------------------------------------------------------------------------------------------------------------ >Falsifying example: test_match_property_int( > context=<pyudev.core.Context at 0xf53faef8>, > pair=('MAJOR', '10'), > self=<tests.test_enumerate.TestEnumerator at 0xf50323d0>, >) >_______________________________________________________________________________________________________________ TestEnumerator.test_match_property_bool ________________________________________________________________________________________________________________ > >self = <tests.test_enumerate.TestEnumerator object at 0xf4f6fb50> > >> ??? 
>
>f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf51c3070>
>self = <tests.test_enumerate.TestEnumerator object at 0xf4f6fb50>
>
>tests/test_enumerate.py:183:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>args = (<tests.test_enumerate.TestEnumerator object at 0xf4f6fb50>, <pyudev.core.Context object at 0xf53faef8>, ('OF_COMPATIBLE_N', '0')), kwargs = {}, initial_draws = 1, start = 933950.134307592, result = None, finish = 933952.79465009, internal_draw_time = 0
>runtime = datetime.timedelta(seconds=2, microseconds=660342), current_deadline = timedelta(milliseconds=200)
>
>    @proxies(self.test)
>    def test(*args, **kwargs):
>        self.__test_runtime = None
>        initial_draws = len(data.draw_times)
>        start = time.perf_counter()
>        result = self.test(*args, **kwargs)
>        finish = time.perf_counter()
>        internal_draw_time = sum(data.draw_times[initial_draws:])
>        runtime = datetime.timedelta(
>            seconds=finish - start - internal_draw_time
>        )
>        self.__test_runtime = runtime
>        current_deadline = self.settings.deadline
>        if not is_final:
>            current_deadline = (current_deadline // 4) * 5
>        if runtime >= current_deadline:
>>           raise DeadlineExceeded(runtime, self.settings.deadline)
>E           hypothesis.errors.DeadlineExceeded: Test took 2660.34ms, which exceeds the deadline of 200.00ms
>
>args = (<tests.test_enumerate.TestEnumerator object at 0xf4f6fb50>,
>        <pyudev.core.Context object at 0xf53faef8>,
>        ('OF_COMPATIBLE_N', '0'))
>current_deadline = timedelta(milliseconds=200)
>data = ConjectureData(VALID, 3 bytes, frozen)
>finish = 933952.79465009
>initial_draws = 1
>internal_draw_time = 0
>is_final = True
>kwargs = {}
>result = None
>runtime = datetime.timedelta(seconds=2, microseconds=660342)
>self = <hypothesis.core.StateForActualGivenExecution object at 0xf4f1d328>
>start = 933950.134307592
>
>/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded
>------------------------------------------------------------ Hypothesis ------------------------------------------------------------
>Falsifying example: test_match_property_bool(
>    context=<pyudev.core.Context at 0xf53faef8>,
>    pair=('OF_COMPATIBLE_N', '0'),
>    self=<tests.test_enumerate.TestEnumerator at 0xf4f6fb50>,
>)
>____________________________________________ TestEnumerator.test_match_parent ____________________________________________
>
>self = <tests.test_enumerate.TestEnumerator object at 0xf4e8ba90>
>
>>   ???
>
>f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf51c7658>
>self = <tests.test_enumerate.TestEnumerator object at 0xf4e8ba90>
>
>tests/test_enumerate.py:213:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>args = (<tests.test_enumerate.TestEnumerator object at 0xf4e8ba90>, <pyudev.core.Context object at 0xf53faef8>, Device('/sys/devices/root/f0287998')), kwargs = {}, initial_draws = 1, start = 934030.707199619, result = None, finish = 934037.448758671
>internal_draw_time = 0, runtime = datetime.timedelta(seconds=6, microseconds=741559), current_deadline = timedelta(milliseconds=200)
>
>    @proxies(self.test)
>    def test(*args, **kwargs):
>        self.__test_runtime = None
>        initial_draws = len(data.draw_times)
>        start = time.perf_counter()
>        result = self.test(*args, **kwargs)
>        finish = time.perf_counter()
>        internal_draw_time = sum(data.draw_times[initial_draws:])
>        runtime = datetime.timedelta(
>            seconds=finish - start - internal_draw_time
>        )
>        self.__test_runtime = runtime
>        current_deadline = self.settings.deadline
>        if not is_final:
>            current_deadline = (current_deadline // 4) * 5
>        if runtime >= current_deadline:
>>           raise DeadlineExceeded(runtime, self.settings.deadline)
>E           hypothesis.errors.DeadlineExceeded: Test took 6741.56ms, which exceeds the deadline of 200.00ms
>
>args = (<tests.test_enumerate.TestEnumerator object at 0xf4e8ba90>,
>        <pyudev.core.Context object at 0xf53faef8>,
>        Device('/sys/devices/root/f0287998'))
>current_deadline = timedelta(milliseconds=200)
>data = ConjectureData(VALID, 2 bytes, frozen)
>finish = 934037.448758671
>initial_draws = 1
>internal_draw_time = 0
>is_final = True
>kwargs = {}
>result = None
>runtime = datetime.timedelta(seconds=6, microseconds=741559)
>self = <hypothesis.core.StateForActualGivenExecution object at 0xf4ff8df0>
>start = 934030.707199619
>
>/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded
>------------------------------------------------------------ Hypothesis ------------------------------------------------------------
>Falsifying example: test_match_parent(
>    context=<pyudev.core.Context at 0xf53faef8>,
>    device=Device('/sys/devices/root/f0287998'),
>    self=<tests.test_enumerate.TestEnumerator at 0xf4e8ba90>,
>)
>____________________________________ TestEnumeratorMatchCombinations.test_combined_property_matches ____________________________________
>
>self = <tests.test_enumerate.TestEnumeratorMatchCombinations object at 0xf4dfb220>
>
>    @given(_CONTEXT_STRATEGY,
>>          strategies.lists(
>               elements=_MATCH_PROPERTY_STRATEGY,
>               min_size=2,
>               max_size=3,
>               unique_by=lambda p: p[0]))
>    @settings(max_examples=2)
>
>f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf51c79b8>
>self = <tests.test_enumerate.TestEnumeratorMatchCombinations object at 0xf4dfb220>
>
>tests/test_enumerate.py:237:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>args = (<tests.test_enumerate.TestEnumeratorMatchCombinations object at 0xf4dfb220>, <pyudev.core.Context object at 0xf53faef8>, [('DEVPATH', '/devices/channel-devices'), ('MODALIAS', 'vio:Tchannel-devicesSchannel-devices')]), kwargs = {}, initial_draws = 1
>start = 934189.813193359, result = None, finish = 934196.81782474, internal_draw_time = 0, runtime = datetime.timedelta(seconds=7, microseconds=4631), current_deadline = timedelta(milliseconds=200)
>
>    @proxies(self.test)
>    def test(*args, **kwargs):
>        self.__test_runtime = None
>        initial_draws = len(data.draw_times)
>        start = time.perf_counter()
>        result = self.test(*args, **kwargs)
>        finish = time.perf_counter()
>        internal_draw_time = sum(data.draw_times[initial_draws:])
>        runtime = datetime.timedelta(
>            seconds=finish - start - internal_draw_time
>        )
>        self.__test_runtime = runtime
>        current_deadline = self.settings.deadline
>        if not is_final:
>            current_deadline = (current_deadline // 4) * 5
>        if runtime >= current_deadline:
>>           raise DeadlineExceeded(runtime, self.settings.deadline)
>E           hypothesis.errors.DeadlineExceeded: Test took 7004.63ms, which exceeds the deadline of 200.00ms
>
>args = (<tests.test_enumerate.TestEnumeratorMatchCombinations object at 0xf4dfb220>,
>        <pyudev.core.Context object at 0xf53faef8>,
>        [('DEVPATH', '/devices/channel-devices'),
>         ('MODALIAS', 'vio:Tchannel-devicesSchannel-devices')])
>current_deadline = timedelta(milliseconds=200)
>data = ConjectureData(VALID, 9 bytes, frozen)
>finish = 934196.81782474
>initial_draws = 1
>internal_draw_time = 0
>is_final = True
>kwargs = {}
>result = None
>runtime = datetime.timedelta(seconds=7, microseconds=4631)
>self = <hypothesis.core.StateForActualGivenExecution object at 0xf4dfbb98>
>start = 934189.813193359
>
>/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded
>------------------------------------------------------------ Hypothesis ------------------------------------------------------------
>Falsifying example: test_combined_property_matches(
>    context=<pyudev.core.Context at 0xf53faef8>,
>    ppairs=[('DEVPATH', '/devices/channel-devices'),
>            ('MODALIAS', 'vio:Tchannel-devicesSchannel-devices')],
>    self=<tests.test_enumerate.TestEnumeratorMatchCombinations at 0xf4dfb220>,
>)
>____________________________________ TestEnumeratorMatchCombinations.test_match ____________________________________
>
>self = <tests.test_enumerate.TestEnumeratorMatchCombinations object at 0xf4f91568>
>
>>   ???
>
>f = <function given.<locals>.run_test_as_given.<locals>.wrapped_test at 0xf51c7b68>
>self = <tests.test_enumerate.TestEnumeratorMatchCombinations object at 0xf4f91568>
>
>tests/test_enumerate.py:264:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>args = (<tests.test_enumerate.TestEnumeratorMatchCombinations object at 0xf4f91568>, <pyudev.core.Context object at 0xf53faef8>, 'vio', 'channel-devices', ('DEVPATH', '/devices/channel-devices')), kwargs = {}, initial_draws = 1, start = 934243.009618453
>result = None, finish = 934247.094205277, internal_draw_time = 0, runtime = datetime.timedelta(seconds=4, microseconds=84587), current_deadline = timedelta(milliseconds=200)
>
>    @proxies(self.test)
>    def test(*args, **kwargs):
>        self.__test_runtime = None
>        initial_draws = len(data.draw_times)
>        start = time.perf_counter()
>        result = self.test(*args, **kwargs)
>        finish = time.perf_counter()
>        internal_draw_time = sum(data.draw_times[initial_draws:])
>        runtime = datetime.timedelta(
>            seconds=finish - start - internal_draw_time
>        )
>        self.__test_runtime = runtime
>        current_deadline = self.settings.deadline
>        if not is_final:
>            current_deadline = (current_deadline // 4) * 5
>        if runtime >= current_deadline:
>>           raise DeadlineExceeded(runtime, self.settings.deadline)
>E           hypothesis.errors.DeadlineExceeded: Test took 4084.59ms, which exceeds the deadline of 200.00ms
>
>args = (<tests.test_enumerate.TestEnumeratorMatchCombinations object at 0xf4f91568>,
>        <pyudev.core.Context object at 0xf53faef8>,
>        'vio',
>        'channel-devices',
>        ('DEVPATH', '/devices/channel-devices'))
>current_deadline = timedelta(milliseconds=200)
>data = ConjectureData(VALID, 7 bytes, frozen)
>finish = 934247.094205277
>initial_draws = 1
>internal_draw_time = 0
>is_final = True
>kwargs = {}
>result = None
>runtime = datetime.timedelta(seconds=4, microseconds=84587)
>self = <hypothesis.core.StateForActualGivenExecution object at 0xf4f91718>
>start = 934243.009618453
>
>/usr/lib/python3.9/site-packages/hypothesis/core.py:588: DeadlineExceeded
>------------------------------------------------------------ Hypothesis ------------------------------------------------------------
>Falsifying example: test_match(
>    context=<pyudev.core.Context at 0xf53faef8>,
>    subsystem='vio',
>    sysname='channel-devices',
>    ppair=('DEVPATH', '/devices/channel-devices'),
>    self=<tests.test_enumerate.TestEnumeratorMatchCombinations at 0xf4f91568>,
>)
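[Editor's note] All of the DeadlineExceeded failures above share one cause: each falsifying example performs a full udev enumeration, which takes seconds on this sparc host, while Hypothesis's default per-example deadline is 200 ms. The usual remedy is to relax or disable the deadline rather than to speed up the hardware. A minimal sketch, assuming a conftest.py hook is acceptable; the profile name and the HYPOTHESIS_SLOW switch are illustrative, not part of pyudev or its ebuild:

    # conftest.py -- sketch: relax Hypothesis's per-example deadline on slow
    # hardware. "slow-io" and HYPOTHESIS_SLOW are made-up names.
    import os

    from hypothesis import settings

    # deadline=None disables the DeadlineExceeded check entirely; a large
    # timedelta would keep a safety net while tolerating slow udev scans.
    settings.register_profile("slow-io", deadline=None)

    # Opt in explicitly rather than weakening the check everywhere.
    if os.environ.get("HYPOTHESIS_SLOW"):
        settings.load_profile("slow-io")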
>============================================================ warnings summary ============================================================
>../pyudev-0.22.0-python3_9/lib/pyudev/monitor.py:136
>  /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/monitor.py:136: DeprecationWarning: invalid escape sequence \
>    """
>
>tests/test_util.py:38
>  /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/tests/test_util.py:38: PytestUnknownMarkWarning: Unknown pytest.mark.conversion - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
>    @pytest.mark.conversion
>
>tests/test_util.py:46
>  /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/tests/test_util.py:46: PytestUnknownMarkWarning: Unknown pytest.mark.conversion - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
>    @pytest.mark.conversion
>
>tests/test_util.py:52
>  /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/tests/test_util.py:52: PytestUnknownMarkWarning: Unknown pytest.mark.conversion - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
>    @pytest.mark.conversion
>
>tests/test_util.py:60
>  /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/tests/test_util.py:60: PytestUnknownMarkWarning: Unknown pytest.mark.conversion - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
>    @pytest.mark.conversion
>
>tests/test_util.py:66
>  /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/tests/test_util.py:66: PytestUnknownMarkWarning: Unknown pytest.mark.conversion - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
>    @pytest.mark.conversion
>
>tests/test_util.py:74
>  /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/tests/test_util.py:74: PytestUnknownMarkWarning: Unknown pytest.mark.conversion - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
>    @pytest.mark.conversion
>
>tests/test_util.py:80
>  /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/tests/test_util.py:80: PytestUnknownMarkWarning: Unknown pytest.mark.conversion - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
>    @pytest.mark.conversion
>
>tests/test_util.py:88
>  /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/tests/test_util.py:88: PytestUnknownMarkWarning: Unknown pytest.mark.conversion - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
>    @pytest.mark.conversion
>
>tests/test_util.py:94
>  /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/tests/test_util.py:94: PytestUnknownMarkWarning: Unknown pytest.mark.conversion - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
>    @pytest.mark.conversion
>
>tests/test_util.py:100
>  /var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/tests/test_util.py:100: PytestUnknownMarkWarning: Unknown pytest.mark.conversion - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
>    @pytest.mark.conversion
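[Editor's note] The PytestUnknownMarkWarning entries are unrelated noise: the suite applies @pytest.mark.conversion without ever declaring the mark. Declaring it silences the warning. A minimal sketch using pytest's standard pytest_configure hook; the description string is made up:

    # conftest.py -- register the custom "conversion" mark so pytest stops
    # warning about it (description text is illustrative).
    def pytest_configure(config):
        config.addinivalue_line(
            "markers", "conversion: marks property/attribute conversion tests"
        )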
>tests/test_device.py::TestDevice::test_find_parent_no_devtype_mock
>tests/test_device.py::TestDevice::test_find_parent_with_devtype_mock
>  /usr/lib/python3.9/site-packages/_pytest/unraisableexception.py:78: PytestUnraisableExceptionWarning: Exception ignored in: <function Device.__del__ at 0xf552c898>
>
>  Traceback (most recent call last):
>    File "/var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/device/_device.py", line 457, in __del__
>      self._libudev.udev_device_unref(self)
>  ctypes.ArgumentError: argument 1: <class 'TypeError'>: expected LP_udev_device instance instead of _SentinelObject
>
>    warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
>
>tests/test_monitor.py::TestMonitor::test_from_netlink_source_udev_mock
>tests/test_monitor.py::TestMonitor::test_from_netlink_source_kernel_mock
>  /usr/lib/python3.9/site-packages/_pytest/unraisableexception.py:78: PytestUnraisableExceptionWarning: Exception ignored in: <function Monitor.__del__ at 0xf5566b20>
>
>  Traceback (most recent call last):
>    File "/var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0-python3_9/lib/pyudev/monitor.py", line 90, in __del__
>      self._libudev.udev_monitor_unref(self)
>  ctypes.ArgumentError: argument 1: <class 'TypeError'>: expected LP_udev_monitor instance instead of _SentinelObject
>
>    warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
>
>-- Docs: https://docs.pytest.org/en/stable/warnings.html
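[Editor's note] The two PytestUnraisableExceptionWarning entries come from mocked tests: __del__ hands a test sentinel instead of a real libudev pointer to udev_device_unref()/udev_monitor_unref(), and ctypes rejects the argument. They are noise rather than failures. One defensive pattern, sketched against a hypothetical wrapper class rather than pyudev's actual code, is to tolerate the ctypes.ArgumentError during finalization:

    import ctypes

    class _UdevHandle:
        """Hypothetical stand-in for a pyudev-style libudev wrapper."""

        def __init__(self, libudev, ptr):
            self._libudev = libudev
            self._ptr = ptr

        def __del__(self):
            # Under mocked tests the pointer may be a sentinel object;
            # ctypes then raises ArgumentError, which is unraisable inside
            # __del__. Swallow it instead of emitting warnings.
            try:
                self._libudev.udev_device_unref(self._ptr)
            except (ctypes.ArgumentError, AttributeError):
                pass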
>======================================================== short test summary info ========================================================
>SKIPPED [1] tests/_device_tests/_device_tests.py:469: unsafe to check ID_WWN_WITH_EXTENSION
>SKIPPED [1] tests/_device_tests/_tags_tests.py:54: no device with tags
>SKIPPED [1] tests/utils/misc.py:62: failed health check for test_match_tag() (/var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0/tests/test_enumerate.py: 199)
>SKIPPED [2] tests/test_observer.py:109: could not import 'PySide.QtCore': No module named 'PySide'
>SKIPPED [2] tests/test_observer.py:109: could not import 'PyQt4.QtCore': No module named 'PyQt4'
>SKIPPED [2] tests/test_observer.py:109: could not import 'PyQt5.QtCore': No module named 'PyQt5'
>SKIPPED [2] tests/test_observer.py:149: could not import 'glib': No module named 'glib'
>SKIPPED [1] tests/test_observer.py:92: Display required for wxPython
>SKIPPED [1] tests/test_observer.py:97: Display required for wxPython
>SKIPPED [5] tests/test_observer_deprecated.py:119: could not import 'PySide.QtCore': No module named 'PySide'
>SKIPPED [5] tests/test_observer_deprecated.py:119: could not import 'PyQt4.QtCore': No module named 'PyQt4'
>SKIPPED [5] tests/test_observer_deprecated.py:167: could not import 'glib': No module named 'glib'
>SKIPPED [1] tests/test_observer_deprecated.py:85: Display required for wxPython
>SKIPPED [4] tests/test_observer_deprecated.py:90: Display required for wxPython
>SKIPPED [1] tests/test_pypi.py:49: Not in git clone
>SKIPPED [1] tests/test_pypi.py:171: condition: sys.version_info[0] > 2
>FAILED tests/test_device.py::TestAttributes::test_available_attributes - AssertionError
>FAILED tests/test_device.py::TestDevice::test_child_of_parent - hypothesis.errors.DeadlineExceeded: Test took 395.08ms, which exceeds the deadline of 200.00ms
>FAILED tests/test_device.py::TestDevice::test_children - hypothesis.errors.DeadlineExceeded: Test took 887.85ms, which exceeds the deadline of 200.00ms
>FAILED tests/test_discover.py::TestDiscovery::test_anything - hypothesis.errors.DeadlineExceeded: Test took 415.26ms, which exceeds the deadline of 200.00ms
>FAILED tests/test_enumerate.py::TestEnumerator::test_match_subsystem - hypothesis.errors.DeadlineExceeded: Test took 4245.32ms, which exceeds the deadline of 200.00ms
>FAILED tests/test_enumerate.py::TestEnumerator::test_match_subsystem_nomatch - hypothesis.errors.DeadlineExceeded: Test took 7758.35ms, which exceeds the deadline of 200.00ms
>FAILED tests/test_enumerate.py::TestEnumerator::test_match_subsystem_nomatch_complete - hypothesis.errors.DeadlineExceeded: Test took 7788.00ms, which exceeds the deadline of 200.00ms
>FAILED tests/test_enumerate.py::TestEnumerator::test_match_sys_name - hypothesis.errors.DeadlineExceeded: Test took 3937.68ms, which exceeds the deadline of 200.00ms
>FAILED tests/test_enumerate.py::TestEnumerator::test_match_property_string - hypothesis.errors.DeadlineExceeded: Test took 18288.78ms, which exceeds the deadline of 200.00ms
>FAILED tests/test_enumerate.py::TestEnumerator::test_match_property_int - hypothesis.errors.DeadlineExceeded: Test took 3123.84ms, which exceeds the deadline of 200.00ms
>FAILED tests/test_enumerate.py::TestEnumerator::test_match_property_bool - hypothesis.errors.DeadlineExceeded: Test took 2660.34ms, which exceeds the deadline of 200.00ms
>FAILED tests/test_enumerate.py::TestEnumerator::test_match_parent - hypothesis.errors.DeadlineExceeded: Test took 6741.56ms, which exceeds the deadline of 200.00ms
>FAILED tests/test_enumerate.py::TestEnumeratorMatchCombinations::test_combined_property_matches - hypothesis.errors.DeadlineExceeded: Test took 7004.63ms, which exceeds the deadline of 200.00ms
>FAILED tests/test_enumerate.py::TestEnumeratorMatchCombinations::test_match - hypothesis.errors.DeadlineExceeded: Test took 4084.59ms, which exceeds the deadline of 200.00ms
>==================================== 14 failed, 120 passed, 35 skipped, 15 warnings in 1053.73s (0:17:33) ====================================
> * ERROR: dev-python/pyudev-0.22.0-r1::gentoo failed (test phase):
> *   pytest failed with python3.9
> *
> * Call stack:
> *   ebuild.sh, line 127: Called src_test
> *     environment, line 2916: Called distutils-r1_src_test
> *       environment, line 1249: Called _distutils-r1_run_foreach_impl 'python_test'
> *         environment, line  468: Called python_foreach_impl 'distutils-r1_run_phase' 'python_test'
> *           environment, line 2570: Called multibuild_foreach_variant '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test'
> *             environment, line 2100: Called _multibuild_run '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test'
> *               environment, line 2098: Called _python_multibuild_wrapper 'distutils-r1_run_phase' 'python_test'
> *                 environment, line  803: Called distutils-r1_run_phase 'python_test'
> *                   environment, line 1188: Called python_test
> *                     environment, line 2875: Called distutils-r1_python_test
> *                       environment, line 1145: Called epytest
> *                         environment, line 1616: Called die
> * The specific snippet of code:
> *   "${@}" || die -n "pytest failed with ${EPYTHON}";
> *
> * If you need support, post the output of `emerge --info '=dev-python/pyudev-0.22.0-r1::gentoo'`,
> * the complete build log and the output of `emerge -pqv '=dev-python/pyudev-0.22.0-r1::gentoo'`.
> * The complete build log is located at '/var/tmp/portage/dev-python/pyudev-0.22.0-r1/temp/build.log'.
> * The ebuild environment file is located at '/var/tmp/portage/dev-python/pyudev-0.22.0-r1/temp/environment'.
> * Working directory: '/var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0'
> * S: '/var/tmp/portage/dev-python/pyudev-0.22.0-r1/work/pyudev-0.22.0'
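[Editor's note] To verify that the slowness is in udev enumeration itself rather than in Hypothesis or pytest, one can time a single match from the falsifying examples outside the test harness. This probe uses only public pyudev API and should be run on the affected host:

    # Rough timing probe for one of the falsifying examples above.
    import time

    from pyudev import Context

    context = Context()
    start = time.perf_counter()
    # Same query shape as test_match_property_string's falsifying example.
    devices = list(
        context.list_devices().match_property("DEVPATH", "/devices/channel-devices")
    )
    elapsed = time.perf_counter() - start
    print(f"matched {len(devices)} device(s) in {elapsed:.2f}s")

If this alone takes multiple seconds, the DeadlineExceeded failures are an environment artifact of the slow host, and relaxing the deadline (see the conftest sketch above) or deselecting the affected tests is the appropriate fix.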