Gentoo's Bugzilla – Attachment 782636 Details for Bug 849605: [guru] [HELP] dev-python/pytest-benchmark-3.4.1 fails tests
build.log

Description: build.log
Filename: build.log
MIME Type: text/plain
Creator: Agostino Sarubbo
Created: 2022-06-04 06:31:49 UTC
Size: 356.60 KB
> * Package: dev-python/pytest-benchmark-3.4.1 > * Repository: guru > * Maintainer: lssndrbarbieri@gmail.com > * USE: abi_x86_64 amd64 elibc_glibc kernel_linux python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 test userland_GNU > * FEATURES: network-sandbox preserve-libs sandbox test userpriv usersandbox > > >@@@@@ PLEASE PAY ATTENTION HERE!!! @@@@@ >This information may help you to understand if this is a duplicate or if this bug exists after you pushed a fix; >This ebuild was merged at the following commit: >https://github.com/gentoo/gentoo/commit/0a8a45b7e39c2946f7fbf2645e9697ac94f28d2c (Wed Jun 1 12:04:11 UTC 2022) >@@@@@ END @@@@@ > > > >@@@@@ PLEASE PAY ATTENTION HERE!!! @@@@@ >This information may help you to understand if this is a duplicate or if this bug exists after you pushed a fix; >This ebuild was merged at the following commit: >https://gitweb.gentoo.org/repo/proj/guru.git/commit/?id=3995b1b59177a12964d0f922eda2aca41a8c2947 (Wed Jun 1 06:39:01 UTC 2022) >@@@@@ END @@@@@ > > > >@@@@@ PLEASE PAY ATTENTION HERE!!! @@@@@ >This ebuild was merged (directly or as a dependency) because of the following commit: >https://gitweb.gentoo.org/repo/proj/guru.git/commit/?id=dd47c8f0f30a2ad205bf7c191032ae904a325ea9 >@@@@@ END @@@@@ > > > >################## ># emerge --info: # >################## >Portage 3.0.30 (python 3.10.4-final-0, default/linux/amd64/17.1, gcc-11.3.0, glibc-2.35-r5, 4.19.174-gentoo x86_64) >================================================================= >System uname: Linux-4.19.174-gentoo-x86_64-Intel-R-_Xeon-R-_CPU_E5-2650_v4_@_2.20GHz-with-glibc2.35 >KiB Mem: 264046488 total, 23911836 free >KiB Swap: 0 total, 0 free >sh bash 5.1_p16 >ld GNU ld (Gentoo 2.38 p4) 2.38 >app-misc/pax-utils: 1.3.4::gentoo >app-shells/bash: 5.1_p16::gentoo >dev-lang/perl: 5.34.1-r3::gentoo >dev-lang/python: 2.7.18_p15::gentoo, 3.8.13_p2::gentoo, 3.9.13::gentoo, 3.10.4_p1::gentoo, 3.11.0_beta2_p1::gentoo >dev-util/cmake: 3.23.2::gentoo >dev-util/meson: 0.62.1::gentoo >sys-apps/baselayout: 2.8::gentoo >sys-apps/openrc: 0.44.10::gentoo >sys-apps/sandbox: 2.29::gentoo >sys-devel/autoconf: 2.71-r1::gentoo >sys-devel/automake: 1.16.5::gentoo >sys-devel/binutils: 2.38-r2::gentoo >sys-devel/binutils-config: 5.4.1::gentoo >sys-devel/gcc: 11.3.0::gentoo >sys-devel/gcc-config: 2.5-r1::gentoo >sys-devel/libtool: 2.4.7::gentoo >sys-devel/make: 4.3::gentoo >sys-kernel/linux-headers: 5.18::gentoo (virtual/os-headers) >sys-libs/glibc: 2.35-r5::gentoo >Repositories: > >gentoo > location: /usr/portage > sync-type: rsync > sync-uri: rsync://rsync.gentoo.org/gentoo-portage > priority: -1000 > sync-rsync-extra-opts: > sync-rsync-verify-max-age: 24 > sync-rsync-verify-jobs: 1 > sync-rsync-verify-metamanifest: yes > >guru > location: /opt/guru > masters: gentoo > priority: 0 > >ACCEPT_KEYWORDS="amd64 ~amd64" >ACCEPT_LICENSE="* BSD-2" >CBUILD="x86_64-pc-linux-gnu" >CFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0" >CHOST="x86_64-pc-linux-gnu" >CONFIG_PROTECT="/etc /usr/share/gnupg/qualified.txt" >CONFIG_PROTECT_MASK="/etc/ca-certificates.conf /etc/env.d /etc/fonts/fonts.conf /etc/gconf /etc/gentoo-release /etc/revdep-rebuild /etc/sandbox.d /etc/terminfo" >CXXFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0" >DISTDIR="/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/distdir" >EMERGE_DEFAULT_OPTS="--with-bdeps=y -1 -k -b" >ENV_UNSET="CARGO_HOME DBUS_SESSION_BUS_ADDRESS DISPLAY GOBIN GOPATH 
PERL5LIB PERL5OPT PERLPREFIX PERL_CORE PERL_MB_OPT PERL_MM_OPT XAUTHORITY XDG_CACHE_HOME XDG_CONFIG_HOME XDG_DATA_HOME XDG_RUNTIME_DIR" >FCFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0" >FEATURES="assume-digests binpkg-docompress binpkg-dostrip binpkg-logs binpkg-multi-instance buildpkg buildpkg-live config-protect-if-modified distlocks ebuild-locks fixlafiles ipc-sandbox merge-sync multilib-strict network-sandbox news parallel-fetch pid-sandbox preserve-libs protect-owned qa-unresolved-soname-deps sandbox sfperms sign split-log strict test unknown-features-warn unmerge-logs unmerge-orphans userfetch userpriv usersandbox usersync xattr" >FFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0" >GENTOO_MIRRORS="http://mirror.leaseweb.com/gentoo/ http://ftp.snt.utwente.nl/pub/os/linux/gentoo/ http://ftp.belnet.be/pub/rsync.gentoo.org/gentoo/ http://distfiles.gentoo.org" >LANG="C.UTF8" >LDFLAGS="-Wl,-O1 -Wl,--as-needed -Wl,--defsym=__gentoo_check_ldflags__=0" >MAKEOPTS="-j18" >PKGDIR="/root/tbci/binpkg" >PORTAGE_CONFIGROOT="/" >PORTAGE_RSYNC_OPTS="--recursive --links --safe-links --perms --times --omit-dir-times --compress --force --whole-file --delete --stats --human-readable --timeout=180 --exclude=/distfiles --exclude=/local --exclude=/packages --exclude=/.git" >PORTAGE_TMPDIR="/var/tmp" >SHELL="/bin/bash" >USE="acl amd64 bzip2 cli crypt dri elogind fortran gdbm iconv ipv6 jumbo-build libglvnd libtirpc multilib native-symlinks ncurses nls nptl openmp pam pcre readline seccomp split-usr ssl test unicode xattr zlib" ABI_X86="64" ELIBC="glibc" KERNEL="linux" PYTHON_TARGETS="python3_8 python3_9 python3_10" USERLAND="GNU" >Unset: ADDR2LINE, AR, ARFLAGS, AS, ASFLAGS, CC, CCLD, CONFIG_SHELL, CPP, CPPFLAGS, CTARGET, CXX, CXXFILT, ELFEDIT, EXTRA_ECONF, F77FLAGS, FC, GCOV, GPROF, INSTALL_MASK, LC_ALL, LD, LEX, LFLAGS, LIBTOOL, LINGUAS, MAKE, MAKEFLAGS, NM, OBJCOPY, OBJDUMP, PORTAGE_BINHOST, PORTAGE_BUNZIP2_COMMAND, PORTAGE_COMPRESS, PORTAGE_COMPRESS_FLAGS, PORTAGE_RSYNC_EXTRA_OPTS, RANLIB, READELF, RUSTFLAGS, SIZE, STRINGS, STRIP, YACC, YFLAGS > > > > > >############################## ># emerge history (qlop -mv): # >############################## >2022-06-02T09:22:47 >>> dev-python/py-cpuinfo-8.0.0 >2022-06-02T09:22:51 >>> dev-python/colorama-0.4.4-r1 >2022-06-02T09:22:56 >>> dev-python/six-1.16.0-r1 >2022-06-02T09:23:01 >>> dev-python/hunter-3.4.3 >2022-06-02T09:23:07 >>> dev-python/python-dateutil-2.8.2-r1 >2022-06-02T09:23:13 >>> dev-python/fields-5.0.0-r3 >2022-06-02T09:23:19 >>> dev-python/iniconfig-1.1.1-r1 >2022-06-02T09:23:25 >>> dev-python/aspectlib-1.5.2 >2022-06-02T09:23:31 >>> dev-python/pluggy-1.0.0-r2 >2022-06-02T09:23:37 >>> dev-python/py-1.11.0-r1 >2022-06-02T09:23:43 >>> dev-python/zope-interface-5.4.0-r2 >2022-06-02T09:23:50 >>> dev-python/cython-0.29.30 >2022-06-02T09:23:56 >>> dev-python/attrs-21.4.0-r1 >2022-06-02T09:24:05 >>> dev-python/freezegun-1.2.1 >2022-06-02T09:25:07 >>> dev-python/pytest-7.1.2 >2022-06-02T09:25:13 >>> dev-python/pygal-3.0.0-r2 >2022-06-02T09:26:04 >>> dev-python/pygaljs-1.0.2-r1 >2022-06-02T09:27:12 >>> dev-python/elasticsearch-py-7.14.1 > > > > >####################################### ># installed packages (qlist -ICvUSS): # >####################################### >acct-group/audio-0-r1:0 >acct-group/cdrom-0-r1:0 >acct-group/dialout-0-r1:0 >acct-group/disk-0-r1:0 >acct-group/input-0-r1:0 >acct-group/kmem-0-r1:0 >acct-group/kvm-0-r1:0 >acct-group/lp-0-r1:0 
>acct-group/man-0-r1:0 >acct-group/messagebus-0-r1:0 >acct-group/polkitd-0-r1:0 >acct-group/portage-0:0 >acct-group/render-0-r1:0 >acct-group/sgx-0:0 >acct-group/sshd-0-r1:0 >acct-group/tape-0-r1:0 >acct-group/tty-0-r1:0 >acct-group/video-0-r1:0 >acct-user/man-1-r1:0 >acct-user/messagebus-0-r1:0 >acct-user/polkitd-0-r1:0 >acct-user/portage-0:0 >acct-user/sshd-0-r1:0 >app-admin/eselect-1.4.20:0 -doc -emacs -vim-syntax >app-admin/perl-cleaner-2.30:0 >app-arch/bzip2-1.0.8-r1:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 split-usr -static -static-libs -verify-sig >app-arch/gzip-1.12:0 -pic -static -verify-sig >app-arch/libarchive-3.6.1:0/13 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 acl -blake2 bzip2 e2fsprogs -expat iconv -lz4 lzma -lzo -nettle -static-libs -verify-sig xattr -zstd >app-arch/tar-1.34:0 acl -minimal nls -selinux -verify-sig xattr >app-arch/unzip-6.0_p26:0 bzip2 -natspec unicode >app-arch/xz-utils-5.2.5-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 extra-filters nls split-usr -static-libs -verify-sig >app-arch/zstd-1.5.2:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -lz4 -static-libs threads >app-crypt/gnupg-2.3.6:0 bzip2 -doc -ldap nls readline -selinux smartcard ssl -test tofu -tools -tpm -usb -user-socket -verify-sig -wks-server >app-crypt/gpgme-1.17.1-r1:1/11.6.15 -common-lisp cxx -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -qt5 -static-libs -test -verify-sig >app-crypt/libb2-0.98.1-r3:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -native-cflags openmp -static-libs >app-crypt/libmd-1.0.4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 >app-crypt/pinentry-1.2.0:0 -caps -efl -emacs -gnome-keyring -gtk ncurses -qt5 >app-crypt/rhash-1.4.2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug nls ssl -static-libs >app-editors/nano-6.3:0 -debug -justify -magic -minimal ncurses nls spell split-usr -static unicode >app-eselect/eselect-fontconfig-20220403:0 >app-eselect/eselect-iptables-20220320:0 >app-eselect/eselect-lib-bin-symlink-0.1.1-r1:0 >app-eselect/eselect-pinentry-0.7.2:0 >app-i18n/man-pages-ja-20180315-r1:0 >app-i18n/man-pages-l10n-4.12.1-r2:0 l10n_cs l10n_da l10n_de l10n_el l10n_es l10n_fi l10n_fr l10n_hu l10n_id l10n_it l10n_mk l10n_nb l10n_nl l10n_pl l10n_pt-BR l10n_ro l10n_sr l10n_sv >app-i18n/man-pages-ru-5.03.2390.2390.20191017-r1:0 >app-i18n/man-pages-zh_CN-1.6.3.6:0 >app-misc/c_rehash-1.7-r1:0 >app-misc/ca-certificates-20211016.3.77:0 -cacert >app-misc/editor-wrapper-4-r1:0 >app-misc/mime-types-2.1.53:0 -nginx >app-misc/pax-utils-1.3.4:0 -caps -debug -python -python_single_target_python3_10 -python_single_target_python3_8 python_single_target_python3_9 seccomp >app-misc/tmux-3.2a:0 -debug -selinux -utempter -vim-syntax >app-portage/eix-0.36.2:0 -debug -doc nls -sqlite >app-portage/elt-patches-20211104:0 >app-portage/gemato-16.2:0 gpg python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test -tools >app-portage/gentoolkit-0.5.1-r1:0 python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test 
>app-portage/portage-utils-0.93.3:0 nls openmp qmanifest qtegrity -static >app-shells/bash-5.1_p16:0 -afs -bashlogger -examples -mem-scramble net nls -plugins readline -verify-sig >app-shells/bash-completion-2.11:0 eselect -test >app-shells/gentoo-bashcomp-20190211:0 >app-shells/push-3.4:0 >app-shells/quoter-4.2:0 >app-text/ansifilter-2.18:0 -qt5 >app-text/build-docbook-catalog-2.3:0 >app-text/docbook-xml-dtd-4.5-r2:4.5 >app-text/docbook-xml-dtd-4.4-r3:4.4 >app-text/docbook-xml-dtd-4.2-r3:4.2 >app-text/docbook-xml-dtd-4.1.2-r7:4.1.2 >app-text/docbook-xsl-stylesheets-1.79.1-r2:0 -ruby >app-text/manpager-1:0 >app-text/opensp-1.5.2-r7:0 -doc nls -static-libs -test >app-text/po4a-0.66:0 -test -test >app-text/sgml-common-0.6.3-r7:0 >app-text/xmlto-0.0.28-r8:0 -latex text >dev-db/sqlite-3.38.5:3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -doc -icu readline -secure-delete -static-libs -tcl -test -tools >dev-lang/duktape-2.7.0-r1:0/2.7.0 >dev-lang/perl-5.34.1-r3:0/5.34 -berkdb -debug -doc gdbm ithreads -minimal -quadmath >dev-lang/python-3.11.0_beta2_p1:3.11 -bluetooth -build -examples gdbm -hardened -libedit -lto ncurses -pgo readline sqlite ssl -test -tk -verify-sig -wininst >dev-lang/python-3.10.4_p1:3.10 -bluetooth -build -examples gdbm -hardened -libedit -lto ncurses -pgo readline sqlite ssl -test -tk -verify-sig -wininst xml >dev-lang/python-3.9.13:3.9 -bluetooth -build -examples gdbm -hardened -lto ncurses -pgo readline sqlite ssl -test -tk -verify-sig -wininst xml >dev-lang/python-3.8.13_p2:3.8 -bluetooth -build -examples gdbm -hardened ncurses readline sqlite ssl -test -tk -verify-sig -wininst xml >dev-lang/python-2.7.18_p15:2.7 -berkdb -bluetooth -build -examples gdbm -hardened ncurses readline sqlite ssl -tk -verify-sig -wininst xml >dev-lang/python-exec-2.4.9:2 native-symlinks python_targets_pypy3 python_targets_python3_10 python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-lang/python-exec-conf-2.4.6:2 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 >dev-lang/tcl-8.6.12:0/8.6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug threads >dev-libs/boehm-gc-8.0.6:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cxx large -static-libs threads >dev-libs/elfutils-0.187:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 -lzma nls -static-libs -test -threads utils -valgrind -verify-sig -zstd >dev-libs/expat-2.4.8:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -examples -static-libs unicode >dev-libs/glib-2.72.2:2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -dbus -debug elf -fam -gtk-doc mime -selinux -static-libs -sysprof -systemtap -test -utils xattr >dev-libs/gmp-6.2.1-r2:0/10.4 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm cxx -doc -pic -static-libs >dev-libs/gobject-introspection-1.72.0:0 -doctool -gtk-doc -python_single_target_python3_10 -python_single_target_python3_8 python_single_target_python3_9 -test >dev-libs/gobject-introspection-common-1.72.0:0 >dev-libs/isl-0.24-r2:0/23 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 
abi_x86_64 -abi_x86_x32 -static-libs >dev-libs/jsoncpp-1.9.5:0/25 -doc -test >dev-libs/libassuan-2.5.5:0 >dev-libs/libatomic_ops-7.6.12:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 >dev-libs/libbsd-0.11.6:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs -verify-sig >dev-libs/libevent-2.1.12:0/2.1-7 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 clock-gettime -debug -malloc-replacement ssl -static-libs -test threads -verbose-debug >dev-libs/libffi-3.4.2-r1:0/8 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -exec-static-trampoline -pax-kernel -static-libs -test >dev-libs/libgcrypt-1.10.1:0/20 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm -cpu_flags_arm_aes -cpu_flags_arm_neon -cpu_flags_arm_sha1 -cpu_flags_arm_sha2 -cpu_flags_ppc_altivec -cpu_flags_ppc_vsx2 -cpu_flags_ppc_vsx3 cpu_flags_x86_aes cpu_flags_x86_avx cpu_flags_x86_avx2 -cpu_flags_x86_padlock -cpu_flags_x86_sha cpu_flags_x86_sse4_1 -doc -static-libs -verify-sig >dev-libs/libgpg-error-1.45:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -common-lisp nls -static-libs -test >dev-libs/libksba-1.6.0:0 -static-libs >dev-libs/libltdl-2.4.7:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs >dev-libs/libpcre-8.45-r1:3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 cxx jit -libedit pcre16 pcre32 readline split-usr -static-libs unicode zlib >dev-libs/libpcre2-10.40:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 jit -libedit pcre16 pcre32 readline split-usr -static-libs unicode -verify-sig zlib >dev-libs/libpipeline-1.5.6:0 -test >dev-libs/libtasn1-4.18.0:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs -test -valgrind >dev-libs/libunistring-1.0:0/2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -static-libs >dev-libs/libuv-1.44.1:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 >dev-libs/libxml2-2.9.14-r1:2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -examples -icu -lzma python python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 readline -static-libs -test >dev-libs/libxslt-1.1.35:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 crypt -debug -examples -static-libs >dev-libs/lzo-2.10:2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -examples split-usr -static-libs >dev-libs/mpc-1.2.1:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs >dev-libs/mpfr-4.1.0_p13-r1:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs >dev-libs/nettle-3.7.3:0/8-6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 
-abi_x86_x32 asm -cpu_flags_arm_neon cpu_flags_x86_aes -cpu_flags_x86_sha -doc gmp -static-libs -test >dev-libs/npth-1.6-r1:0 -test >dev-libs/openssl-1.1.1o:0/1.1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm cpu_flags_x86_sse2 -rfc3779 -sctp -sslv3 -static-libs -test -tls-compression -tls-heartbeat -vanilla -verify-sig -verify-sig -weak-ssl-ciphers >dev-libs/popt-1.18:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls -static-libs >dev-perl/Devel-CheckLib-1.140.0:0 -test >dev-perl/Encode-EUCJPASCII-0.30.0-r1:0 -test >dev-perl/Encode-HanExtra-0.230.0-r3:0 >dev-perl/Encode-Locale-1.50.0-r1:0 -test >dev-perl/File-BaseDir-0.90.0:0 -test >dev-perl/File-DesktopEntry-0.220.0-r1:0 -test >dev-perl/File-Listing-6.140.0:0 -test >dev-perl/File-MimeInfo-0.300.0:0 -test >dev-perl/HTML-Parser-3.760.0:0 -test >dev-perl/HTML-Tagset-3.200.0-r2:0 >dev-perl/HTTP-Cookies-6.100.0:0 -test >dev-perl/HTTP-Date-6.50.0:0 >dev-perl/HTTP-Message-6.330.0:0 -test -test >dev-perl/HTTP-Negotiate-6.10.0-r2:0 -test >dev-perl/IO-HTML-1.4.0:0 -test >dev-perl/IO-Socket-INET6-2.720.0-r2:0 -test >dev-perl/IO-Socket-SSL-2.74.0:0 -examples -idn -test >dev-perl/IPC-System-Simple-1.300.0:0 -test >dev-perl/libwww-perl-6.600.0-r1:0 ssl -test >dev-perl/Locale-gettext-1.70.0-r1:0 -test >dev-perl/LWP-MediaTypes-6.40.0:0 -test >dev-perl/LWP-Protocol-https-6.100.0:0 -test >dev-perl/MIME-Charset-1.12.2-r1:0 l10n_ja l10n_zh -test >dev-perl/Module-Build-0.423.100:0 -test >dev-perl/Mozilla-CA-20999999-r1:0 -test >dev-perl/Net-HTTP-6.210.0:0 -minimal -test >dev-perl/Net-SSLeay-1.920.0:0 -examples -examples -minimal -test >dev-perl/Pod-Parser-1.630.0-r1:0 -test >dev-perl/SGMLSpm-1.1-r2:0 -test >dev-perl/Socket6-0.290.0:0 -test >dev-perl/TermReadKey-2.380.0:0 -examples -test >dev-perl/Text-CharWidth-0.40.0-r2:0 -test >dev-perl/Text-WrapI18N-0.60.0-r2:0 -test >dev-perl/TimeDate-2.330.0-r1:0 -test >dev-perl/Try-Tiny-0.310.0:0 -minimal -test >dev-perl/Unicode-LineBreak-2019.1.0:0 >dev-perl/URI-5.100.0:0 -test >dev-perl/WWW-RobotRules-6.20.0-r2:0 -test >dev-perl/XML-Parser-2.460.0-r2:0 >dev-perl/YAML-Tiny-1.730.0-r1:0 -minimal -test >dev-python/appdirs-1.4.4-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 >dev-python/aspectlib-1.5.2:0 -doc python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test >dev-python/attrs-21.4.0-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/certifi-3021.3.16-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/charset_normalizer-2.0.12:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/colorama-0.4.4-r1:0 -examples python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/cython-0.29.30:0 -doc -emacs python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/elasticsearch-py-7.14.1:0 -async -doc -doc python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test -test >dev-python/fields-5.0.0-r3:0 python_targets_pypy3 
python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/flit_core-3.7.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/freezegun-1.2.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/gpep517-6:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/hunter-3.4.3:0 -doc python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test >dev-python/idna-3.3-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/importlib_metadata-4.11.4:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/importlib_resources-5.7.1:0 -doc python_targets_pypy3 python_targets_python3_8 -test >dev-python/iniconfig-1.1.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/installer-0.5.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/jaraco-context-4.1.1-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/jaraco-functools-3.5.0-r2:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/jaraco-text-3.7.0-r2:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/jinja-3.1.2:0 -doc -examples python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/markupsafe-2.1.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/more-itertools-8.13.0:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/nspektr-0.3.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/ordered-set-4.1.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/packaging-21.3-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/pluggy-1.0.0-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/py-1.11.0-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/py-cpuinfo-8.0.0:0 python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 >dev-python/pygal-3.0.0-r2:0 -doc python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test >dev-python/pygaljs-1.0.2-r1:0 
python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test >dev-python/pyparsing-3.0.9:0 -examples python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/pypy3-7.3.9_p1:0/pypy39-pp73 bzip2 gdbm jit ncurses -sqlite -test -tk >dev-python/pypy3-exe-7.3.9:3.9-7.3.9 bzip2 -cpu_flags_x86_sse2 jit -low-memory ncurses >dev-python/PySocks-1.7.1-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 >dev-python/pytest-7.1.2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/python-dateutil-2.8.2-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/requests-2.27.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -socks5 -test >dev-python/setuptools-62.3.2-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/setuptools_scm-6.4.2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/six-1.16.0-r1:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/tomli-2.0.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/urllib3-1.26.9-r1:0 -brotli python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/wheel-0.37.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/zipp-3.8.0:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-python/zope-interface-5.4.0-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test >dev-util/checkbashisms-2.22.1:0 >dev-util/cmake-3.23.2:0 -doc -emacs ncurses -qt5 -test -test -verify-sig >dev-util/desktop-file-utils-0.26-r2:0 -emacs >dev-util/glib-utils-2.72.2:0 -python_single_target_python3_10 -python_single_target_python3_11 -python_single_target_python3_8 python_single_target_python3_9 >dev-util/gperf-3.1:0 >dev-util/gtk-doc-am-1.33.2:0 >dev-util/intltool-0.51.0-r2:0 >dev-util/meson-0.62.1:0 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test >dev-util/meson-format-array-0:0 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 >dev-util/ninja-1.11.0:0 -doc -emacs -test -vim-syntax >dev-util/pkgconf-1.8.0-r1:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -test >dev-util/re2c-2.2:0 -debug -test >dev-vcs/git-2.35.1:0 blksha1 -cgi curl -cvs -doc -emacs -gnome-keyring gpg -highlight iconv -mediawiki -mediawiki-experimental nls pcre -perforce -perl -ppcsha1 -python_single_target_python3_10 -python_single_target_python3_8 python_single_target_python3_9 -subversion -test threads -tk webdav -xinetd 
>media-fonts/liberation-fonts-2.1.3:0 -X -X -fontforge >media-gfx/graphite2-1.3.14_p20210810-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -perl -test >media-libs/fontconfig-2.14.0-r1:1.0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -static-libs -test >media-libs/freetype-2.12.1:2 -X -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 adobe-cff -brotli bzip2 cleartype-hinting -debug -doc -fontforge harfbuzz -infinality png -static-libs -svg -utils >media-libs/harfbuzz-4.3.0:0/4.0.0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 cairo -debug -doc -experimental glib graphite -icu introspection -test truetype >media-libs/libpng-1.6.37-r2:0/16 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -apng -cpu_flags_arm_neon cpu_flags_x86_sse -static-libs >net-dns/libidn2-2.3.2:0/2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs -verify-sig >net-firewall/iptables-1.8.8-r2:0/1.8.3 -conntrack -netlink -nftables -pcap split-usr -static-libs >net-libs/gnutls-3.7.6:0/30.30 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -brotli cxx -dane -doc -examples -guile idn nls openssl -pkcs11 seccomp -sslv2 -sslv3 -static-libs -test -test-full tls-heartbeat -tools -valgrind -verify-sig zlib -zstd >net-libs/libmnl-1.0.5:0/0.2.0 -examples -verify-sig >net-libs/libnsl-2.0.0-r1:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs >net-libs/libtirpc-1.3.2:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 ipv6 -kerberos split-usr -static-libs >net-libs/nghttp2-1.47.0:0/1.14 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cxx -debug -hpack-tools -jemalloc -static-libs -test threads -utils -xml >net-misc/curl-7.83.1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -adns -alt-svc -brotli -curl_ssl_gnutls -curl_ssl_mbedtls -curl_ssl_nss curl_ssl_openssl ftp -gnutls -gopher -hsts http2 -idn imap ipv6 -kerberos -ldap -mbedtls -nghttp3 -nss openssl pop3 progress-meter -quiche -rtmp -samba smtp -ssh ssl -sslv3 -static-libs -telnet -test tftp -threads -verify-sig -zstd >net-misc/dhcpcd-9.4.1:0 -debug embedded ipv6 -privsep udev >net-misc/iputils-20211215:0 arping -caps -clockdiff -doc filecaps -idn nls -rarpd -rdisc -static -test -tracepath >net-misc/netifrc-0.7.3-r1:0 dhcp >net-misc/openssh-9.0_p1-r1:0 -X -X509 -abi_mips_n32 -audit -debug -hpn -kerberos -ldns -libedit -livecd pam pie -sctp -security-key -selinux ssl -static -test -verify-sig -xmss >net-misc/rsync-3.2.4-r1:0 acl -examples iconv ipv6 -lz4 -python_single_target_python3_10 -python_single_target_python3_8 python_single_target_python3_9 ssl -stunnel -system-zlib -verify-sig xattr -xxhash -zstd >net-misc/wget-1.21.3:0 -cookie-check -debug -gnutls -idn ipv6 -metalink nls -ntlm pcre ssl -static -test -uuid -verify-sig zlib >perl-core/CPAN-2.290.0-r1:0 >perl-core/Encode-3.120.0:0 >perl-core/File-Temp-0.231.100:0 >perl-core/Scalar-List-Utils-1.560.0:0 >sec-keys/openpgp-keys-gentoo-release-20220101:0 -test >sys-apps/acl-2.3.1-r1:0 
-abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls split-usr -static-libs >sys-apps/attr-2.5.1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug nls split-usr -static-libs >sys-apps/baselayout-2.8:0 -build split-usr >sys-apps/coreutils-9.1-r1:0 acl -caps -gmp -hostname -kill -multicall nls -selinux split-usr -static -test -vanilla -verify-sig xattr >sys-apps/dbus-1.14.0-r1:0 -X -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -doc elogind -selinux -static-libs -systemd -test -test >sys-apps/debianutils-5.7:0 installkernel -static >sys-apps/diffutils-3.8:0 nls -static -verify-sig >sys-apps/file-5.41-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 -lzma -python python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -seccomp -static-libs zlib >sys-apps/findutils-4.9.0:0 nls -selinux -static -test -verify-sig >sys-apps/gawk-5.1.1-r2:0 -mpfr nls readline -verify-sig >sys-apps/gentoo-functions-0.15:0 >sys-apps/grep-3.7:0 nls pcre -static -verify-sig >sys-apps/groff-1.22.4:0 -X -examples -uchardet >sys-apps/help2man-1.48.5:0 nls >sys-apps/install-xattr-0.8:0 >sys-apps/iproute2-5.18.0-r1:0 -atm -berkdb -bpf -caps -elf iptables -libbsd -minimal -nfs -selinux split-usr >sys-apps/kbd-2.4.0:0 nls pam -test >sys-apps/kmod-29:0 -debug -doc lzma -pkcs7 -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -static-libs tools zlib -zstd >sys-apps/less-590:0 pcre unicode >sys-apps/man-db-2.10.2-r1:0 manpager nls seccomp -selinux -static-libs zlib >sys-apps/man-pages-5.13:0 l10n_de l10n_es l10n_fr l10n_it l10n_ja l10n_nl l10n_pl l10n_pt-BR l10n_ro l10n_ru l10n_zh-CN >sys-apps/man-pages-posix-2017a:0 >sys-apps/miscfiles-1.5-r4:0 -minimal >sys-apps/net-tools-2.10:0 arp hostname ipv6 -nis nls -plipconfig -selinux -slattach -static >sys-apps/openrc-0.44.10:0 -audit -bash -debug ncurses netifrc -newnet pam -selinux -sysv-utils unicode >sys-apps/portage-3.0.30-r5:0 -apidoc -build -doc -gentoo-dev ipc native-extensions python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 rsync-verify -selinux -test xattr >sys-apps/sandbox-2.29:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 abi_x86_32 abi_x86_64 -abi_x86_x32 nnp >sys-apps/sed-4.8:0 acl nls -selinux -static -verify-sig >sys-apps/shadow-4.11.1:0/4 acl -audit -bcrypt -cracklib nls pam -selinux -skey split-usr -su xattr >sys-apps/systemd-utils-250.6:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 acl -boot kmod -selinux split-usr -sysusers -test tmpfiles udev >sys-apps/sysvinit-3.04:0 -ibm nls -selinux -static -verify-sig >sys-apps/texinfo-6.8:0 nls standalone -static >sys-apps/util-linux-2.38:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -audit -build -caps cramfs -cryptsetup -fdformat hardlink -kill logger -magic ncurses nls pam -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 readline -rtas -selinux -slang split-usr -static-libs su suid -systemd -test -tty-helpers -udev unicode -verify-sig >sys-apps/which-2.21:0 >sys-auth/elogind-246.10-r2:0 acl -audit cgroup-hybrid -debug -doc pam policykit -selinux -test 
>sys-auth/pambase-20220214:0 -caps -debug elogind -gnome-keyring -homed -minimal -mktemp nullok -pam_krb5 -pam_ssh passwdqc -pwhistory -pwquality -securetty -selinux sha512 -systemd -yescrypt >sys-auth/passwdqc-2.0.2-r1:0 >sys-auth/polkit-0.120_p20220509:0 duktape -examples -gtk introspection -kde pam -selinux -systemd -test >sys-devel/autoconf-2.71-r1:2.71 -emacs >sys-devel/autoconf-archive-2022.02.11:0 >sys-devel/autoconf-wrapper-20220130:0 >sys-devel/automake-1.16.5:1.16 -test >sys-devel/automake-wrapper-11:0 >sys-devel/binutils-2.38-r2:2.38 -cet -default-gold -doc gold -multitarget nls -pgo plugins -static-libs -test -vanilla >sys-devel/binutils-config-5.4.1:0 native-symlinks >sys-devel/bison-3.8.2:0 -examples nls -static -test -verify-sig >sys-devel/flex-2.6.4-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls -static -test >sys-devel/gcc-11.3.0:11 -ada -cet -custom-cflags cxx -d -debug -doc -fixed-point fortran -go graphite -hardened -jit -libssp lto multilib nls nptl -objc -objc++ -objc-gc openmp -pch -pgo pie sanitize ssp -systemtap -test -valgrind -vanilla -vtv -zstd >sys-devel/gcc-config-2.5-r1:0 cc-wrappers native-symlinks >sys-devel/gettext-0.21-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 acl -cvs cxx -doc -emacs -git -java -java ncurses nls openmp -static-libs -verify-sig >sys-devel/gnuconfig-20220508:0 >sys-devel/libtool-2.4.7:2 -vanilla >sys-devel/m4-1.4.19:0 -examples nls -verify-sig >sys-devel/make-4.3:0 -guile nls -static -verify-sig >sys-devel/patch-2.7.6-r4:0 -static -test -verify-sig xattr >sys-fs/e2fsprogs-1.46.5-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cron -fuse -lto nls split-usr -static-libs -test threads tools >sys-fs/udev-init-scripts-35:0 >sys-kernel/installkernel-gentoo-5:0 -grub >sys-kernel/linux-headers-5.18:0 -experimental-loong -headers-only >sys-libs/binutils-libs-2.38-r2:0/2.38 -64-bit-bfd -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cet -multitarget nls -static-libs >sys-libs/gdbm-1.23:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 berkdb nls readline -static-libs -verify-sig >sys-libs/glibc-2.35-r5:2.2 -audit -caps -cet clone3 -compile-locales -crypt -custom-cflags -doc -experimental-loong -gd -headers-only multiarch multilib -multilib-bootstrap -nscd -profile -selinux ssp static-libs -suid -systemd -systemtap -test -vanilla >sys-libs/libcap-2.64:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 pam split-usr -static-libs -tools >sys-libs/libseccomp-2.5.4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -static-libs -test >sys-libs/libxcrypt-4.4.28-r1:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 abi_x86_32 abi_x86_64 -abi_x86_x32 compat split-usr -static-libs system -test >sys-libs/ncurses-6.3_p20220423:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -ada cxx -debug -doc -gpm -minimal -profile split-usr -static-libs -test tinfo -trace -verify-sig >sys-libs/pam-1.5.2-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 
-abi_x86_x32 -audit -berkdb -debug filecaps -nis -selinux >sys-libs/readline-8.1_p2:0/8 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 split-usr -static-libs unicode -utils -verify-sig >sys-libs/timezone-data-2022a:0 -leaps-timezone nls -zic-slim >sys-libs/zlib-1.2.12-r2:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 minizip split-usr -static-libs -verify-sig >sys-process/procps-3.3.17-r1:0/8 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 elogind kill -modern-top ncurses nls -selinux split-usr -static-libs -systemd -test unicode >sys-process/psmisc-23.4-r1:0 -X ipv6 nls -selinux >virtual/acl-0-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs >virtual/awk-1:0 >virtual/dev-manager-0-r2:0 >virtual/editor-0-r3:0 >virtual/libc-1-r1:0 >virtual/libcrypt-2:0/2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs >virtual/libelf-3:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 >virtual/libiconv-0-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 >virtual/libintl-0-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 >virtual/libudev-232-r7:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -systemd >virtual/man-0-r4:0 >virtual/os-headers-0-r2:0 >virtual/package-manager-1:0 >virtual/pager-0:0 >virtual/perl-Carp-1.520.0-r2:0 >virtual/perl-Compress-Raw-Bzip2-2.103.0-r2:0 >virtual/perl-Compress-Raw-Zlib-2.103.0-r1:0 >virtual/perl-CPAN-2.290.0:0 >virtual/perl-CPAN-Meta-2.150.10-r6:0 >virtual/perl-CPAN-Meta-Requirements-2.140.0-r8:0 >virtual/perl-CPAN-Meta-YAML-0.18.0-r8:0 >virtual/perl-Data-Dumper-2.179.0:0 >virtual/perl-Digest-MD5-2.580.0-r1:0 >virtual/perl-Encode-3.120.0:0 >virtual/perl-Exporter-5.760.0:0 >virtual/perl-ExtUtils-CBuilder-0.280.236-r1:0 >virtual/perl-ExtUtils-Install-2.200.0-r1:0 >virtual/perl-ExtUtils-MakeMaker-7.620.0:0 >virtual/perl-ExtUtils-Manifest-1.730.0-r1:0 >virtual/perl-ExtUtils-ParseXS-3.430.0:0 >virtual/perl-File-Path-2.180.0-r1:0 >virtual/perl-File-Spec-3.800.0:0 >virtual/perl-File-Temp-0.231.100:0 >virtual/perl-Getopt-Long-2.520.0-r1:0 >virtual/perl-IO-1.460.0:0 >virtual/perl-IO-Compress-2.103.0-r1:0 >virtual/perl-IO-Socket-IP-0.410.0-r1:0 >virtual/perl-JSON-PP-4.60.0:0 >virtual/perl-libnet-3.130.0:0 ssl >virtual/perl-MIME-Base64-3.160.0-r1:0 >virtual/perl-Module-Metadata-1.0.37-r2:0 >virtual/perl-parent-0.238.0-r2:0 >virtual/perl-Parse-CPAN-Meta-2.150.10-r6:0 >virtual/perl-Perl-OSType-1.10.0-r6:0 >virtual/perl-podlators-4.140.0-r3:0 >virtual/perl-Scalar-List-Utils-1.560.0:0 >virtual/perl-Test-Harness-3.430.0:0 >virtual/perl-Text-ParseWords-3.300.0-r8:0 >virtual/perl-Time-Local-1.300.0-r1:0 >virtual/perl-version-0.992.800:0 >virtual/perl-XSLoader-0.300.0-r4:0 >virtual/pkgconfig-2-r1:0 >virtual/service-manager-1:0 >virtual/ssh-0:0 -minimal >virtual/tmpfiles-0-r3:0 >virtual/ttf-fonts-1-r1:0 >virtual/udev-217-r5:0 >virtual/w3m-1:0 >virtual/yacc-0:0 >www-client/pybugz-0.13-r2:0 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 >www-client/w3m-0.5.3_p20220429:0 -X -fbcon -gdk-pixbuf -gpm -imlib l10n_ja -lynxkeymap nls 
-nntp ssl unicode -xface >x11-apps/xprop-1.2.5:0 >x11-apps/xset-1.2.4-r1:0 >x11-base/xcb-proto-1.15:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 >x11-base/xorg-proto-2022.1:0 -test >x11-libs/cairo-1.16.0-r5:0 X -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -aqua -debug -gles2-only glib -opengl -static-libs svg -utils -valgrind >x11-libs/libICE-1.0.10-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 ipv6 >x11-libs/libSM-1.2.3-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc ipv6 uuid >x11-libs/libX11-1.7.5:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc ipv6 -test >x11-libs/libXau-1.0.9-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc >x11-libs/libxcb-1.15:0/1.12 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -doc -selinux -test xkb >x11-libs/libXdmcp-1.1.3-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc >x11-libs/libXext-1.3.4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc >x11-libs/libXmu-1.1.3:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc ipv6 >x11-libs/libXrender-0.9.10-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 >x11-libs/libXt-1.2.1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -test >x11-libs/pixman-0.40.0:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cpu_flags_arm_iwmmxt -cpu_flags_arm_iwmmxt2 -cpu_flags_arm_neon -cpu_flags_ppc_altivec cpu_flags_x86_mmxext cpu_flags_x86_sse2 cpu_flags_x86_ssse3 -loongson2f -static-libs -test >x11-libs/xtrans-1.4.0:0 -doc >x11-misc/compose-tables-1.8:0 >x11-misc/shared-mime-info-2.2:0 -test >x11-misc/xdg-utils-1.1.3_p20210805:0 -dbus -doc -gnome > > >####################### ># build.log # >####################### >>>> Unpacking source... >>>> Unpacking pytest-benchmark-3.4.1.gh.tar.gz to /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work >>>> Source unpacked in /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work >>>> Preparing source in /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1 ... > * Build system packages: > * dev-python/setuptools : 62.3.2-r1 >>>> Source prepared. >>>> Configuring source in /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1 ... >>>> Source configured. >>>> Compiling source in /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1 ... 
> * python3_8: running distutils-r1_run_phase distutils-r1_python_compile >python3.8 setup.py build -j 18 >running build >running build_py >creating /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/utils.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/timers.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/table.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/stats.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/session.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/plugin.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/pep418.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/logger.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/hookspec.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/histogram.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/fixture.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/csv.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/compat.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/cli.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/__main__.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >copying src/pytest_benchmark/__init__.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark >creating /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark/storage >copying src/pytest_benchmark/storage/file.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark/storage >copying src/pytest_benchmark/storage/elasticsearch.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark/storage >copying src/pytest_benchmark/storage/__init__.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_8/lib/pytest_benchmark/storage >running egg_info >creating src/pytest_benchmark.egg-info >writing src/pytest_benchmark.egg-info/PKG-INFO >writing dependency_links to src/pytest_benchmark.egg-info/dependency_links.txt >writing entry 
points to src/pytest_benchmark.egg-info/entry_points.txt >writing requirements to src/pytest_benchmark.egg-info/requires.txt >writing top-level names to src/pytest_benchmark.egg-info/top_level.txt >writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt' >listing git files failed - pretending there aren't any >reading manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt' >reading manifest template 'MANIFEST.in' >warning: no previously-included files matching '*.py[cod]' found anywhere in distribution >warning: no previously-included files matching '__pycache__/*' found anywhere in distribution >warning: no previously-included files matching '*.so' found anywhere in distribution >warning: no previously-included files matching '*.dylib' found anywhere in distribution >adding license file 'LICENSE' >adding license file 'AUTHORS.rst' >writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt' >warning: build_py: byte-compiling is disabled, skipping. > > * python3_9: running distutils-r1_run_phase distutils-r1_python_compile >python3.9 setup.py build -j 18 >running build >running build_py >creating /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/utils.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/timers.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/table.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/stats.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/session.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/plugin.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/pep418.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/logger.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/hookspec.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/histogram.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/fixture.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/csv.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/compat.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/cli.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying src/pytest_benchmark/__main__.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >copying 
src/pytest_benchmark/__init__.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark >creating /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark/storage >copying src/pytest_benchmark/storage/file.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark/storage >copying src/pytest_benchmark/storage/elasticsearch.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark/storage >copying src/pytest_benchmark/storage/__init__.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_9/lib/pytest_benchmark/storage >running egg_info >writing src/pytest_benchmark.egg-info/PKG-INFO >writing dependency_links to src/pytest_benchmark.egg-info/dependency_links.txt >writing entry points to src/pytest_benchmark.egg-info/entry_points.txt >writing requirements to src/pytest_benchmark.egg-info/requires.txt >writing top-level names to src/pytest_benchmark.egg-info/top_level.txt >listing git files failed - pretending there aren't any >reading manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt' >reading manifest template 'MANIFEST.in' >warning: no previously-included files matching '*.py[cod]' found anywhere in distribution >warning: no previously-included files matching '__pycache__/*' found anywhere in distribution >warning: no previously-included files matching '*.so' found anywhere in distribution >warning: no previously-included files matching '*.dylib' found anywhere in distribution >adding license file 'LICENSE' >adding license file 'AUTHORS.rst' >writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt' >warning: build_py: byte-compiling is disabled, skipping. 
>
> * python3_10: running distutils-r1_run_phase distutils-r1_python_compile
>python3.10 setup.py build -j 18
>running build
>running build_py
>creating /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/utils.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/timers.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/table.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/stats.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/session.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/plugin.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/pep418.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/logger.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/hookspec.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/histogram.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/fixture.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/csv.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/compat.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/cli.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/__main__.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>copying src/pytest_benchmark/__init__.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark
>creating /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark/storage
>copying src/pytest_benchmark/storage/file.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark/storage
>copying src/pytest_benchmark/storage/elasticsearch.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark/storage
>copying src/pytest_benchmark/storage/__init__.py -> /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1-python3_10/lib/pytest_benchmark/storage
>running egg_info
>writing src/pytest_benchmark.egg-info/PKG-INFO
>writing dependency_links to src/pytest_benchmark.egg-info/dependency_links.txt
>writing entry points to src/pytest_benchmark.egg-info/entry_points.txt
>writing requirements to src/pytest_benchmark.egg-info/requires.txt
>writing top-level names to src/pytest_benchmark.egg-info/top_level.txt
>listing git files failed - pretending there aren't any
>reading manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
>reading manifest template 'MANIFEST.in'
>warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
>warning: no previously-included files matching '__pycache__/*' found anywhere in distribution
>warning: no previously-included files matching '*.so' found anywhere in distribution
>warning: no previously-included files matching '*.dylib' found anywhere in distribution
>adding license file 'LICENSE'
>adding license file 'AUTHORS.rst'
>writing manifest file 'src/pytest_benchmark.egg-info/SOURCES.txt'
>warning: build_py: byte-compiling is disabled, skipping.
>
> * Checking whether python3_10 is suitable ...
> * >=dev-lang/python-3.10.0_p1-r1:3.10 ...
> [ ok ]
> * python_check_deps ...
> [ ok ]
> * Using python3.10 in global scope
> * python3_10: running distutils-r1_run_phase python_compile_all
>>>> Source compiled.
>>>> Test phase: dev-python/pytest-benchmark-3.4.1
> * python3_8: running distutils-r1_run_phase python_test
>python3.8 -m pytest -vv -ra -l -Wdefault --color=no -o console_output_style=count -p no:cov -p no:flake8 -p no:flakes -p no:pylint -p no:markdown --deselect tests/test_cli.py::test_help --deselect tests/test_cli.py::test_help_compare -o markers=benchmark
>============================= test session starts ==============================
>platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /usr/bin/python3.8
>cachedir: .pytest_cache
>rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1, configfile: setup.cfg, testpaths: tests
>plugins: aspectlib-1.5.2
>collecting ... collected 232 items / 6 deselected / 226 selected
>
>tests/test_benchmark.py::test_help FAILED [ 1/226]
>tests/test_benchmark.py::test_groups FAILED [ 2/226]
>tests/test_benchmark.py::test_group_by_name FAILED [ 3/226]
>tests/test_benchmark.py::test_group_by_func FAILED [ 4/226]
>tests/test_benchmark.py::test_group_by_fullfunc FAILED [ 5/226]
>tests/test_benchmark.py::test_group_by_param_all FAILED [ 6/226]
>tests/test_benchmark.py::test_group_by_param_select FAILED [ 7/226]
>tests/test_benchmark.py::test_group_by_param_select_multiple FAILED [ 8/226]
>tests/test_benchmark.py::test_group_by_fullname FAILED [ 9/226]
>tests/test_benchmark.py::test_double_use FAILED [ 10/226]
>tests/test_benchmark.py::test_only_override_skip FAILED [ 11/226]
>tests/test_benchmark.py::test_fixtures_also_skipped FAILED [ 12/226]
>tests/test_benchmark.py::test_conflict_between_only_and_disable FAILED [ 13/226]
>tests/test_benchmark.py::test_max_time_min_rounds FAILED [ 14/226]
>tests/test_benchmark.py::test_max_time FAILED [ 15/226]
>tests/test_benchmark.py::test_bogus_max_time FAILED [ 16/226]
>tests/test_benchmark.py::test_pep418_timer FAILED [ 17/226]
>tests/test_benchmark.py::test_bad_save FAILED [ 18/226]
>tests/test_benchmark.py::test_bad_save_2 FAILED [ 19/226]
>tests/test_benchmark.py::test_bad_compare_fail FAILED [ 20/226]
>tests/test_benchmark.py::test_bad_rounds FAILED [ 21/226]
>tests/test_benchmark.py::test_bad_rounds_2 FAILED [ 22/226]
>tests/test_benchmark.py::test_compare FAILED [ 23/226]
>tests/test_benchmark.py::test_compare_last FAILED [ 24/226]
>tests/test_benchmark.py::test_compare_non_existing FAILED [ 25/226]
>tests/test_benchmark.py::test_compare_non_existing_verbose FAILED [ 26/226]
>tests/test_benchmark.py::test_compare_no_files FAILED [ 27/226]
>tests/test_benchmark.py::test_compare_no_files_verbose FAILED [ 28/226]
>tests/test_benchmark.py::test_compare_no_files_match FAILED [ 29/226]
>tests/test_benchmark.py::test_compare_no_files_match_verbose FAILED [ 30/226]
>tests/test_benchmark.py::test_verbose FAILED [ 31/226]
>tests/test_benchmark.py::test_save FAILED [ 32/226]
>tests/test_benchmark.py::test_save_extra_info FAILED [ 33/226]
>tests/test_benchmark.py::test_update_machine_info_hook_detection FAILED [ 34/226]
>tests/test_benchmark.py::test_histogram FAILED [ 35/226]
>tests/test_benchmark.py::test_autosave FAILED [ 36/226]
>tests/test_benchmark.py::test_bogus_min_time FAILED [ 37/226]
>tests/test_benchmark.py::test_disable_gc FAILED [ 38/226]
>tests/test_benchmark.py::test_custom_timer FAILED [ 39/226]
>tests/test_benchmark.py::test_bogus_timer FAILED [ 40/226]
>tests/test_benchmark.py::test_sort_by_mean FAILED [ 41/226]
>tests/test_benchmark.py::test_bogus_sort FAILED [ 42/226]
>tests/test_benchmark.py::test_xdist SKIPPED (could not import 'xdist': No module named 'xdist') [ 43/226]
>tests/test_benchmark.py::test_xdist_verbose SKIPPED (could not import 'xdist': No module named 'xdist') [ 44/226]
>tests/test_benchmark.py::test_cprofile FAILED [ 45/226]
>tests/test_benchmark.py::test_disabled_and_cprofile FAILED [ 46/226]
>tests/test_benchmark.py::test_abort_broken FAILED [ 47/226]
>tests/test_benchmark.py::test_basic FAILED [ 48/226]
>tests/test_benchmark.py::test_skip FAILED [ 49/226]
>tests/test_benchmark.py::test_disable FAILED [ 50/226]
>tests/test_benchmark.py::test_mark_selection FAILED [ 51/226]
>tests/test_benchmark.py::test_only_benchmarks FAILED [ 52/226]
>tests/test_benchmark.py::test_columns FAILED [ 53/226]
>tests/test_calibration.py::test_calibrate ERROR [ 54/226]
>tests/test_calibration.py::test_calibrate_fast ERROR [ 55/226]
>tests/test_calibration.py::test_calibrate_xfast ERROR [ 56/226]
>tests/test_calibration.py::test_calibrate_slow ERROR [ 57/226]
>tests/test_calibration.py::test_calibrate_stuck[True-0-1] ERROR [ 58/226]
>tests/test_calibration.py::test_calibrate_stuck[True-0-0.01] ERROR [ 59/226]
>tests/test_calibration.py::test_calibrate_stuck[True-0-1e-09] ERROR [ 60/226]
>tests/test_calibration.py::test_calibrate_stuck[True-0-1e-10] ERROR [ 61/226]
>tests/test_calibration.py::test_calibrate_stuck[True-0-1.000000000000001] ERROR [ 62/226]
>tests/test_calibration.py::test_calibrate_stuck[True-1-1] ERROR [ 63/226]
>tests/test_calibration.py::test_calibrate_stuck[True-1-0.01] ERROR [ 64/226]
>tests/test_calibration.py::test_calibrate_stuck[True-1-1e-09] ERROR [ 65/226]
>tests/test_calibration.py::test_calibrate_stuck[True-1-1e-10] ERROR [ 66/226]
>tests/test_calibration.py::test_calibrate_stuck[True-1-1.000000000000001] ERROR [ 67/226]
>tests/test_calibration.py::test_calibrate_stuck[True--1-1] ERROR [ 68/226]
>tests/test_calibration.py::test_calibrate_stuck[True--1-0.01] ERROR [ 69/226]
>tests/test_calibration.py::test_calibrate_stuck[True--1-1e-09] ERROR [ 70/226]
>tests/test_calibration.py::test_calibrate_stuck[True--1-1e-10] ERROR [ 71/226]
>tests/test_calibration.py::test_calibrate_stuck[True--1-1.000000000000001] ERROR [ 72/226]
>tests/test_calibration.py::test_calibrate_stuck[False-0-1] ERROR [ 73/226]
>tests/test_calibration.py::test_calibrate_stuck[False-0-0.01] ERROR [ 74/226]
>tests/test_calibration.py::test_calibrate_stuck[False-0-1e-09] ERROR [ 75/226]
>tests/test_calibration.py::test_calibrate_stuck[False-0-1e-10] ERROR [ 76/226]
>tests/test_calibration.py::test_calibrate_stuck[False-0-1.000000000000001] ERROR [ 77/226]
>tests/test_calibration.py::test_calibrate_stuck[False-1-1] ERROR [ 78/226]
>tests/test_calibration.py::test_calibrate_stuck[False-1-0.01] ERROR [ 79/226]
>tests/test_calibration.py::test_calibrate_stuck[False-1-1e-09] ERROR [ 80/226]
>tests/test_calibration.py::test_calibrate_stuck[False-1-1e-10] ERROR [ 81/226]
>tests/test_calibration.py::test_calibrate_stuck[False-1-1.000000000000001] ERROR [ 82/226]
>tests/test_calibration.py::test_calibrate_stuck[False--1-1] ERROR [ 83/226]
>tests/test_calibration.py::test_calibrate_stuck[False--1-0.01] ERROR [ 84/226]
>tests/test_calibration.py::test_calibrate_stuck[False--1-1e-09] ERROR [ 85/226]
>tests/test_calibration.py::test_calibrate_stuck[False--1-1e-10] ERROR [ 86/226]
>tests/test_calibration.py::test_calibrate_stuck[False--1-1.000000000000001] ERROR [ 87/226]
>tests/test_cli.py::test_list FAILED [ 88/226]
>tests/test_cli.py::test_compare[short-<lambda>] FAILED [ 89/226]
>tests/test_cli.py::test_compare[long-<lambda>] FAILED [ 90/226]
>tests/test_cli.py::test_compare[normal-<lambda>] FAILED [ 91/226]
>tests/test_cli.py::test_compare[trial-<lambda>] FAILED [ 92/226]
>tests/test_doctest.rst::test_doctest.rst PASSED [ 93/226]
>tests/test_elasticsearch_storage.py::test_handle_saving PASSED [ 94/226]
>tests/test_elasticsearch_storage.py::test_parse_with_no_creds PASSED [ 95/226]
>tests/test_elasticsearch_storage.py::test_parse_with_creds_in_first_host_of_url PASSED [ 96/226]
>tests/test_elasticsearch_storage.py::test_parse_with_creds_in_second_host_of_url PASSED [ 97/226]
>tests/test_elasticsearch_storage.py::test_parse_with_creds_in_netrc PASSED [ 98/226]
>tests/test_elasticsearch_storage.py::test_parse_url_creds_supersedes_netrc_creds PASSED [ 99/226]
>tests/test_elasticsearch_storage.py::test__mask_hosts PASSED [100/226]
>tests/test_normal.py::test_normal PASSED [101/226]
>tests/test_normal.py::test_fast ERROR [102/226]
>tests/test_normal.py::test_slow ERROR [103/226]
>tests/test_normal.py::test_slower ERROR [104/226]
>tests/test_normal.py::test_xfast ERROR [105/226]
>tests/test_normal.py::test_parametrized[0] ERROR [106/226]
>tests/test_normal.py::test_parametrized[1] ERROR [107/226]
>tests/test_normal.py::test_parametrized[2] ERROR [108/226]
>tests/test_normal.py::test_parametrized[3] ERROR [109/226]
>tests/test_normal.py::test_parametrized[4] ERROR [110/226]
>tests/test_pedantic.py::test_single ERROR [111/226]
>tests/test_pedantic.py::test_setup ERROR [112/226]
>tests/test_pedantic.py::test_setup_cprofile ERROR [113/226]
>tests/test_pedantic.py::test_args_kwargs ERROR [114/226]
>tests/test_pedantic.py::test_iterations ERROR [115/226]
>tests/test_pedantic.py::test_rounds_iterations ERROR [116/226]
>tests/test_pedantic.py::test_rounds ERROR [117/226]
>tests/test_pedantic.py::test_warmup_rounds ERROR [118/226]
>tests/test_pedantic.py::test_rounds_must_be_int[0] ERROR [119/226]
>tests/test_pedantic.py::test_rounds_must_be_int[x] ERROR [120/226]
>tests/test_pedantic.py::test_warmup_rounds_must_be_int[-15] ERROR [121/226]
>tests/test_pedantic.py::test_warmup_rounds_must_be_int[x] ERROR [122/226]
>tests/test_pedantic.py::test_setup_many_rounds ERROR [123/226]
>tests/test_pedantic.py::test_cant_use_both_args_and_setup_with_return ERROR [124/226]
>tests/test_pedantic.py::test_can_use_both_args_and_setup_without_return ERROR [125/226]
>tests/test_pedantic.py::test_cant_use_setup_with_many_iterations ERROR [126/226]
>tests/test_pedantic.py::test_iterations_must_be_positive_int[0] ERROR [127/226]
>tests/test_pedantic.py::test_iterations_must_be_positive_int[-1] ERROR [128/226]
>tests/test_pedantic.py::test_iterations_must_be_positive_int[asdf] ERROR [129/226]
>tests/test_sample.py::test_proto[SimpleProxy] ERROR [130/226]
>tests/test_sample.py::test_proto[CachedPropertyProxy] ERROR [131/226]
>tests/test_sample.py::test_proto[LocalsSimpleProxy] ERROR [132/226]
>tests/test_sample.py::test_proto[LocalsCachedPropertyProxy] ERROR [133/226]
>tests/test_skip.py::test_skip ERROR [134/226]
>tests/test_stats.py::test_1 PASSED [135/226]
>tests/test_stats.py::test_2 PASSED [136/226]
>tests/test_stats.py::test_single_item PASSED [137/226]
>tests/test_stats.py::test_length[1] PASSED [138/226]
>tests/test_stats.py::test_length[2] PASSED [139/226]
>tests/test_stats.py::test_length[3] PASSED [140/226]
>tests/test_stats.py::test_length[4] PASSED [141/226]
>tests/test_stats.py::test_length[5] PASSED [142/226]
>tests/test_stats.py::test_length[6] PASSED [143/226]
>tests/test_stats.py::test_length[7] PASSED [144/226]
>tests/test_stats.py::test_length[8] PASSED [145/226]
>tests/test_stats.py::test_length[9] PASSED [146/226]
>tests/test_stats.py::test_iqr PASSED [147/226]
>tests/test_stats.py::test_ops PASSED [148/226]
>tests/test_storage.py::test_rendering[short] PASSED [149/226]
>tests/test_storage.py::test_rendering[normal] PASSED [150/226]
>tests/test_storage.py::test_rendering[long] PASSED [151/226]
>tests/test_storage.py::test_rendering[trial] PASSED [152/226]
>tests/test_storage.py::test_regression_checks[short] PASSED [153/226]
>tests/test_storage.py::test_regression_checks[normal] PASSED [154/226]
>tests/test_storage.py::test_regression_checks[long] PASSED [155/226]
>tests/test_storage.py::test_regression_checks[trial] PASSED [156/226]
>tests/test_storage.py::test_regression_checks_inf[short] PASSED [157/226]
>tests/test_storage.py::test_regression_checks_inf[normal] PASSED [158/226]
>tests/test_storage.py::test_regression_checks_inf[long] PASSED [159/226]
>tests/test_storage.py::test_regression_checks_inf[trial] PASSED [160/226]
>tests/test_storage.py::test_compare_1[short] PASSED [161/226]
>tests/test_storage.py::test_compare_1[normal] PASSED [162/226]
>tests/test_storage.py::test_compare_1[long] PASSED [163/226]
>tests/test_storage.py::test_compare_1[trial] PASSED [164/226]
>tests/test_storage.py::test_compare_2[short] PASSED [165/226]
>tests/test_storage.py::test_compare_2[normal] PASSED [166/226]
>tests/test_storage.py::test_compare_2[long] PASSED [167/226]
>tests/test_storage.py::test_compare_2[trial] PASSED [168/226]
>tests/test_storage.py::test_save_json[short] PASSED [169/226]
>tests/test_storage.py::test_save_json[normal] PASSED [170/226]
>tests/test_storage.py::test_save_json[long] PASSED [171/226]
>tests/test_storage.py::test_save_json[trial] PASSED [172/226]
>tests/test_storage.py::test_save_with_name[short] PASSED [173/226]
>tests/test_storage.py::test_save_with_name[normal] PASSED [174/226]
>tests/test_storage.py::test_save_with_name[long] PASSED [175/226]
>tests/test_storage.py::test_save_with_name[trial] PASSED [176/226]
>tests/test_storage.py::test_save_no_name[short] PASSED [177/226]
>tests/test_storage.py::test_save_no_name[normal] PASSED [178/226]
>tests/test_storage.py::test_save_no_name[long] PASSED [179/226]
>tests/test_storage.py::test_save_no_name[trial] PASSED [180/226]
>tests/test_storage.py::test_save_with_error[short] PASSED [181/226]
>tests/test_storage.py::test_save_with_error[normal] PASSED [182/226]
>tests/test_storage.py::test_save_with_error[long] PASSED [183/226]
>tests/test_storage.py::test_save_with_error[trial] PASSED [184/226]
>tests/test_storage.py::test_autosave[short] PASSED [185/226]
>tests/test_storage.py::test_autosave[normal] PASSED [186/226]
>tests/test_storage.py::test_autosave[long] PASSED [187/226]
>tests/test_storage.py::test_autosave[trial] PASSED [188/226]
>tests/test_utils.py::test_clonefunc[<lambda>] PASSED [189/226]
>tests/test_utils.py::test_clonefunc[f2] PASSED [190/226]
>tests/test_utils.py::test_clonefunc_not_function PASSED [191/226]
>tests/test_utils.py::test_get_commit_info[git-True] PASSED [192/226]
>tests/test_utils.py::test_get_commit_info[git-False] PASSED [193/226]
>tests/test_utils.py::test_get_commit_info[hg-True] SKIPPED (%r not availabe on $PATH) [194/226]
>tests/test_utils.py::test_get_commit_info[hg-False] SKIPPED (%r not availabe on $PATH) [195/226]
>tests/test_utils.py::test_missing_scm_bins[git-True] PASSED [196/226]
>tests/test_utils.py::test_missing_scm_bins[git-False] PASSED [197/226]
>tests/test_utils.py::test_missing_scm_bins[hg-True] SKIPPED (%r not availabe on $PATH) [198/226]
>tests/test_utils.py::test_missing_scm_bins[hg-False] SKIPPED (%r not availabe on $PATH) [199/226]
>tests/test_utils.py::test_get_branch_info[git] PASSED [200/226]
>tests/test_utils.py::test_get_branch_info[hg] SKIPPED (%r not availabe on $PATH) [201/226]
>tests/test_utils.py::test_no_branch_info PASSED [202/226]
>tests/test_utils.py::test_commit_info_error PASSED [203/226]
>tests/test_utils.py::test_parse_warmup PASSED [204/226]
>tests/test_utils.py::test_parse_columns PASSED [205/226]
>tests/test_utils.py::test_get_project_name[False-None] PASSED [206/226]
>tests/test_utils.py::test_get_project_name[False-git] PASSED [207/226]
>tests/test_utils.py::test_get_project_name[False-hg] SKIPPED (%r not availabe on $PATH) [208/226]
>tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo-None] PASSED [209/226]
>tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo-git] PASSED [210/226]
>tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo-hg] SKIPPED (%r not availabe on $PATH) [211/226]
>tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo.git-None] PASSED [212/226]
>tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo.git-git] PASSED [213/226]
>tests/test_utils.py::test_get_project_name[https://example.com/pytest_benchmark_repo.git-hg] SKIPPED (%r not availabe on $PATH) [214/226]
>tests/test_utils.py::test_get_project_name[c:\\foo\\bar\\pytest_benchmark_repo.gitfoo@example.com:pytest_benchmark_repo.git-None] PASSED [215/226]
>tests/test_utils.py::test_get_project_name[c:\\foo\\bar\\pytest_benchmark_repo.gitfoo@example.com:pytest_benchmark_repo.git-git] PASSED [216/226]
>tests/test_utils.py::test_get_project_name[c:\\foo\\bar\\pytest_benchmark_repo.gitfoo@example.com:pytest_benchmark_repo.git-hg] SKIPPED (%r not availabe on $PATH) [217/226]
>tests/test_utils.py::test_get_project_name_broken[git] PASSED [218/226]
>tests/test_utils.py::test_get_project_name_broken[hg] PASSED [219/226]
>tests/test_utils.py::test_get_project_name_fallback PASSED [220/226]
>tests/test_utils.py::test_get_project_name_fallback_broken_hgrc PASSED [221/226]
>tests/test_utils.py::test_parse_elasticsearch_storage PASSED [222/226]
>tests/test_with_testcase.py::TerribleTerribleWayToWriteTests::test_foo ERROR [223/226]
>tests/test_with_testcase.py::TerribleTerribleWayToWritePatchTests::test_foo2 ERROR [224/226]
>tests/test_with_weaver.py::test_weave_fixture ERROR [225/226]
>tests/test_with_weaver.py::test_weave_method ERROR [226/226]
>
>==================================== ERRORS ====================================
>_______________________ ERROR at setup of test_calibrate _______________________
>file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 13
> @pytest.mark.benchmark(warmup=True, warmup_iterations=10 ** 8, max_time=10)
> def test_calibrate(benchmark):
>E fixture 'benchmark' not found
>> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave
>> use 'pytest --fixtures [testpath]' for help on them.
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:13 >____________________ ERROR at setup of test_calibrate_fast _____________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 18 > @pytest.mark.benchmark(warmup=True, warmup_iterations=10 ** 8, max_time=10) > def test_calibrate_fast(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:18 >____________________ ERROR at setup of test_calibrate_xfast ____________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 23 > @pytest.mark.benchmark(warmup=True, warmup_iterations=10 ** 8, max_time=10) > def test_calibrate_xfast(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:23 >____________________ ERROR at setup of test_calibrate_slow _____________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 28 > @pytest.mark.benchmark(warmup=True, warmup_iterations=10 ** 8, max_time=10) > def test_calibrate_slow(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:28 >_______________ ERROR at setup of test_calibrate_stuck[True-0-1] _______________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_____________ ERROR at setup of test_calibrate_stuck[True-0-0.01] ______________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_____________ ERROR at setup of test_calibrate_stuck[True-0-1e-09] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_____________ ERROR at setup of test_calibrate_stuck[True-0-1e-10] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_______ ERROR at setup of test_calibrate_stuck[True-0-1.000000000000001] _______ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_______________ ERROR at setup of test_calibrate_stuck[True-1-1] _______________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_____________ ERROR at setup of test_calibrate_stuck[True-1-0.01] ______________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_____________ ERROR at setup of test_calibrate_stuck[True-1-1e-09] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_____________ ERROR at setup of test_calibrate_stuck[True-1-1e-10] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_______ ERROR at setup of test_calibrate_stuck[True-1-1.000000000000001] _______ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >______________ ERROR at setup of test_calibrate_stuck[True--1-1] _______________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_____________ ERROR at setup of test_calibrate_stuck[True--1-0.01] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >____________ ERROR at setup of test_calibrate_stuck[True--1-1e-09] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >____________ ERROR at setup of test_calibrate_stuck[True--1-1e-10] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >______ ERROR at setup of test_calibrate_stuck[True--1-1.000000000000001] _______ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >______________ ERROR at setup of test_calibrate_stuck[False-0-1] _______________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_____________ ERROR at setup of test_calibrate_stuck[False-0-0.01] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >____________ ERROR at setup of test_calibrate_stuck[False-0-1e-09] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >____________ ERROR at setup of test_calibrate_stuck[False-0-1e-10] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >______ ERROR at setup of test_calibrate_stuck[False-0-1.000000000000001] _______ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >______________ ERROR at setup of test_calibrate_stuck[False-1-1] _______________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_____________ ERROR at setup of test_calibrate_stuck[False-1-0.01] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >____________ ERROR at setup of test_calibrate_stuck[False-1-1e-09] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >____________ ERROR at setup of test_calibrate_stuck[False-1-1e-10] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >______ ERROR at setup of test_calibrate_stuck[False-1-1.000000000000001] _______ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >______________ ERROR at setup of test_calibrate_stuck[False--1-1] ______________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >____________ ERROR at setup of test_calibrate_stuck[False--1-0.01] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >____________ ERROR at setup of test_calibrate_stuck[False--1-1e-09] ____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >____________ ERROR at setup of test_calibrate_stuck[False--1-1e-10] ____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >______ ERROR at setup of test_calibrate_stuck[False--1-1.000000000000001] ______ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py, line 47 > @pytest.mark.parametrize("minimum", [1, 0.01, 0.000000001, 0.0000000001, 1.000000000000001]) > @pytest.mark.parametrize("skew_ratio", [0, 1, -1]) > @pytest.mark.parametrize("additive", [True, False]) > @pytest.mark.benchmark(max_time=0, min_rounds=1, calibration_precision=100) > def test_calibrate_stuck(benchmark, minimum, additive, skew_ratio): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_calibration.py:47 >_________________________ ERROR at setup of test_fast __________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py, line 15 > @pytest.mark.skipif('sys.platform == "win32"') > def test_fast(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, foo, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py:15 >_________________________ ERROR at setup of test_slow __________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py, line 26 > def test_slow(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, foo, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py:26 >________________________ ERROR at setup of test_slower _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py, line 30 > def test_slower(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, foo, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py:30 >_________________________ ERROR at setup of test_xfast _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py, line 34 > @pytest.mark.benchmark(min_rounds=2, timer=time.time, max_time=0.01) > def test_xfast(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, foo, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py:34 >____________________ ERROR at setup of test_parametrized[0] ____________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py, line 44 > @pytest.mark.skipif('sys.platform == "win32"') > def test_parametrized(benchmark, foo): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, foo, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py:44 >____________________ ERROR at setup of test_parametrized[1] ____________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py, line 44 > @pytest.mark.skipif('sys.platform == "win32"') > def test_parametrized(benchmark, foo): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, foo, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py:44 >____________________ ERROR at setup of test_parametrized[2] ____________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py, line 44 > @pytest.mark.skipif('sys.platform == "win32"') > def test_parametrized(benchmark, foo): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, foo, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py:44 >____________________ ERROR at setup of test_parametrized[3] ____________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py, line 44 > @pytest.mark.skipif('sys.platform == "win32"') > def test_parametrized(benchmark, foo): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, foo, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py:44 >____________________ ERROR at setup of test_parametrized[4] ____________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py, line 44 > @pytest.mark.skipif('sys.platform == "win32"') > def test_parametrized(benchmark, foo): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, foo, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_normal.py:44 >________________________ ERROR at setup of test_single _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 6 > def test_single(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:6 >_________________________ ERROR at setup of test_setup _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 12 > def test_setup(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:12 >____________________ ERROR at setup of test_setup_cprofile _____________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 25 > @pytest.mark.benchmark(cprofile=True) > def test_setup_cprofile(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:25 >______________________ ERROR at setup of test_args_kwargs ______________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 39 > def test_args_kwargs(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:39 >______________________ ERROR at setup of test_iterations _______________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 49 > def test_iterations(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:49 >___________________ ERROR at setup of test_rounds_iterations ___________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 56 > def test_rounds_iterations(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:56 >________________________ ERROR at setup of test_rounds _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 63 > def test_rounds(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:63 >_____________________ ERROR at setup of test_warmup_rounds _____________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 70 > def test_warmup_rounds(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:70 >_________________ ERROR at setup of test_rounds_must_be_int[0] _________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 77 > @mark.parametrize("value", [0, "x"]) > def test_rounds_must_be_int(benchmark, value): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:77 >_________________ ERROR at setup of test_rounds_must_be_int[x] _________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 77 > @mark.parametrize("value", [0, "x"]) > def test_rounds_must_be_int(benchmark, value): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:77 >____________ ERROR at setup of test_warmup_rounds_must_be_int[-15] _____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 84 > @mark.parametrize("value", [-15, "x"]) > def test_warmup_rounds_must_be_int(benchmark, value): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:84 >_____________ ERROR at setup of test_warmup_rounds_must_be_int[x] ______________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 84 > @mark.parametrize("value", [-15, "x"]) > def test_warmup_rounds_must_be_int(benchmark, value): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:84 >___________________ ERROR at setup of test_setup_many_rounds ___________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 91 > def test_setup_many_rounds(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:91 >_______ ERROR at setup of test_cant_use_both_args_and_setup_with_return ________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 104 > def test_cant_use_both_args_and_setup_with_return(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:104 >______ ERROR at setup of test_can_use_both_args_and_setup_without_return _______ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 117 > def test_can_use_both_args_and_setup_without_return(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:117 >__________ ERROR at setup of test_cant_use_setup_with_many_iterations __________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 127 > def test_cant_use_setup_with_many_iterations(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:127 >__________ ERROR at setup of test_iterations_must_be_positive_int[0] ___________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 131 > @mark.parametrize("value", [0, -1, "asdf"]) > def test_iterations_must_be_positive_int(benchmark, value): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:131 >__________ ERROR at setup of test_iterations_must_be_positive_int[-1] __________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 131 > @mark.parametrize("value", [0, -1, "asdf"]) > def test_iterations_must_be_positive_int(benchmark, value): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:131 >_________ ERROR at setup of test_iterations_must_be_positive_int[asdf] _________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py, line 131 > @mark.parametrize("value", [0, -1, "asdf"]) > def test_iterations_must_be_positive_int(benchmark, value): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_pedantic.py:131 >__________________ ERROR at setup of test_proto[SimpleProxy] ___________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_sample.py, line 68 > def test_proto(benchmark, impl): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, impl, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_sample.py:68 >______________ ERROR at setup of test_proto[CachedPropertyProxy] _______________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_sample.py, line 68 > def test_proto(benchmark, impl): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, impl, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_sample.py:68 >_______________ ERROR at setup of test_proto[LocalsSimpleProxy] ________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_sample.py, line 68 > def test_proto(benchmark, impl): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, impl, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_sample.py:68 >___________ ERROR at setup of test_proto[LocalsCachedPropertyProxy] ____________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_sample.py, line 68 > def test_proto(benchmark, impl): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, impl, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_sample.py:68 >_________________________ ERROR at setup of test_skip __________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_skip.py, line 4 > def test_skip(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_skip.py:4 >__________ ERROR at setup of TerribleTerribleWayToWriteTests.test_foo __________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_with_testcase.py, line 12 > def test_foo(self): >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_with_testcase.py, line 8 > @pytest.fixture(autouse=True) > def setupBenchmark(self, benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, _unittest_setUpClass_fixture_TerribleTerribleWayToWriteTests, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, setupBenchmark, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_with_testcase.py:8 >_______ ERROR at setup of TerribleTerribleWayToWritePatchTests.test_foo2 _______ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_with_testcase.py, line 21 > def test_foo2(self): >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_with_testcase.py, line 17 > @pytest.fixture(autouse=True) > def setupBenchmark(self, benchmark_weave): >E fixture 'benchmark_weave' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, _unittest_setUpClass_fixture_TerribleTerribleWayToWritePatchTests, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, setupBenchmark, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
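[Note on the setup errors above: every one of them follows the same pattern. The test requests the `benchmark` (or `benchmark_weave`) fixture, pytest's "available fixtures" list does not contain it, and the captured session headers later in this log show only "plugins: aspectlib-1.5.2". That is consistent with the pytest-benchmark plugin itself not being importable/registered in the test environment, rather than with a defect in the individual tests. As a point of reference, a minimal sketch of the kind of test these errors come from is shown below; the file and test names are hypothetical, and it assumes pytest-benchmark is installed and auto-loaded via its "pytest11" entry point, which is what normally makes the `benchmark` fixture exist at all.]

# sketch_test_benchmark.py -- hypothetical illustration, not part of this suite
import time

def test_sleep_a_little(benchmark):
    # The `benchmark` fixture is callable: benchmark(fn, *args, **kwargs)
    # runs fn repeatedly, records timings, and returns fn's return value.
    result = benchmark(time.sleep, 0.0001)
    assert result is None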
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_with_testcase.py:17 >_____________________ ERROR at setup of test_weave_fixture _____________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_with_weaver.py, line 17 > @pytest.mark.benchmark(max_time=0.001) > def test_weave_fixture(benchmark_weave): >E fixture 'benchmark_weave' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_with_weaver.py:17 >_____________________ ERROR at setup of test_weave_method ______________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_with_weaver.py, line 24 > @pytest.mark.benchmark(max_time=0.001) > def test_weave_method(benchmark): >E fixture 'benchmark' not found >> available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_with_weaver.py:24 >=================================== FAILURES =================================== >__________________________________ test_help ___________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:12: in test_help > result.stdout.fnmatch_lines([ >E Failed: fnmatch: '*' >E with: 'usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E fnmatch: '*' >E with: '' >E nomatch: 'benchmark:' >E and: 'positional arguments:' >E and: ' file_or_dir' >E and: '' >E and: 'general:' >E and: ' -k EXPRESSION only run tests which match the given substring' >E and: ' expression. An expression is a python evaluatable' >E and: ' expression where all names are substring-matched against' >E and: ' test names and their parent classes. Example: -k' >E and: " 'test_method or test_other' matches all test functions" >E and: " and classes whose name contains 'test_method' or" >E and: " 'test_other', while -k 'not test_method' matches those" >E and: " that don't contain 'test_method' in their names. -k 'not" >E and: " test_method and not test_other' will eliminate the" >E and: ' matches. Additionally keywords are matched to classes' >E and: ' and functions containing extra names in their' >E and: " 'extra_keyword_matches' set, as well as functions which" >E and: ' have names assigned directly to them. The matching is' >E and: ' case-insensitive.' >E and: ' -m MARKEXPR only run tests matching given mark expression.' >E and: " For example: -m 'mark1 and not mark2'." >E and: ' --markers show markers (builtin, plugin and per-project ones).' >E and: ' -x, --exitfirst exit instantly on first error or failed test.' 
>E and: ' --fixtures, --funcargs' >E and: ' show available fixtures, sorted by plugin appearance' >E and: " (fixtures with leading '_' are only shown with '-v')" >E and: ' --fixtures-per-test show fixtures per test' >E and: ' --pdb start the interactive Python debugger on errors or' >E and: ' KeyboardInterrupt.' >E and: ' --pdbcls=modulename:classname' >E and: ' specify a custom interactive Python debugger for use' >E and: ' with --pdb.For example:' >E and: ' --pdbcls=IPython.terminal.debugger:TerminalPdb' >E and: ' --trace Immediately break when running each test.' >E and: ' --capture=method per-test capturing method: one of fd|sys|no|tee-sys.' >E and: ' -s shortcut for --capture=no.' >E and: ' --runxfail report the results of xfail tests as if they were not' >E and: ' marked' >E and: ' --lf, --last-failed rerun only the tests that failed at the last run (or all' >E and: ' if none failed)' >E and: ' --ff, --failed-first run all tests, but run the last failures first.' >E and: ' This may re-order tests and thus lead to repeated' >E and: ' fixture setup/teardown.' >E and: ' --nf, --new-first run tests from new files first, then the rest of the' >E and: ' tests sorted by file mtime' >E and: ' --cache-show=[CACHESHOW]' >E and: " show cache contents, don't perform collection or tests." >E and: " Optional argument: glob (default: '*')." >E and: ' --cache-clear remove all cache contents at start of test run.' >E and: ' --lfnf={all,none}, --last-failed-no-failures={all,none}' >E and: ' which tests to run with no previously (known) failures.' >E and: ' --sw, --stepwise exit on test failure and continue from last failing test' >E and: ' next time' >E and: ' --sw-skip, --stepwise-skip' >E and: ' ignore the first failing test but stop on the next' >E and: ' failing test.' >E and: ' implicitly enables --stepwise.' >E and: '' >E and: 'reporting:' >E and: ' --durations=N show N slowest setup/test durations (N=0 for all).' >E and: ' --durations-min=N Minimal duration in seconds for inclusion in slowest' >E and: ' list. Default 0.005' >E and: ' -v, --verbose increase verbosity.' >E and: ' --no-header disable header' >E and: ' --no-summary disable summary' >E and: ' -q, --quiet decrease verbosity.' >E and: ' --verbosity=VERBOSE set verbosity. Default is 0.' >E and: ' -r chars show extra test summary info as specified by chars:' >E and: ' (f)ailed, (E)rror, (s)kipped, (x)failed, (X)passed,' >E and: ' (p)assed, (P)assed with output, (a)ll except passed' >E and: ' (p/P), or (A)ll. (w)arnings are enabled by default (see' >E and: " --disable-warnings), 'N' can be used to reset the list." >E and: " (default: 'fE')." >E and: ' --disable-warnings, --disable-pytest-warnings' >E and: ' disable warnings summary' >E and: ' -l, --showlocals show locals in tracebacks (disabled by default).' >E and: ' --tb=style traceback print mode (auto/long/short/line/native/no).' >E and: ' --show-capture={no,stdout,stderr,log,all}' >E and: ' Controls how captured stdout/stderr/log is shown on' >E and: " failed tests. Default is 'all'." >E and: " --full-trace don't cut any tracebacks (default is to cut)." >E and: ' --color=color color terminal output (yes/no/auto).' >E and: ' --code-highlight={yes,no}' >E and: ' Whether code should be highlighted (only if --color is' >E and: ' also enabled)' >E and: ' --pastebin=mode send failed|all info to bpaste.net pastebin service.' >E and: ' --junit-xml=path create junit-xml style report file at given path.' 
>E and: ' --junit-prefix=str prepend prefix to classnames in junit-xml output' >E and: '' >E and: 'pytest-warnings:' >E and: ' -W PYTHONWARNINGS, --pythonwarnings=PYTHONWARNINGS' >E and: ' set which warnings to report, see -W option of python' >E and: ' itself.' >E and: ' --maxfail=num exit after first num failures or errors.' >E and: ' --strict-config any warnings encountered while parsing the `pytest`' >E and: ' section of the configuration file raise errors.' >E and: ' --strict-markers markers not registered in the `markers` section of the' >E and: ' configuration file raise errors.' >E and: ' --strict (deprecated) alias to --strict-markers.' >E and: ' -c file load configuration from `file` instead of trying to' >E and: ' locate one of the implicit configuration files.' >E and: ' --continue-on-collection-errors' >E and: ' Force test execution even if collection errors occur.' >E and: ' --rootdir=ROOTDIR Define root directory for tests. Can be relative path:' >E and: " 'root_dir', './root_dir', 'root_dir/another_dir/';" >E and: " absolute path: '/home/user/root_dir'; path with" >E and: " variables: '$HOME/root_dir'." >E and: '' >E and: 'collection:' >E and: " --collect-only, --co only collect tests, don't execute them." >E and: ' --pyargs try to interpret all arguments as python packages.' >E and: ' --ignore=path ignore path during collection (multi-allowed).' >E and: ' --ignore-glob=path ignore path pattern during collection (multi-allowed).' >E and: ' --deselect=nodeid_prefix' >E and: ' deselect item (via node id prefix) during collection' >E and: ' (multi-allowed).' >E and: " --confcutdir=dir only load conftest.py's relative to specified dir." >E and: " --noconftest Don't load any conftest.py files." >E and: ' --keep-duplicates Keep duplicate tests.' >E and: ' --collect-in-virtualenv' >E and: " Don't ignore tests in a local virtualenv directory" >E and: ' --import-mode={prepend,append,importlib}' >E and: ' prepend/append to sys.path when importing test modules' >E and: ' and conftest files, default is to prepend.' >E and: ' --doctest-modules run doctests in all .py modules' >E and: ' --doctest-report={none,cdiff,ndiff,udiff,only_first_failure}' >E and: ' choose another output format for diffs on doctest' >E and: ' failure' >E and: ' --doctest-glob=pat doctests file matching pattern, default: test*.txt' >E and: ' --doctest-ignore-import-errors' >E and: ' ignore doctest ImportErrors' >E and: ' --doctest-continue-on-failure' >E and: ' for a given doctest, continue to run after the first' >E and: ' failure' >E and: '' >E and: 'test session debugging and configuration:' >E and: ' --basetemp=dir base temporary directory for this test run.(warning:' >E and: ' this directory is removed if it exists)' >E and: ' -V, --version display pytest version and information about plugins.' >E and: ' When given twice, also display information about' >E and: ' plugins.' >E and: ' -h, --help show help message and configuration info' >E and: ' -p name early-load given plugin module name or entry point' >E and: ' (multi-allowed).' >E and: ' To avoid loading of plugins, use the `no:` prefix, e.g.' >E and: ' `no:doctest`.' >E and: ' --trace-config trace considerations of conftest.py files.' >E and: ' --debug=[DEBUG_FILE_NAME]' >E and: ' store internal tracing debug information in this log' >E and: ' file.' >E and: " This file is opened with 'w' and truncated as a result," >E and: ' care advised.' >E and: " Defaults to 'pytestdebug.log'." 
>E and: ' -o OVERRIDE_INI, --override-ini=OVERRIDE_INI' >E and: ' override ini option with "option=value" style, e.g. `-o' >E and: ' xfail_strict=True -o cache_dir=cache`.' >E and: ' --assert=MODE Control assertion debugging tools.' >E and: " 'plain' performs no assertion debugging." >E and: " 'rewrite' (the default) rewrites assert statements in" >E and: ' test modules on import to provide assert expression' >E and: ' information.' >E and: ' --setup-only only setup fixtures, do not execute tests.' >E and: ' --setup-show show setup of fixtures while executing tests.' >E and: " --setup-plan show what fixtures and tests would be executed but don't" >E and: ' execute anything.' >E and: '' >E and: 'logging:' >E and: ' --log-level=LEVEL level of messages to catch/display.' >E and: ' Not set by default, so it depends on the root/parent log' >E and: ' handler\'s effective level, where it is "WARNING" by' >E and: ' default.' >E and: ' --log-format=LOG_FORMAT' >E and: ' log format as used by the logging module.' >E and: ' --log-date-format=LOG_DATE_FORMAT' >E and: ' log date format as used by the logging module.' >E and: ' --log-cli-level=LOG_CLI_LEVEL' >E and: ' cli logging level.' >E and: ' --log-cli-format=LOG_CLI_FORMAT' >E and: ' log format as used by the logging module.' >E and: ' --log-cli-date-format=LOG_CLI_DATE_FORMAT' >E and: ' log date format as used by the logging module.' >E and: ' --log-file=LOG_FILE path to a file when logging will be written to.' >E and: ' --log-file-level=LOG_FILE_LEVEL' >E and: ' log file logging level.' >E and: ' --log-file-format=LOG_FILE_FORMAT' >E and: ' log format as used by the logging module.' >E and: ' --log-file-date-format=LOG_FILE_DATE_FORMAT' >E and: ' log date format as used by the logging module.' >E and: ' --log-auto-indent=LOG_AUTO_INDENT' >E and: ' Auto-indent multiline messages passed to the logging' >E and: ' module. Accepts true|on, false|off or an integer.' >E and: '' >E and: '[pytest] ini-options in the first pytest.ini|tox.ini|setup.cfg file found:' >E and: '' >E and: ' markers (linelist): markers for test functions' >E and: ' empty_parameter_set_mark (string):' >E and: ' default marker for empty parametersets' >E and: ' norecursedirs (args): directory patterns to avoid for recursion' >E and: ' testpaths (args): directories to search for tests when no files or' >E and: ' directories are given in the command line.' >E and: ' filterwarnings (linelist):' >E and: ' Each line specifies a pattern for' >E and: ' warnings.filterwarnings. Processed after' >E and: ' -W/--pythonwarnings.' >E and: ' usefixtures (args): list of default fixtures to be used with this project' >E and: ' python_files (args): glob-style file patterns for Python test module' >E and: ' discovery' >E and: ' python_classes (args):' >E and: ' prefixes or glob names for Python test class discovery' >E and: ' python_functions (args):' >E and: ' prefixes or glob names for Python test function and' >E and: ' method discovery' >E and: ' disable_test_id_escaping_and_forfeit_all_rights_to_community_support (bool):' >E and: ' disable string escape non-ascii characters, might cause' >E and: ' unwanted side effects(use at your own risk)' >E and: ' console_output_style (string):' >E and: ' console output: "classic", or with additional progress' >E and: ' information ("progress" (percentage) | "count").' 
>E and: ' xfail_strict (bool): default for the strict parameter of xfail markers when' >E and: ' not given explicitly (default: False)' >E and: ' enable_assertion_pass_hook (bool):' >E and: ' Enables the pytest_assertion_pass hook.Make sure to' >E and: ' delete any previously generated pyc cache files.' >E and: ' junit_suite_name (string):' >E and: ' Test suite name for JUnit report' >E and: ' junit_logging (string):' >E and: ' Write captured log messages to JUnit report: one of' >E and: ' no|log|system-out|system-err|out-err|all' >E and: ' junit_log_passing_tests (bool):' >E and: ' Capture log information for passing tests to JUnit' >E and: ' report:' >E and: ' junit_duration_report (string):' >E and: ' Duration time to report: one of total|call' >E and: ' junit_family (string):' >E and: ' Emit XML for schema: one of legacy|xunit1|xunit2' >E and: ' doctest_optionflags (args):' >E and: ' option flags for doctests' >E and: ' doctest_encoding (string):' >E and: ' encoding used for doctest files' >E and: ' cache_dir (string): cache directory path.' >E and: ' log_level (string): default value for --log-level' >E and: ' log_format (string): default value for --log-format' >E and: ' log_date_format (string):' >E and: ' default value for --log-date-format' >E and: ' log_cli (bool): enable log display during test run (also known as "live' >E and: ' logging").' >E and: ' log_cli_level (string):' >E and: ' default value for --log-cli-level' >E and: ' log_cli_format (string):' >E and: ' default value for --log-cli-format' >E and: ' log_cli_date_format (string):' >E and: ' default value for --log-cli-date-format' >E and: ' log_file (string): default value for --log-file' >E and: ' log_file_level (string):' >E and: ' default value for --log-file-level' >E and: ' log_file_format (string):' >E and: ' default value for --log-file-format' >E and: ' log_file_date_format (string):' >E and: ' default value for --log-file-date-format' >E and: ' log_auto_indent (string):' >E and: ' default value for --log-auto-indent' >E and: ' pythonpath (paths): Add paths to sys.path' >E and: ' faulthandler_timeout (string):' >E and: ' Dump the traceback of all threads if a test takes more' >E and: ' than TIMEOUT seconds to finish.' 
>E and: ' addopts (args): extra command line options' >E and: ' minversion (string): minimally required pytest version' >E and: ' required_plugins (args):' >E and: ' plugins that must be present for pytest to run' >E and: '' >E and: 'environment variables:' >E and: ' PYTEST_ADDOPTS extra command line options' >E and: ' PYTEST_PLUGINS comma-separated plugins to load during startup' >E and: ' PYTEST_DISABLE_PLUGIN_AUTOLOAD set to disable plugin auto-loading' >E and: " PYTEST_DEBUG set to enable debug tracing of pytest's internals" >E and: '' >E and: '' >E and: 'to see available markers type: pytest --markers' >E and: 'to see available fixtures type: pytest --fixtures' >E and: "(shown according to specified file_or_dir or current dir if not specified; fixtures with leading '_' are only shown with the '-v' option" >E remains unmatched: 'benchmark:' > result = <RunResult ret=ExitCode.OK len(stdout.lines)=281 len(stderr.lines)=0 duration=0.54s> > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_help0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_help0/runpytest-0 --help > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_help0 >usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] > >positional arguments: > file_or_dir > >general: > -k EXPRESSION only run tests which match the given substring > expression. An expression is a python evaluatable > expression where all names are substring-matched against > test names and their parent classes. Example: -k > 'test_method or test_other' matches all test functions > and classes whose name contains 'test_method' or > 'test_other', while -k 'not test_method' matches those > that don't contain 'test_method' in their names. -k 'not > test_method and not test_other' will eliminate the > matches. Additionally keywords are matched to classes > and functions containing extra names in their > 'extra_keyword_matches' set, as well as functions which > have names assigned directly to them. The matching is > case-insensitive. > -m MARKEXPR only run tests matching given mark expression. > For example: -m 'mark1 and not mark2'. > --markers show markers (builtin, plugin and per-project ones). > -x, --exitfirst exit instantly on first error or failed test. > --fixtures, --funcargs > show available fixtures, sorted by plugin appearance > (fixtures with leading '_' are only shown with '-v') > --fixtures-per-test show fixtures per test > --pdb start the interactive Python debugger on errors or > KeyboardInterrupt. > --pdbcls=modulename:classname > specify a custom interactive Python debugger for use > with --pdb.For example: > --pdbcls=IPython.terminal.debugger:TerminalPdb > --trace Immediately break when running each test. > --capture=method per-test capturing method: one of fd|sys|no|tee-sys. > -s shortcut for --capture=no. > --runxfail report the results of xfail tests as if they were not > marked > --lf, --last-failed rerun only the tests that failed at the last run (or all > if none failed) > --ff, --failed-first run all tests, but run the last failures first. > This may re-order tests and thus lead to repeated > fixture setup/teardown. 
> --nf, --new-first run tests from new files first, then the rest of the > tests sorted by file mtime > --cache-show=[CACHESHOW] > show cache contents, don't perform collection or tests. > Optional argument: glob (default: '*'). > --cache-clear remove all cache contents at start of test run. > --lfnf={all,none}, --last-failed-no-failures={all,none} > which tests to run with no previously (known) failures. > --sw, --stepwise exit on test failure and continue from last failing test > next time > --sw-skip, --stepwise-skip > ignore the first failing test but stop on the next > failing test. > implicitly enables --stepwise. > >reporting: > --durations=N show N slowest setup/test durations (N=0 for all). > --durations-min=N Minimal duration in seconds for inclusion in slowest > list. Default 0.005 > -v, --verbose increase verbosity. > --no-header disable header > --no-summary disable summary > -q, --quiet decrease verbosity. > --verbosity=VERBOSE set verbosity. Default is 0. > -r chars show extra test summary info as specified by chars: > (f)ailed, (E)rror, (s)kipped, (x)failed, (X)passed, > (p)assed, (P)assed with output, (a)ll except passed > (p/P), or (A)ll. (w)arnings are enabled by default (see > --disable-warnings), 'N' can be used to reset the list. > (default: 'fE'). > --disable-warnings, --disable-pytest-warnings > disable warnings summary > -l, --showlocals show locals in tracebacks (disabled by default). > --tb=style traceback print mode (auto/long/short/line/native/no). > --show-capture={no,stdout,stderr,log,all} > Controls how captured stdout/stderr/log is shown on > failed tests. Default is 'all'. > --full-trace don't cut any tracebacks (default is to cut). > --color=color color terminal output (yes/no/auto). > --code-highlight={yes,no} > Whether code should be highlighted (only if --color is > also enabled) > --pastebin=mode send failed|all info to bpaste.net pastebin service. > --junit-xml=path create junit-xml style report file at given path. > --junit-prefix=str prepend prefix to classnames in junit-xml output > >pytest-warnings: > -W PYTHONWARNINGS, --pythonwarnings=PYTHONWARNINGS > set which warnings to report, see -W option of python > itself. > --maxfail=num exit after first num failures or errors. > --strict-config any warnings encountered while parsing the `pytest` > section of the configuration file raise errors. > --strict-markers markers not registered in the `markers` section of the > configuration file raise errors. > --strict (deprecated) alias to --strict-markers. > -c file load configuration from `file` instead of trying to > locate one of the implicit configuration files. > --continue-on-collection-errors > Force test execution even if collection errors occur. > --rootdir=ROOTDIR Define root directory for tests. Can be relative path: > 'root_dir', './root_dir', 'root_dir/another_dir/'; > absolute path: '/home/user/root_dir'; path with > variables: '$HOME/root_dir'. > >collection: > --collect-only, --co only collect tests, don't execute them. > --pyargs try to interpret all arguments as python packages. > --ignore=path ignore path during collection (multi-allowed). > --ignore-glob=path ignore path pattern during collection (multi-allowed). > --deselect=nodeid_prefix > deselect item (via node id prefix) during collection > (multi-allowed). > --confcutdir=dir only load conftest.py's relative to specified dir. > --noconftest Don't load any conftest.py files. > --keep-duplicates Keep duplicate tests. 
> --collect-in-virtualenv > Don't ignore tests in a local virtualenv directory > --import-mode={prepend,append,importlib} > prepend/append to sys.path when importing test modules > and conftest files, default is to prepend. > --doctest-modules run doctests in all .py modules > --doctest-report={none,cdiff,ndiff,udiff,only_first_failure} > choose another output format for diffs on doctest > failure > --doctest-glob=pat doctests file matching pattern, default: test*.txt > --doctest-ignore-import-errors > ignore doctest ImportErrors > --doctest-continue-on-failure > for a given doctest, continue to run after the first > failure > >test session debugging and configuration: > --basetemp=dir base temporary directory for this test run.(warning: > this directory is removed if it exists) > -V, --version display pytest version and information about plugins. > When given twice, also display information about > plugins. > -h, --help show help message and configuration info > -p name early-load given plugin module name or entry point > (multi-allowed). > To avoid loading of plugins, use the `no:` prefix, e.g. > `no:doctest`. > --trace-config trace considerations of conftest.py files. > --debug=[DEBUG_FILE_NAME] > store internal tracing debug information in this log > file. > This file is opened with 'w' and truncated as a result, > care advised. > Defaults to 'pytestdebug.log'. > -o OVERRIDE_INI, --override-ini=OVERRIDE_INI > override ini option with "option=value" style, e.g. `-o > xfail_strict=True -o cache_dir=cache`. > --assert=MODE Control assertion debugging tools. > 'plain' performs no assertion debugging. > 'rewrite' (the default) rewrites assert statements in > test modules on import to provide assert expression > information. > --setup-only only setup fixtures, do not execute tests. > --setup-show show setup of fixtures while executing tests. > --setup-plan show what fixtures and tests would be executed but don't > execute anything. > >logging: > --log-level=LEVEL level of messages to catch/display. > Not set by default, so it depends on the root/parent log > handler's effective level, where it is "WARNING" by > default. > --log-format=LOG_FORMAT > log format as used by the logging module. > --log-date-format=LOG_DATE_FORMAT > log date format as used by the logging module. > --log-cli-level=LOG_CLI_LEVEL > cli logging level. > --log-cli-format=LOG_CLI_FORMAT > log format as used by the logging module. > --log-cli-date-format=LOG_CLI_DATE_FORMAT > log date format as used by the logging module. > --log-file=LOG_FILE path to a file when logging will be written to. > --log-file-level=LOG_FILE_LEVEL > log file logging level. > --log-file-format=LOG_FILE_FORMAT > log format as used by the logging module. > --log-file-date-format=LOG_FILE_DATE_FORMAT > log date format as used by the logging module. > --log-auto-indent=LOG_AUTO_INDENT > Auto-indent multiline messages passed to the logging > module. Accepts true|on, false|off or an integer. > >[pytest] ini-options in the first pytest.ini|tox.ini|setup.cfg file found: > > markers (linelist): markers for test functions > empty_parameter_set_mark (string): > default marker for empty parametersets > norecursedirs (args): directory patterns to avoid for recursion > testpaths (args): directories to search for tests when no files or > directories are given in the command line. > filterwarnings (linelist): > Each line specifies a pattern for > warnings.filterwarnings. Processed after > -W/--pythonwarnings. 
> usefixtures (args): list of default fixtures to be used with this project > python_files (args): glob-style file patterns for Python test module > discovery > python_classes (args): > prefixes or glob names for Python test class discovery > python_functions (args): > prefixes or glob names for Python test function and > method discovery > disable_test_id_escaping_and_forfeit_all_rights_to_community_support (bool): > disable string escape non-ascii characters, might cause > unwanted side effects(use at your own risk) > console_output_style (string): > console output: "classic", or with additional progress > information ("progress" (percentage) | "count"). > xfail_strict (bool): default for the strict parameter of xfail markers when > not given explicitly (default: False) > enable_assertion_pass_hook (bool): > Enables the pytest_assertion_pass hook.Make sure to > delete any previously generated pyc cache files. > junit_suite_name (string): > Test suite name for JUnit report > junit_logging (string): > Write captured log messages to JUnit report: one of > no|log|system-out|system-err|out-err|all > junit_log_passing_tests (bool): > Capture log information for passing tests to JUnit > report: > junit_duration_report (string): > Duration time to report: one of total|call > junit_family (string): > Emit XML for schema: one of legacy|xunit1|xunit2 > doctest_optionflags (args): > option flags for doctests > doctest_encoding (string): > encoding used for doctest files > cache_dir (string): cache directory path. > log_level (string): default value for --log-level > log_format (string): default value for --log-format > log_date_format (string): > default value for --log-date-format > log_cli (bool): enable log display during test run (also known as "live > logging"). > log_cli_level (string): > default value for --log-cli-level > log_cli_format (string): > default value for --log-cli-format > log_cli_date_format (string): > default value for --log-cli-date-format > log_file (string): default value for --log-file > log_file_level (string): > default value for --log-file-level > log_file_format (string): > default value for --log-file-format > log_file_date_format (string): > default value for --log-file-date-format > log_auto_indent (string): > default value for --log-auto-indent > pythonpath (paths): Add paths to sys.path > faulthandler_timeout (string): > Dump the traceback of all threads if a test takes more > than TIMEOUT seconds to finish. 
> addopts (args): extra command line options > minversion (string): minimally required pytest version > required_plugins (args): > plugins that must be present for pytest to run > >environment variables: > PYTEST_ADDOPTS extra command line options > PYTEST_PLUGINS comma-separated plugins to load during startup > PYTEST_DISABLE_PLUGIN_AUTOLOAD set to disable plugin auto-loading > PYTEST_DEBUG set to enable debug tracing of pytest's internals > > >to see available markers type: pytest --markers >to see available fixtures type: pytest --fixtures >(shown according to specified file_or_dir or current dir if not specified; fixtures with leading '_' are only shown with the '-v' option >_________________________________ test_groups __________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:76: in test_groups > result.stdout.fnmatch_lines([ >E Failed: nomatch: '*collected 5 items' >E and: '============================= test session starts ==============================' >E and: 'platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /usr/bin/python3.8' >E and: 'cachedir: .pytest_cache' >E and: 'rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0' >E and: 'plugins: aspectlib-1.5.2' >E fnmatch: '*collected 5 items' >E with: 'collecting ... collected 5 items' >E fnmatch: '*' >E with: '' >E fnmatch: 'test_groups.py::*test_groups PASSED*' >E with: 'test_groups.py::test_groups PASSED [ 20%]' >E nomatch: 'test_groups.py::test_fast PASSED*' >E and: 'test_groups.py::test_fast ERROR [ 40%]' >E and: 'test_groups.py::test_slow ERROR [ 60%]' >E and: 'test_groups.py::test_slower ERROR [ 80%]' >E and: 'test_groups.py::test_xfast ERROR [100%]' >E and: '' >E and: '==================================== ERRORS ====================================' >E and: '_________________________ ERROR at setup of test_fast __________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py, line 8' >E and: ' def test_fast(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py:8' >E and: '_________________________ ERROR at setup of test_slow __________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py, line 12' >E and: ' def test_slow(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." 
>E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py:12' >E and: '________________________ ERROR at setup of test_slower _________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py, line 16' >E and: ' @pytest.mark.benchmark(group="A")' >E and: ' def test_slower(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py:16' >E and: '_________________________ ERROR at setup of test_xfast _________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py, line 21' >E and: ' @pytest.mark.benchmark(group="A", warmup=True)' >E and: ' def test_xfast(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py:21' >E and: '=============================== warnings summary ===============================' >E and: 'test_groups.py:16' >E and: ' /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py:16: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html' >E and: ' @pytest.mark.benchmark(group="A")' >E and: '' >E and: 'test_groups.py:21' >E and: ' /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py:21: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html' >E and: ' @pytest.mark.benchmark(group="A", warmup=True)' >E and: '' >E and: '-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html' >E and: '=========================== short test summary info ============================' >E and: 'ERROR test_groups.py::test_fast' >E and: 'ERROR test_groups.py::test_slow' >E and: 'ERROR test_groups.py::test_slower' >E and: 'ERROR test_groups.py::test_xfast' >E and: '=================== 1 passed, 2 warnings, 4 errors in 0.12s ====================' >E remains unmatched: 'test_groups.py::test_fast PASSED*' > result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=64 len(stderr.lines)=0 duration=0.66s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/runpytest-0 -vv --doctest-modules /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0 >============================= test session starts ============================== >platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /usr/bin/python3.8 >cachedir: .pytest_cache >rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0 >plugins: aspectlib-1.5.2 >collecting ... collected 5 items > >test_groups.py::test_groups PASSED [ 20%] >test_groups.py::test_fast ERROR [ 40%] >test_groups.py::test_slow ERROR [ 60%] >test_groups.py::test_slower ERROR [ 80%] >test_groups.py::test_xfast ERROR [100%] > >==================================== ERRORS ==================================== >_________________________ ERROR at setup of test_fast __________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py, line 8 > def test_fast(benchmark): >E fixture 'benchmark' not found >> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py:8 >_________________________ ERROR at setup of test_slow __________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py, line 12 > def test_slow(benchmark): >E fixture 'benchmark' not found >> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py:12 >________________________ ERROR at setup of test_slower _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py, line 16 > @pytest.mark.benchmark(group="A") > def test_slower(benchmark): >E fixture 'benchmark' not found >> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py:16 >_________________________ ERROR at setup of test_xfast _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py, line 21 > @pytest.mark.benchmark(group="A", warmup=True) > def test_xfast(benchmark): >E fixture 'benchmark' not found >> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py:21 >=============================== warnings summary =============================== >test_groups.py:16 > /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py:16: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html > @pytest.mark.benchmark(group="A") > >test_groups.py:21 > /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_groups0/test_groups.py:21: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html > @pytest.mark.benchmark(group="A", warmup=True) > >-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html >=========================== short test summary info ============================ >ERROR test_groups.py::test_fast >ERROR test_groups.py::test_slow >ERROR test_groups.py::test_slower >ERROR test_groups.py::test_xfast >=================== 1 passed, 2 warnings, 4 errors in 0.12s ==================== >______________________________ test_group_by_name ______________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:164: in test_group_by_name > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.51s> > test_x = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_name0/test_x.py') > test_y = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_name0/test_y.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_name0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_name0/runpytest-0 --benchmark-max-time=0.0000001 --benchmark-group-by name /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_name0/test_x.py /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_name0/test_y.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_name0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-group-by > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_name0 > >______________________________ test_group_by_func ______________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:201: in test_group_by_func > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test_x = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_func0/test_x.py') > test_y = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_func0/test_y.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_func0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_func0/runpytest-0 --benchmark-max-time=0.0000001 --benchmark-group-by func /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_func0/test_x.py /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_func0/test_y.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_func0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-group-by > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_func0 > >____________________________ test_group_by_fullfunc ____________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:230: in test_group_by_fullfunc > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.50s> > test_x = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullfunc0/test_x.py') > test_y = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullfunc0/test_y.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullfunc0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullfunc0/runpytest-0 --benchmark-max-time=0.0000001 --benchmark-group-by fullfunc /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullfunc0/test_x.py /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullfunc0/test_y.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullfunc0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-group-by > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullfunc0 > >___________________________ test_group_by_param_all ____________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:271: in test_group_by_param_all > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test_x = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_all0/test_x.py') > test_y = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_all0/test_y.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_all0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_all0/runpytest-0 --benchmark-max-time=0.0000001 --benchmark-group-by param /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_all0/test_x.py /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_all0/test_y.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_all0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-group-by > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_all0 > >__________________________ test_group_by_param_select __________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:304: in test_group_by_param_select > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.53s> > test_x = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select0/test_x.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select0/runpytest-0 --benchmark-max-time=0.0000001 --benchmark-group-by param:foo --benchmark-sort fullname /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select0/test_x.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-group-by --benchmark-sort fullname /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select0/test_x.py > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select0 > >_____________________ test_group_by_param_select_multiple ______________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:337: in test_group_by_param_select_multiple > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.50s> > test_x = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select_multiple0/test_x.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select_multiple0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select_multiple0/runpytest-0 --benchmark-max-time=0.0000001 --benchmark-group-by param:foo,param:bar --benchmark-sort fullname /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select_multiple0/test_x.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select_multiple0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-group-by --benchmark-sort fullname /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select_multiple0/test_x.py > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_param_select_multiple0 > >____________________________ test_group_by_fullname ____________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:379: in test_group_by_fullname > result.stdout.fnmatch_lines_random([ >E Failed: line "* benchmark 'test_x.py::test_a[[]0[]]': 1 tests *" not found in output > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.48s> > test_x = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullname0/test_x.py') > test_y = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullname0/test_y.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullname0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullname0/runpytest-0 --benchmark-max-time=0.0000001 --benchmark-group-by fullname /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullname0/test_x.py /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullname0/test_y.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullname0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-group-by > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_group_by_fullname0 > >_______________________________ test_double_use ________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:403: in test_double_use > result.stdout.fnmatch_lines([ >E Failed: nomatch: '*FixtureAlreadyUsed: Fixture can only be used once. Previously it was used in benchmark(...) mode.' 
>E and: '============================= test session starts ==============================' >E and: 'platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0' >E and: 'rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0' >E and: 'plugins: aspectlib-1.5.2' >E and: 'collected 2 items' >E and: '' >E and: 'test_double_use.py EE [100%]' >E and: '' >E and: '==================================== ERRORS ====================================' >E and: '___________________________ ERROR at setup of test_a ___________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0/test_double_use.py, line 1' >E and: ' def test_a(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0/test_double_use.py:1' >E and: '___________________________ ERROR at setup of test_b ___________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0/test_double_use.py, line 5' >E and: ' def test_b(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0/test_double_use.py:5' >E and: '=========================== short test summary info ============================' >E and: 'ERROR test_double_use.py::test_a' >E and: 'ERROR test_double_use.py::test_b' >E and: '============================== 2 errors in 0.05s ===============================' >E remains unmatched: '*FixtureAlreadyUsed: Fixture can only be used once. Previously it was used in benchmark(...) mode.' 
> result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=29 len(stderr.lines)=0 duration=0.56s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0/test_double_use.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0/runpytest-0 /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0/test_double_use.py --tb=line > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0 >============================= test session starts ============================== >platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 >rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0 >plugins: aspectlib-1.5.2 >collected 2 items > >test_double_use.py EE [100%] > >==================================== ERRORS ==================================== >___________________________ ERROR at setup of test_a ___________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0/test_double_use.py, line 1 > def test_a(benchmark): >E fixture 'benchmark' not found >> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0/test_double_use.py:1 >___________________________ ERROR at setup of test_b ___________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0/test_double_use.py, line 5 > def test_b(benchmark): >E fixture 'benchmark' not found >> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_double_use0/test_double_use.py:5 >=========================== short test summary info ============================ >ERROR test_double_use.py::test_a >ERROR test_double_use.py::test_b >============================== 2 errors in 0.05s =============================== >___________________________ test_only_override_skip ____________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:412: in test_only_override_skip > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*collected 2 items' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_only_override_skip0/test_only_override_skip.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_only_override_skip0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_only_override_skip0/runpytest-0 --benchmark-only --benchmark-skip /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_only_override_skip0/test_only_override_skip.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_only_override_skip0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-only --benchmark-skip > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_only_override_skip0 > >__________________________ test_fixtures_also_skipped __________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:428: in test_fixtures_also_skipped > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*collected 2 items' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.48s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_fixtures_also_skipped0/test_fixtures_also_skipped.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_fixtures_also_skipped0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_fixtures_also_skipped0/runpytest-0 --benchmark-only -s /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_fixtures_also_skipped0/test_fixtures_also_skipped.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_fixtures_also_skipped0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-only > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_fixtures_also_skipped0 > >____________________ test_conflict_between_only_and_disable ____________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:438: in test_conflict_between_only_and_disable > result.stderr.fnmatch_lines([ >E Failed: nomatch: "ERROR: Can't have both --benchmark-only and --benchmark-disable options. Note that --benchmark-disable is automatically activated if xdist is on or you're missing the statistics dependency." >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-only --benchmark-disable' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_conflict_between_only_and_disable0' >E and: '' >E remains unmatched: "ERROR: Can't have both --benchmark-only and --benchmark-disable options. Note that --benchmark-disable is automatically activated if xdist is on or you're missing the statistics dependency." > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_conflict_between_only_and_disable0/test_conflict_between_only_and_disable.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_conflict_between_only_and_disable0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_conflict_between_only_and_disable0/runpytest-0 --benchmark-only --benchmark-disable /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_conflict_between_only_and_disable0/test_conflict_between_only_and_disable.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_conflict_between_only_and_disable0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-only --benchmark-disable > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_conflict_between_only_and_disable0 > >___________________________ test_max_time_min_rounds ___________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:448: in test_max_time_min_rounds > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*collected 3 items' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.48s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_max_time_min_rounds0/test_max_time_min_rounds.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_max_time_min_rounds0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_max_time_min_rounds0/runpytest-0 --doctest-modules --benchmark-max-time=0.000001 --benchmark-min-rounds=1 /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_max_time_min_rounds0/test_max_time_min_rounds.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_max_time_min_rounds0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.000001 --benchmark-min-rounds=1 > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_max_time_min_rounds0 > >________________________________ test_max_time _________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:464: in test_max_time > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*collected 3 items' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.53s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_max_time0/test_max_time.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_max_time0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_max_time0/runpytest-0 --doctest-modules --benchmark-max-time=0.000001 /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_max_time0/test_max_time.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_max_time0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-max-time=0.000001 > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_max_time0 > >_____________________________ test_bogus_max_time ______________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:480: in test_bogus_max_time > result.stderr.fnmatch_lines([ >E Failed: fnmatch: '*usage: *' >E with: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E nomatch: "*: error: argument --benchmark-max-time: Invalid decimal value 'bogus': InvalidOperation*" >E and: '__main__.py: error: unrecognized arguments: --benchmark-max-time=bogus' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_max_time0' >E and: '' >E remains unmatched: "*: error: argument --benchmark-max-time: Invalid decimal value 'bogus': InvalidOperation*" > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_max_time0/test_bogus_max_time.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_max_time0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_max_time0/runpytest-0 --doctest-modules --benchmark-max-time=bogus /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_max_time0/test_bogus_max_time.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_max_time0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-max-time=bogus > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_max_time0 > >______________________________ test_pep418_timer _______________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:491: in test_pep418_timer > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '* (defaults: timer=*.perf_counter*' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_pep418_timer0/test_pep418_timer.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_pep418_timer0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_pep418_timer0/runpytest-0 --benchmark-max-time=0.0000001 --doctest-modules --benchmark-timer=pep418.perf_counter /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_pep418_timer0/test_pep418_timer.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_pep418_timer0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-timer=pep418.perf_counter > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_pep418_timer0 > >________________________________ test_bad_save _________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:499: in test_bad_save > result.stderr.fnmatch_lines([ >E Failed: fnmatch: '*usage: *' >E with: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E nomatch: "*: error: argument --benchmark-save: Must not contain any of these characters: /:*?<>|\\ (it has ':?')" >E and: '__main__.py: error: unrecognized arguments: --benchmark-save=asd:f?' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save0' >E and: '' >E remains unmatched: "*: error: argument --benchmark-save: Must not contain any of these characters: /:*?<>|\\ (it has ':?')" > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save0/test_bad_save.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save0/runpytest-0 --doctest-modules --benchmark-save=asd:f? 
/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save0/test_bad_save.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-save=asd:f? > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save0 > >_______________________________ test_bad_save_2 ________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:508: in test_bad_save_2 > result.stderr.fnmatch_lines([ >E Failed: fnmatch: '*usage: *' >E with: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E nomatch: "*: error: argument --benchmark-save: Can't be empty." >E and: '__main__.py: error: unrecognized arguments: --benchmark-save=' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save_20' >E and: '' >E remains unmatched: "*: error: argument --benchmark-save: Can't be empty." > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.48s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save_20/test_bad_save_2.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save_20')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save_20/runpytest-0 --doctest-modules --benchmark-save= /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save_20/test_bad_save_2.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save_20 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-save= > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_save_20 > >____________________________ test_bad_compare_fail _____________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:517: in test_bad_compare_fail > result.stderr.fnmatch_lines([ >E Failed: fnmatch: '*usage: *' >E with: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E nomatch: "*: error: argument --benchmark-compare-fail: Could not parse value: '?'." >E and: '__main__.py: error: unrecognized arguments: --benchmark-compare-fail=?' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_compare_fail0' >E and: '' >E remains unmatched: "*: error: argument --benchmark-compare-fail: Could not parse value: '?'." 
> result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.51s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_compare_fail0/test_bad_compare_fail.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_compare_fail0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_compare_fail0/runpytest-0 --doctest-modules --benchmark-compare-fail=? /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_compare_fail0/test_bad_compare_fail.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_compare_fail0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-compare-fail=? > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_compare_fail0 > >_______________________________ test_bad_rounds ________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:526: in test_bad_rounds > result.stderr.fnmatch_lines([ >E Failed: fnmatch: '*usage: *' >E with: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E nomatch: "*: error: argument --benchmark-min-rounds: invalid literal for int() with base 10: 'asd'" >E and: '__main__.py: error: unrecognized arguments: --benchmark-min-rounds=asd' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds0' >E and: '' >E remains unmatched: "*: error: argument --benchmark-min-rounds: invalid literal for int() with base 10: 'asd'" > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.48s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds0/test_bad_rounds.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds0/runpytest-0 --doctest-modules --benchmark-min-rounds=asd /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds0/test_bad_rounds.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-min-rounds=asd > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds0 > >______________________________ test_bad_rounds_2 _______________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:535: in test_bad_rounds_2 > result.stderr.fnmatch_lines([ >E Failed: fnmatch: '*usage: *' >E with: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E nomatch: '*: error: argument --benchmark-min-rounds: Value for --benchmark-rounds must be at least 1.' >E and: '__main__.py: error: unrecognized arguments: --benchmark-min-rounds=0' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds_20' >E and: '' >E remains unmatched: '*: error: argument --benchmark-min-rounds: Value for --benchmark-rounds must be at least 1.' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds_20/test_bad_rounds_2.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds_20')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds_20/runpytest-0 --doctest-modules --benchmark-min-rounds=0 /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds_20/test_bad_rounds_2.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds_20 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-min-rounds=0 > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bad_rounds_20 > >_________________________________ test_compare _________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:547: in test_compare > result.stderr.fnmatch_lines([ >E Failed: nomatch: 'Comparing against benchmarks from: *0001_unversioned_*.json' >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare=0001 --benchmark-compare-fail=min:0.1' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare0' >E and: '' >E remains unmatched: 'Comparing against benchmarks from: *0001_unversioned_*.json' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.50s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare0/test_compare.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare0/runpytest-0 --benchmark-max-time=0.0000001 --doctest-modules --benchmark-autosave /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare0/test_compare.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare0 >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare0/runpytest-1 --benchmark-max-time=0.0000001 --doctest-modules --benchmark-compare=0001 --benchmark-compare-fail=min:0.1 /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare0/test_compare.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-autosave > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare0 > >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare=0001 --benchmark-compare-fail=min:0.1 > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare0 > >______________________________ test_compare_last _______________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:563: in test_compare_last > result.stderr.fnmatch_lines([ >E Failed: nomatch: 'Comparing against benchmarks from: *0001_unversioned_*.json' >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare --benchmark-compare-fail=min:0.1' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_last0' >E and: '' >E remains unmatched: 'Comparing against benchmarks from: *0001_unversioned_*.json' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.51s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_last0/test_compare_last.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_last0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_last0/runpytest-0 --benchmark-max-time=0.0000001 --doctest-modules --benchmark-autosave /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_last0/test_compare_last.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_last0 >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_last0/runpytest-1 --benchmark-max-time=0.0000001 --doctest-modules --benchmark-compare --benchmark-compare-fail=min:0.1 /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_last0/test_compare_last.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_last0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-autosave > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_last0 > >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare --benchmark-compare-fail=min:0.1 > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_last0 > >__________________________ test_compare_non_existing ___________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:579: in test_compare_non_existing > result.stderr.fnmatch_lines([ >E Failed: nomatch: "* PytestBenchmarkWarning: Can't compare. No benchmark files * '0002'." 
>E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare=0002' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing0' >E and: '' >E remains unmatched: "* PytestBenchmarkWarning: Can't compare. No benchmark files * '0002'." > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing0/test_compare_non_existing.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing0/runpytest-0 --benchmark-max-time=0.0000001 --doctest-modules --benchmark-autosave /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing0/test_compare_non_existing.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing0 >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing0/runpytest-1 --benchmark-max-time=0.0000001 --doctest-modules --benchmark-compare=0002 -rw /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing0/test_compare_non_existing.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-autosave > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing0 > >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare=0002 > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing0 > >______________________ test_compare_non_existing_verbose _______________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:590: in test_compare_non_existing_verbose > result.stderr.fnmatch_lines([ >E Failed: nomatch: " WARNING: Can't compare. No benchmark files * '0002'." >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare=0002 --benchmark-verbose' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing_verbose0' >E and: '' >E remains unmatched: " WARNING: Can't compare. No benchmark files * '0002'." 
> result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing_verbose0/test_compare_non_existing_verbose.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing_verbose0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing_verbose0/runpytest-0 --benchmark-max-time=0.0000001 --doctest-modules --benchmark-autosave /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing_verbose0/test_compare_non_existing_verbose.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing_verbose0 >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing_verbose0/runpytest-1 --benchmark-max-time=0.0000001 --doctest-modules --benchmark-compare=0002 /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing_verbose0/test_compare_non_existing_verbose.py --benchmark-verbose > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing_verbose0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-autosave > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing_verbose0 > >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare=0002 --benchmark-verbose > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_non_existing_verbose0 > >____________________________ test_compare_no_files _____________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:599: in test_compare_no_files > result.stderr.fnmatch_lines([ >E Failed: nomatch: "* PytestBenchmarkWarning: Can't compare. No benchmark files in '*'. Can't load the previous benchmark." >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files0' >E and: '' >E remains unmatched: "* PytestBenchmarkWarning: Can't compare. No benchmark files in '*'. Can't load the previous benchmark." 
> result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.50s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files0/test_compare_no_files.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files0/runpytest-0 --benchmark-max-time=0.0000001 --doctest-modules -rw /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files0/test_compare_no_files.py --benchmark-compare > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files0 > >________________________ test_compare_no_files_verbose _________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:608: in test_compare_no_files_verbose > result.stderr.fnmatch_lines([ >E Failed: nomatch: " WARNING: Can't compare. No benchmark files in '*'. Can't load the previous benchmark." >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare --benchmark-verbose' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_verbose0' >E and: '' >E remains unmatched: " WARNING: Can't compare. No benchmark files in '*'. Can't load the previous benchmark." > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.50s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_verbose0/test_compare_no_files_verbose.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_verbose0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_verbose0/runpytest-0 --benchmark-max-time=0.0000001 --doctest-modules /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_verbose0/test_compare_no_files_verbose.py --benchmark-compare --benchmark-verbose > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_verbose0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare --benchmark-verbose > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_verbose0 > >_________________________ test_compare_no_files_match __________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:618: in test_compare_no_files_match > result.stderr.fnmatch_lines([ >E Failed: nomatch: "* PytestBenchmarkWarning: Can't compare. No benchmark files in '*' match '1'." >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare=1' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match0' >E and: '' >E remains unmatched: "* PytestBenchmarkWarning: Can't compare. No benchmark files in '*' match '1'." > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.48s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match0/test_compare_no_files_match.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match0/runpytest-0 --benchmark-max-time=0.0000001 --doctest-modules -rw /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match0/test_compare_no_files_match.py --benchmark-compare=1 > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare=1 > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match0 > >_____________________ test_compare_no_files_match_verbose ______________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:627: in test_compare_no_files_match_verbose > result.stderr.fnmatch_lines([ >E Failed: nomatch: " WARNING: Can't compare. No benchmark files in '*' match '1'." >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare=1 --benchmark-verbose' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match_verbose0' >E and: '' >E remains unmatched: " WARNING: Can't compare. No benchmark files in '*' match '1'." 
> result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match_verbose0/test_compare_no_files_match_verbose.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match_verbose0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match_verbose0/runpytest-0 --benchmark-max-time=0.0000001 --doctest-modules /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match_verbose0/test_compare_no_files_match_verbose.py --benchmark-compare=1 --benchmark-verbose > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match_verbose0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-compare=1 --benchmark-verbose > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare_no_files_match_verbose0 > >_________________________________ test_verbose _________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:636: in test_verbose > result.stderr.fnmatch_lines([ >E Failed: nomatch: ' Calibrating to target round *s; will estimate when reaching *s (using: *, precision: *).' >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-verbose' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_verbose0' >E and: '' >E remains unmatched: ' Calibrating to target round *s; will estimate when reaching *s (using: *, precision: *).' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_verbose0/test_verbose.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_verbose0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_verbose0/runpytest-0 --benchmark-max-time=0.0000001 --doctest-modules --benchmark-verbose -vv /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_verbose0/test_verbose.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_verbose0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-max-time=0.0000001 --benchmark-verbose > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_verbose0 > >__________________________________ test_save ___________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:648: in test_save > result.stderr.fnmatch_lines([ >E Failed: nomatch: 'Saved benchmark data in: *' >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-save=foobar --benchmark-max-time=0.0000001' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save0' >E and: '' >E remains unmatched: 'Saved benchmark data in: *' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.48s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save0/test_save.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save0/runpytest-0 --doctest-modules --benchmark-save=foobar --benchmark-max-time=0.0000001 /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save0/test_save.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-save=foobar --benchmark-max-time=0.0000001 > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save0 > >_____________________________ test_save_extra_info _____________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:662: in test_save_extra_info > result.stderr.fnmatch_lines([ >E Failed: nomatch: 'Saved benchmark data in: *' >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-save=foobar --benchmark-max-time=0.0000001' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save_extra_info0' >E and: '' >E remains unmatched: 'Saved benchmark data in: *' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.50s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save_extra_info0/test_save_extra_info.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save_extra_info0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save_extra_info0/runpytest-0 --doctest-modules --benchmark-save=foobar --benchmark-max-time=0.0000001 /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save_extra_info0/test_save_extra_info.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save_extra_info0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-save=foobar --benchmark-max-time=0.0000001 > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_save_extra_info0 > >___________________ test_update_machine_info_hook_detection ____________________ >/usr/lib/python3.8/site-packages/py/_error.py:66: in checked_call > return func(*args, **kwargs) >E FileNotFoundError: [Errno 2] No such file or directory: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_update_machine_info_hook_detection0/benchmark.json' > __tracebackhide__ = False > args = ('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_update_machine_info_hook_detection0/benchmark.json', > 'r') > cls = <class 'py.error.ENOENT'> > errno = 2 > func = <built-in function open> > kwargs = {} > self = <module 'py.error'> > tb = <traceback object at 0x7f9303b36880> > value = FileNotFoundError(2, 'No such file or directory') > >During handling of the above exception, another exception occurred: >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:720: in test_update_machine_info_hook_detection > run_verify_pytest("test_module/tests") > record_path_conftest = ('\n' > 'import os\n' > '\n' > 'def pytest_benchmark_update_machine_info(config, machine_info):\n' > ' machine_info["conftest_path"] = (\n' > ' machine_info.get("conftest_path", []) + [os.path.relpath(__file__)]\n' > ' )\n' > ' ') > run_verify_pytest = <function test_update_machine_info_hook_detection.<locals>.run_verify_pytest at 0x7f9303c1f0d0> > simple_test = ('\n' > 'def test_simple(benchmark):\n' > ' @benchmark\n' > ' def resuilt():\n' > ' 1+1\n' > ' ') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_update_machine_info_hook_detection0')> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:709: in run_verify_pytest > benchmark_json = json.loads(testdir.tmpdir.join('benchmark.json').read()) > args = ('test_module/tests',) > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_update_machine_info_hook_detection0')> >/usr/lib/python3.8/site-packages/py/_path/common.py:176: in read > with self.open(mode) as f: > mode = 'r' > self = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_update_machine_info_hook_detection0/benchmark.json') >/usr/lib/python3.8/site-packages/py/_path/local.py:369: in open > return py.error.checked_call(open, self.strpath, mode) > encoding = None > ensure = False > mode = 'r' > self = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_update_machine_info_hook_detection0/benchmark.json') >/usr/lib/python3.8/site-packages/py/_error.py:86: in checked_call > raise cls("%s%r" % (func.__name__, args)) >E py.error.ENOENT: [No such file or directory]: open('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_update_machine_info_hook_detection0/benchmark.json', 'r') > __tracebackhide__ = False > args = ('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_update_machine_info_hook_detection0/benchmark.json', > 'r') > cls = <class 'py.error.ENOENT'> > errno = 2 > func = <built-in function open> > kwargs = {} > self = <module 'py.error'> > tb = 
<traceback object at 0x7f9303b36880> > value = FileNotFoundError(2, 'No such file or directory') >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_update_machine_info_hook_detection0/runpytest-0 --benchmark-json=benchmark.json --benchmark-max-time=0.0000001 test_module/tests > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_update_machine_info_hook_detection0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-json=benchmark.json --benchmark-max-time=0.0000001 > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_update_machine_info_hook_detection0 > >________________________________ test_histogram ________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:729: in test_histogram > result.stderr.fnmatch_lines([ >E Failed: nomatch: 'Generated histogram: *foobar.svg' >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-histogram=foobar --benchmark-max-time=0.0000001' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_histogram0' >E and: '' >E remains unmatched: 'Generated histogram: *foobar.svg' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_histogram0/test_histogram.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_histogram0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_histogram0/runpytest-0 --doctest-modules --benchmark-histogram=foobar --benchmark-max-time=0.0000001 /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_histogram0/test_histogram.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_histogram0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-histogram=foobar --benchmark-max-time=0.0000001 > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_histogram0 > >________________________________ test_autosave _________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:741: in test_autosave > result.stderr.fnmatch_lines([ >E Failed: nomatch: 'Saved benchmark data in: *' >E and: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E and: '__main__.py: error: unrecognized arguments: --benchmark-autosave --benchmark-max-time=0.0000001' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_autosave0' >E and: '' >E remains unmatched: 'Saved benchmark data in: *' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_autosave0/test_autosave.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_autosave0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_autosave0/runpytest-0 --doctest-modules --benchmark-autosave --benchmark-max-time=0.0000001 /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_autosave0/test_autosave.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_autosave0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-autosave --benchmark-max-time=0.0000001 > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_autosave0 > >_____________________________ test_bogus_min_time ______________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:750: in test_bogus_min_time > result.stderr.fnmatch_lines([ >E Failed: fnmatch: '*usage: *' >E with: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E nomatch: "*: error: argument --benchmark-min-time: Invalid decimal value 'bogus': InvalidOperation*" >E and: '__main__.py: error: unrecognized arguments: --benchmark-min-time=bogus' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_min_time0' >E and: '' >E remains unmatched: "*: error: argument --benchmark-min-time: Invalid decimal value 'bogus': InvalidOperation*" > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_min_time0/test_bogus_min_time.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_min_time0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_min_time0/runpytest-0 --doctest-modules --benchmark-min-time=bogus /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_min_time0/test_bogus_min_time.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_min_time0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-min-time=bogus > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_min_time0 > >_______________________________ test_disable_gc ________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:759: in test_disable_gc > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*collected 2 items' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disable_gc0/test_disable_gc.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disable_gc0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disable_gc0/runpytest-0 --benchmark-disable-gc /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disable_gc0/test_disable_gc.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disable_gc0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-disable-gc > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disable_gc0 > >______________________________ test_custom_timer _______________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:775: in test_custom_timer > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*collected 2 items' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.47s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_custom_timer0/test_custom_timer.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_custom_timer0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_custom_timer0/runpytest-0 --benchmark-timer=time.time /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_custom_timer0/test_custom_timer.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_custom_timer0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-timer=time.time > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_custom_timer0 > >_______________________________ test_bogus_timer _______________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:791: in test_bogus_timer > result.stderr.fnmatch_lines([ >E Failed: fnmatch: '*usage: *' >E with: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E nomatch: "*: error: argument --benchmark-timer: Value for --benchmark-timer must be in dotted form. Eg: 'module.attr'." >E and: '__main__.py: error: unrecognized arguments: --benchmark-timer=bogus' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_timer0' >E and: '' >E remains unmatched: "*: error: argument --benchmark-timer: Value for --benchmark-timer must be in dotted form. Eg: 'module.attr'." > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.47s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_timer0/test_bogus_timer.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_timer0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_timer0/runpytest-0 --benchmark-timer=bogus /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_timer0/test_bogus_timer.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_timer0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-timer=bogus > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_timer0 > >______________________________ test_sort_by_mean _______________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:801: in test_sort_by_mean > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*collected 2 items' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.48s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_sort_by_mean0/test_sort_by_mean.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_sort_by_mean0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_sort_by_mean0/runpytest-0 --benchmark-sort=mean /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_sort_by_mean0/test_sort_by_mean.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_sort_by_mean0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-sort=mean > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_sort_by_mean0 > >_______________________________ test_bogus_sort ________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:817: in test_bogus_sort > result.stderr.fnmatch_lines([ >E Failed: fnmatch: '*usage: *' >E with: 'ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...]' >E nomatch: "*: error: argument --benchmark-sort: Unacceptable value: 'bogus'. Value for --benchmark-sort must be one of: 'min', 'max', 'mean', 'stddev', 'name', 'fullname'." >E and: '__main__.py: error: unrecognized arguments: --benchmark-sort=bogus' >E and: ' inifile: None' >E and: ' rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_sort0' >E and: '' >E remains unmatched: "*: error: argument --benchmark-sort: Unacceptable value: 'bogus'. Value for --benchmark-sort must be one of: 'min', 'max', 'mean', 'stddev', 'name', 'fullname'." 
> result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.50s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_sort0/test_bogus_sort.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_sort0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_sort0/runpytest-0 --benchmark-sort=bogus /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_sort0/test_bogus_sort.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_sort0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-sort=bogus > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_bogus_sort0 > >________________________________ test_cprofile _________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:849: in test_cprofile > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '------------*----------- cProfile (time in s) ------------*-----------' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.48s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_cprofile0/test_cprofile.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_cprofile0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_cprofile0/runpytest-0 --benchmark-cprofile=cumtime /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_cprofile0/test_cprofile.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_cprofile0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-cprofile=cumtime > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_cprofile0 > >__________________________ test_disabled_and_cprofile __________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:868: in test_disabled_and_cprofile > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*==== 2 passed*' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.50s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disabled_and_cprofile0/test_disabled_and_cprofile.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disabled_and_cprofile0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disabled_and_cprofile0/runpytest-0 --benchmark-disable --benchmark-cprofile=cumtime /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disabled_and_cprofile0/test_disabled_and_cprofile.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disabled_and_cprofile0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-disable --benchmark-cprofile=cumtime > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disabled_and_cprofile0 > >______________________________ test_abort_broken _______________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:908: in test_abort_broken > result.stdout.fnmatch_lines([ >E Failed: nomatch: '*collected 5 items' >E and: '============================= test session starts ==============================' >E and: 'platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /usr/bin/python3.8' >E and: 'cachedir: .pytest_cache' >E and: 'rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0' >E and: 'plugins: aspectlib-1.5.2' >E fnmatch: '*collected 5 items' >E with: 'collecting ... 
collected 5 items' >E fnmatch: '*' >E with: '' >E nomatch: 'test_abort_broken.py::test_bad FAILED*' >E and: 'test_abort_broken.py::test_bad ERROR [ 20%]' >E and: 'test_abort_broken.py::test_bad2 ERROR [ 40%]' >E and: 'test_abort_broken.py::test_ok[a] ERROR [ 60%]' >E and: 'test_abort_broken.py::test_ok[b] ERROR [ 80%]' >E and: 'test_abort_broken.py::test_ok[c] ERROR [100%]' >E and: '' >E and: '==================================== ERRORS ====================================' >E and: '__________________________ ERROR at setup of test_bad __________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py, line 8' >E and: ' def test_bad(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: bad_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py:8' >E and: '_________________________ ERROR at setup of test_bad2 __________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py, line 14' >E and: ' def test_bad2(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: bad_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py:14' >E and: '_________________________ ERROR at setup of test_ok[a] _________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py, line 24' >E and: ' def test_ok(benchmark, bad_fixture):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: bad_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." 
>E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py:24' >E and: '_________________________ ERROR at setup of test_ok[b] _________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py, line 24' >E and: ' def test_ok(benchmark, bad_fixture):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: bad_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py:24' >E and: '_________________________ ERROR at setup of test_ok[c] _________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py, line 24' >E and: ' def test_ok(benchmark, bad_fixture):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: bad_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py:24' >E and: '=========================== short test summary info ============================' >E and: 'ERROR test_abort_broken.py::test_bad' >E and: 'ERROR test_abort_broken.py::test_bad2' >E and: 'ERROR test_abort_broken.py::test_ok[a]' >E and: 'ERROR test_abort_broken.py::test_ok[b]' >E and: 'ERROR test_abort_broken.py::test_ok[c]' >E and: '============================== 5 errors in 0.06s ===============================' >E remains unmatched: 'test_abort_broken.py::test_bad FAILED*' > result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=61 len(stderr.lines)=0 duration=0.59s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/runpytest-0 -vv /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0 >============================= test session starts ============================== >platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /usr/bin/python3.8 >cachedir: .pytest_cache >rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0 >plugins: aspectlib-1.5.2 >collecting ... 
collected 5 items > >test_abort_broken.py::test_bad ERROR [ 20%] >test_abort_broken.py::test_bad2 ERROR [ 40%] >test_abort_broken.py::test_ok[a] ERROR [ 60%] >test_abort_broken.py::test_ok[b] ERROR [ 80%] >test_abort_broken.py::test_ok[c] ERROR [100%] > >==================================== ERRORS ==================================== >__________________________ ERROR at setup of test_bad __________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py, line 8 > def test_bad(benchmark): >E fixture 'benchmark' not found >> available fixtures: bad_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py:8 >_________________________ ERROR at setup of test_bad2 __________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py, line 14 > def test_bad2(benchmark): >E fixture 'benchmark' not found >> available fixtures: bad_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py:14 >_________________________ ERROR at setup of test_ok[a] _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py, line 24 > def test_ok(benchmark, bad_fixture): >E fixture 'benchmark' not found >> available fixtures: bad_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py:24 >_________________________ ERROR at setup of test_ok[b] _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py, line 24 > def test_ok(benchmark, bad_fixture): >E fixture 'benchmark' not found >> available fixtures: bad_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py:24 >_________________________ ERROR at setup of test_ok[c] _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py, line 24 > def test_ok(benchmark, bad_fixture): >E fixture 'benchmark' not found >> available fixtures: bad_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_abort_broken0/test_abort_broken.py:24 >=========================== short test summary info ============================ >ERROR test_abort_broken.py::test_bad >ERROR test_abort_broken.py::test_bad2 >ERROR test_abort_broken.py::test_ok[a] >ERROR test_abort_broken.py::test_ok[b] >ERROR test_abort_broken.py::test_ok[c] >============================== 5 errors in 0.06s =============================== >__________________________________ test_basic __________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:1028: in test_basic > result.stdout.fnmatch_lines([ >E Failed: nomatch: '*collected 5 items' >E and: '============================= test session starts ==============================' >E and: 'platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /usr/bin/python3.8' >E and: 'cachedir: .pytest_cache' >E and: 'rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0' >E and: 'plugins: aspectlib-1.5.2' >E fnmatch: '*collected 5 items' >E with: 'collecting ... collected 5 items' >E nomatch: 'test_basic.py::*test_basic PASSED*' >E and: '' >E fnmatch: 'test_basic.py::*test_basic PASSED*' >E with: 'test_basic.py::test_basic PASSED [ 20%]' >E nomatch: 'test_basic.py::test_slow PASSED*' >E and: 'test_basic.py::test_fast ERROR [ 40%]' >E and: 'test_basic.py::test_slow ERROR [ 60%]' >E and: 'test_basic.py::test_slower ERROR [ 80%]' >E and: 'test_basic.py::test_xfast ERROR [100%]' >E and: '' >E and: '==================================== ERRORS ====================================' >E and: '_________________________ ERROR at setup of test_fast __________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py, line 27' >E and: ' def test_fast(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." 
>E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py:27' >E and: '_________________________ ERROR at setup of test_slow __________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py, line 17' >E and: ' def test_slow(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py:17' >E and: '________________________ ERROR at setup of test_slower _________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py, line 20' >E and: ' def test_slower(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py:20' >E and: '_________________________ ERROR at setup of test_xfast _________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py, line 23' >E and: ' @pytest.mark.benchmark(min_rounds=2)' >E and: ' def test_xfast(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py:23' >E and: '=============================== warnings summary ===============================' >E and: 'test_basic.py:23' >E and: ' /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py:23: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html' >E and: ' @pytest.mark.benchmark(min_rounds=2)' >E and: '' >E and: '-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html' >E and: '=========================== short test summary info ============================' >E and: 'ERROR test_basic.py::test_fast' >E and: 'ERROR test_basic.py::test_slow' >E and: 'ERROR test_basic.py::test_slower' >E and: 'ERROR test_basic.py::test_xfast' >E and: '==================== 1 passed, 1 warning, 4 errors in 0.12s ====================' >E remains unmatched: 'test_basic.py::test_slow PASSED*' > result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=59 len(stderr.lines)=0 duration=0.65s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/runpytest-0 -vv --doctest-modules /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0 >============================= test session starts ============================== >platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /usr/bin/python3.8 >cachedir: .pytest_cache >rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0 >plugins: aspectlib-1.5.2 >collecting ... collected 5 items > >test_basic.py::test_basic PASSED [ 20%] >test_basic.py::test_fast ERROR [ 40%] >test_basic.py::test_slow ERROR [ 60%] >test_basic.py::test_slower ERROR [ 80%] >test_basic.py::test_xfast ERROR [100%] > >==================================== ERRORS ==================================== >_________________________ ERROR at setup of test_fast __________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py, line 27 > def test_fast(benchmark): >E fixture 'benchmark' not found >> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py:27 >_________________________ ERROR at setup of test_slow __________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py, line 17 > def test_slow(benchmark): >E fixture 'benchmark' not found >> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. 
> >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py:17 >________________________ ERROR at setup of test_slower _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py, line 20 > def test_slower(benchmark): >E fixture 'benchmark' not found >> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py:20 >_________________________ ERROR at setup of test_xfast _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py, line 23 > @pytest.mark.benchmark(min_rounds=2) > def test_xfast(benchmark): >E fixture 'benchmark' not found >> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py:23 >=============================== warnings summary =============================== >test_basic.py:23 > /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_basic0/test_basic.py:23: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html > @pytest.mark.benchmark(min_rounds=2) > >-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html >=========================== short test summary info ============================ >ERROR test_basic.py::test_fast >ERROR test_basic.py::test_slow >ERROR test_basic.py::test_slower >ERROR test_basic.py::test_xfast >==================== 1 passed, 1 warning, 4 errors in 0.12s ==================== >__________________________________ test_skip ___________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:1052: in test_skip > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*collected 5 items' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.50s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_skip0/test_skip.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_skip0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_skip0/runpytest-0 -vv --doctest-modules --benchmark-skip /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_skip0/test_skip.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_skip0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-skip > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_skip0 > >_________________________________ test_disable _________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:1066: in test_disable > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*collected 5 items' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.49s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disable0/test_disable.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disable0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disable0/runpytest-0 -vv --doctest-modules --benchmark-disable /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disable0/test_disable.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disable0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-disable > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_disable0 > >_____________________________ test_mark_selection ______________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:1080: in test_mark_selection > result.stdout.fnmatch_lines([ >E Failed: nomatch: '*collected 5 items*' >E and: '============================= test session starts ==============================' >E and: 'platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /usr/bin/python3.8' >E and: 'cachedir: .pytest_cache' >E and: 'rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0' >E and: 'plugins: aspectlib-1.5.2' >E fnmatch: '*collected 5 items*' >E with: 'collecting ... collected 5 items / 4 deselected / 1 selected' >E nomatch: 'test_mark_selection.py::test_xfast PASSED*' >E and: '' >E and: 'test_mark_selection.py::test_xfast ERROR [100%]' >E and: '' >E and: '==================================== ERRORS ====================================' >E and: '_________________________ ERROR at setup of test_xfast _________________________' >E and: 'file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0/test_mark_selection.py, line 23' >E and: ' @pytest.mark.benchmark(min_rounds=2)' >E and: ' def test_xfast(benchmark):' >E and: "E fixture 'benchmark' not found" >E and: '> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave' >E and: "> use 'pytest --fixtures [testpath]' for help on them." >E and: '' >E and: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0/test_mark_selection.py:23' >E and: '=============================== warnings summary ===============================' >E and: 'test_mark_selection.py:23' >E and: ' /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0/test_mark_selection.py:23: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html' >E and: ' @pytest.mark.benchmark(min_rounds=2)' >E and: '' >E and: '-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html' >E and: '=========================== short test summary info ============================' >E and: 'ERROR test_mark_selection.py::test_xfast' >E and: '================== 4 deselected, 1 warning, 1 error in 0.11s ===================' >E remains unmatched: 'test_mark_selection.py::test_xfast PASSED*' > result = <RunResult ret=ExitCode.TESTS_FAILED len(stdout.lines)=28 len(stderr.lines)=0 duration=0.67s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0/test_mark_selection.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0/runpytest-0 -vv --doctest-modules -m benchmark /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0/test_mark_selection.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0 >============================= test session starts ============================== >platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /usr/bin/python3.8 >cachedir: .pytest_cache >rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0 >plugins: aspectlib-1.5.2 >collecting ... collected 5 items / 4 deselected / 1 selected > >test_mark_selection.py::test_xfast ERROR [100%] > >==================================== ERRORS ==================================== >_________________________ ERROR at setup of test_xfast _________________________ >file /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0/test_mark_selection.py, line 23 > @pytest.mark.benchmark(min_rounds=2) > def test_xfast(benchmark): >E fixture 'benchmark' not found >> available fixtures: cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, weave >> use 'pytest --fixtures [testpath]' for help on them. > >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0/test_mark_selection.py:23 >=============================== warnings summary =============================== >test_mark_selection.py:23 > /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_mark_selection0/test_mark_selection.py:23: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo? 
You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html > @pytest.mark.benchmark(min_rounds=2) > >-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html >=========================== short test summary info ============================ >ERROR test_mark_selection.py::test_xfast >================== 4 deselected, 1 warning, 1 error in 0.11s =================== >_____________________________ test_only_benchmarks _____________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:1095: in test_only_benchmarks > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*collected 5 items' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.50s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_only_benchmarks0/test_only_benchmarks.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_only_benchmarks0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_only_benchmarks0/runpytest-0 -vv --doctest-modules --benchmark-only /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_only_benchmarks0/test_only_benchmarks.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_only_benchmarks0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] >__main__.py: error: unrecognized arguments: --benchmark-only > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_only_benchmarks0 > >_________________________________ test_columns _________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_benchmark.py:1117: in test_columns > result.stdout.fnmatch_lines([ >E Failed: remains unmatched: '*collected 3 items' > result = <RunResult ret=ExitCode.USAGE_ERROR len(stdout.lines)=0 len(stderr.lines)=5 duration=0.48s> > test = local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_columns0/test_columns.py') > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_columns0')> >----------------------------- Captured stdout call ----------------------------- >running: /usr/bin/python3.8 -mpytest --basetemp=/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_columns0/runpytest-0 --doctest-modules --benchmark-columns=max,iterations,min /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_columns0/test_columns.py > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_columns0 >----------------------------- Captured stderr call ----------------------------- >ERROR: usage: __main__.py [options] [file_or_dir] [file_or_dir] [...] 
>__main__.py: error: unrecognized arguments: --benchmark-columns=max,iterations,min > inifile: None > rootdir: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_columns0 > >__________________________________ test_list ___________________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_cli.py:137: in test_list > result = testdir.run('py.test-benchmark', '--storage', STORAGE, 'list') > testdir = testdir(tmpdir=local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_list0'), run=<function testdir.<locals>.<lambda> at 0x7f9303aa49d0>) >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_cli.py:18: in <lambda> > lambda bin, *args: testdir.run(bin+".exe" if sys.platform == "win32" else bin, *args)) > args = ('--storage', > local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage'), > 'list') > bin = 'py.test-benchmark' > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_list0')> >/usr/lib/python3.8/site-packages/_pytest/legacypath.py:226: in run > return self._pytester.run(*cmdargs, timeout=timeout, stdin=stdin) > cmdargs = ('py.test-benchmark', > '--storage', > local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage'), > 'list') > self = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_list0')> > stdin = <NotSetType.token: 0> > timeout = None >/usr/lib/python3.8/site-packages/_pytest/pytester.py:1338: in popen > popen = subprocess.Popen(cmdargs, stdout=stdout, stderr=stderr, **kw) > cmdargs = ('py.test-benchmark', > '--storage', > '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage', > 'list') > errors = None > errread = -1 > errwrite = 12 > executable = None > f = <_io.BufferedWriter name=14> > p2cread = 13 > p2cwrite = 14 > pass_fds = () > preexec_fn = None > restore_signals = True > self = <subprocess.Popen object at 0x7f9303931d90> > shell = False > start_new_session = False > startupinfo = None > stderr = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_list0/stderr' mode='w' encoding='utf8'> > stdin = -1 > stdout = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_list0/stdout' mode='w' encoding='utf8'> > text = None > universal_newlines = None >/usr/lib/python3.8/subprocess.py:1704: in _execute_child > raise child_exception_type(errno_num, err_msg, err_filename) >E FileNotFoundError: [Errno 2] No such file or directory: 'py.test-benchmark' > args = ['py.test-benchmark', > '--storage', > '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage', > 'list'] > c2pread = -1 > c2pwrite = 11 > child_exception_type = <class 'OSError'> > child_exec_never_called = False > close_fds = True > creationflags = 0 > cwd = None > err_filename = 'py.test-benchmark' > err_msg = 'No such file or directory' > errno_num = 2 > errpipe_data = bytearray(b'OSError:2:') > errpipe_read = 15 > errpipe_write = 16 > errread = -1 > errwrite = 12 > exception_name = bytearray(b'OSError') > executable = b'py.test-benchmark' > executable_list = (b'/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/python3.8/bin/py.tes' > 
b't-benchmark', > b'/usr/lib/portage/python3.10/ebuild-helpers/xattr/py.test-benchmark', > b'/usr/lib/portage/python3.10/ebuild-helpers/py.test-benchmark', > b'/usr/local/sbin/py.test-benchmark', > b'/usr/local/bin/py.test-benchmark', > b'/usr/sbin/py.test-benchmark', > b'/usr/bin/py.test-benchmark', > b'/sbin/py.test-benchmark', > b'/bin/py.test-benchmark', > b'/opt/bin/py.test-benchmark') > fds_to_keep = {16} > hex_errno = bytearray(b'2') > k = b'PY_COLORS' > low_fds_to_close = [] > orig_executable = 'py.test-benchmark' > p2cread = 13 > p2cwrite = 14 > part = b'' > pass_fds = () > pid = 180 > preexec_fn = None > restore_signals = True > self = <subprocess.Popen object at 0x7f9303931d90> > shell = False > start_new_session = False > startupinfo = None > sts = 65280 > v = '0' >----------------------------- Captured stdout call ----------------------------- >running: py.test-benchmark --storage /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage list > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_list0 >_________________________ test_compare[short-<lambda>] _________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_cli.py:181: in test_compare > result = testdir.run('py.test-benchmark', '--storage', STORAGE, 'compare', '0001', '0002', '0003', > name = 'short' > name_pattern_generator = <function <lambda> at 0x7f9304602280> > testdir = testdir(tmpdir=local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare1'), run=<function testdir.<locals>.<lambda> at 0x7f930383ef70>) >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_cli.py:18: in <lambda> > lambda bin, *args: testdir.run(bin+".exe" if sys.platform == "win32" else bin, *args)) > args = ('--storage', > local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage'), > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'short', > '--histogram', > 'foobar', > '--csv', > 'foobar') > bin = 'py.test-benchmark' > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare1')> >/usr/lib/python3.8/site-packages/_pytest/legacypath.py:226: in run > return self._pytester.run(*cmdargs, timeout=timeout, stdin=stdin) > cmdargs = ('py.test-benchmark', > '--storage', > local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage'), > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'short', > '--histogram', > 'foobar', > '--csv', > 'foobar') > self = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare1')> > stdin = <NotSetType.token: 0> > timeout = None >/usr/lib/python3.8/site-packages/_pytest/pytester.py:1338: in popen > popen = subprocess.Popen(cmdargs, stdout=stdout, stderr=stderr, **kw) > cmdargs = ('py.test-benchmark', > '--storage', > '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage', > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'short', > '--histogram', > 'foobar', > '--csv', > 'foobar') > 'stdin': -1} > self = <Pytester 
PosixPath('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare1')> > stderr = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare1/stderr' mode='w' encoding='utf8'> > stdin = <NotSetType.token: 0> > stdout = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare1/stdout' mode='w' encoding='utf8'> >/usr/lib/python3.8/subprocess.py:858: in __init__ > self._execute_child(args, executable, preexec_fn, close_fds, > args = ('py.test-benchmark', > '--storage', > '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage', > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'short', > '--histogram', > 'foobar', > '--csv', > 'foobar') > bufsize = -1 > c2pread = -1 > c2pwrite = 11 > close_fds = True > creationflags = 0 > cwd = None > encoding = None > errors = None > errread = -1 > errwrite = 12 > executable = None > f = <_io.BufferedWriter name=14> > p2cread = 13 > p2cwrite = 14 > pass_fds = () > preexec_fn = None > restore_signals = True > self = <subprocess.Popen object at 0x7f93034692b0> > shell = False > start_new_session = False > startupinfo = None > stderr = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare1/stderr' mode='w' encoding='utf8'> > stdin = -1 > stdout = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare1/stdout' mode='w' encoding='utf8'> > text = None > universal_newlines = None >/usr/lib/python3.8/subprocess.py:1704: in _execute_child > raise child_exception_type(errno_num, err_msg, err_filename) >E FileNotFoundError: [Errno 2] No such file or directory: 'py.test-benchmark' > args = ['py.test-benchmark', > '--storage', > '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage', > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'short', > '--histogram', > 'foobar', > '--csv', > 'foobar'] > c2pread = -1 > c2pwrite = 11 > child_exception_type = <class 'OSError'> > child_exec_never_called = False > close_fds = True > creationflags = 0 > cwd = None > err_filename = 'py.test-benchmark' > err_msg = 'No such file or directory' > errno_num = 2 > errpipe_data = bytearray(b'OSError:2:') > errpipe_read = 15 > errpipe_write = 16 > errread = -1 > errwrite = 12 > exception_name = bytearray(b'OSError') > executable = b'py.test-benchmark' > executable_list = (b'/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/python3.8/bin/py.tes' > b't-benchmark', > b'/usr/lib/portage/python3.10/ebuild-helpers/xattr/py.test-benchmark', > b'/usr/lib/portage/python3.10/ebuild-helpers/py.test-benchmark', > b'/usr/local/sbin/py.test-benchmark', > b'/usr/local/bin/py.test-benchmark', > b'/usr/sbin/py.test-benchmark', > b'/usr/bin/py.test-benchmark', > b'/sbin/py.test-benchmark', > b'/bin/py.test-benchmark', > b'/opt/bin/py.test-benchmark') > fds_to_keep = {16} > hex_errno = bytearray(b'2') > k = b'PY_COLORS' > low_fds_to_close = [] > orig_executable = 'py.test-benchmark' > p2cread = 13 > p2cwrite = 14 > part = b'' > pass_fds = () > pid = 181 > preexec_fn = None > restore_signals = True > self = <subprocess.Popen object at 0x7f93034692b0> > shell = False > start_new_session = False > startupinfo = None > 
sts = 65280 > v = '0' >----------------------------- Captured stdout call ----------------------------- >running: py.test-benchmark --storage /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage compare 0001 0002 0003 --sort min --columns min,max --name short --histogram foobar --csv foobar > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare1 >_________________________ test_compare[long-<lambda>] __________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_cli.py:181: in test_compare > result = testdir.run('py.test-benchmark', '--storage', STORAGE, 'compare', '0001', '0002', '0003', > name = 'long' > name_pattern_generator = <function <lambda> at 0x7f9304602310> > testdir = testdir(tmpdir=local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare2'), run=<function testdir.<locals>.<lambda> at 0x7f9303aa45e0>) >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_cli.py:18: in <lambda> > lambda bin, *args: testdir.run(bin+".exe" if sys.platform == "win32" else bin, *args)) > args = ('--storage', > local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage'), > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'long', > '--histogram', > 'foobar', > '--csv', > 'foobar') > bin = 'py.test-benchmark' > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare2')> >/usr/lib/python3.8/site-packages/_pytest/legacypath.py:226: in run > return self._pytester.run(*cmdargs, timeout=timeout, stdin=stdin) > cmdargs = ('py.test-benchmark', > '--storage', > local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage'), > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'long', > '--histogram', > 'foobar', > '--csv', > 'foobar') > self = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare2')> > stdin = <NotSetType.token: 0> > timeout = None >/usr/lib/python3.8/site-packages/_pytest/pytester.py:1338: in popen > popen = subprocess.Popen(cmdargs, stdout=stdout, stderr=stderr, **kw) > cmdargs = ('py.test-benchmark', > '--storage', > '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage', > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'long', > '--histogram', > 'foobar', > '--csv', > 'foobar') > 'stdin': -1} > self = <Pytester PosixPath('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare2')> > stderr = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare2/stderr' mode='w' encoding='utf8'> > stdin = <NotSetType.token: 0> > stdout = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare2/stdout' mode='w' encoding='utf8'> >/usr/lib/python3.8/subprocess.py:858: in __init__ > self._execute_child(args, executable, preexec_fn, close_fds, > args = ('py.test-benchmark', > '--storage', > '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage', > 
'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'long', > '--histogram', > 'foobar', > '--csv', > 'foobar') > bufsize = -1 > c2pread = -1 > c2pwrite = 11 > close_fds = True > creationflags = 0 > cwd = None > encoding = None > err_filename = 'py.test-benchmark' > err_msg = 'No such file or directory' > errno_num = 2 > errpipe_data = bytearray(b'OSError:2:') > errpipe_read = 15 > errpipe_write = 16 > errread = -1 > errwrite = 12 > exception_name = bytearray(b'OSError') > executable = b'py.test-benchmark' > executable_list = (b'/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/python3.8/bin/py.tes' > b't-benchmark', > b'/usr/lib/portage/python3.10/ebuild-helpers/xattr/py.test-benchmark', > b'/usr/lib/portage/python3.10/ebuild-helpers/py.test-benchmark', > b'/usr/local/sbin/py.test-benchmark', > b'/usr/local/bin/py.test-benchmark', > b'/usr/sbin/py.test-benchmark', > b'/usr/bin/py.test-benchmark', > b'/sbin/py.test-benchmark', > b'/bin/py.test-benchmark', > b'/opt/bin/py.test-benchmark') > fds_to_keep = {16} > hex_errno = bytearray(b'2') > k = b'PY_COLORS' > low_fds_to_close = [] > orig_executable = 'py.test-benchmark' > p2cread = 13 > p2cwrite = 14 > part = b'' > pass_fds = () > pid = 182 > preexec_fn = None > restore_signals = True > self = <subprocess.Popen object at 0x7f9303823b20> > shell = False > start_new_session = False > startupinfo = None > sts = 65280 > v = '0' >----------------------------- Captured stdout call ----------------------------- >running: py.test-benchmark --storage /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage compare 0001 0002 0003 --sort min --columns min,max --name long --histogram foobar --csv foobar > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare2 >________________________ test_compare[normal-<lambda>] _________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_cli.py:181: in test_compare > result = testdir.run('py.test-benchmark', '--storage', STORAGE, 'compare', '0001', '0002', '0003', > name = 'normal' > name_pattern_generator = <function <lambda> at 0x7f93046023a0> > testdir = testdir(tmpdir=local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare3'), run=<function testdir.<locals>.<lambda> at 0x7f93039833a0>) >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_cli.py:18: in <lambda> > lambda bin, *args: testdir.run(bin+".exe" if sys.platform == "win32" else bin, *args)) > args = ('--storage', > local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage'), > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'normal', > '--histogram', > 'foobar', > '--csv', > 'foobar') > bin = 'py.test-benchmark' > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare3')> >/usr/lib/python3.8/site-packages/_pytest/legacypath.py:226: in run > return self._pytester.run(*cmdargs, timeout=timeout, stdin=stdin) > cmdargs = ('py.test-benchmark', > '--storage', > local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage'), > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'normal', > '--histogram', > 'foobar', > 
'--csv', > 'foobar') > self = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare3')> > stdin = <NotSetType.token: 0> > timeout = None >/usr/lib/python3.8/site-packages/_pytest/pytester.py:1338: in popen > popen = subprocess.Popen(cmdargs, stdout=stdout, stderr=stderr, **kw) > cmdargs = ('py.test-benchmark', > '--storage', > '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage', > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'normal', > '--histogram', > 'foobar', > '--csv', > 'foobar') > errors = None > errread = -1 > errwrite = 12 > executable = None > f = <_io.BufferedWriter name=14> > p2cread = 13 > p2cwrite = 14 > pass_fds = () > preexec_fn = None > restore_signals = True > self = <subprocess.Popen object at 0x7f930384e100> > shell = False > start_new_session = False > startupinfo = None > stderr = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare3/stderr' mode='w' encoding='utf8'> > stdin = -1 > stdout = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare3/stdout' mode='w' encoding='utf8'> > text = None > universal_newlines = None >/usr/lib/python3.8/subprocess.py:1704: in _execute_child > raise child_exception_type(errno_num, err_msg, err_filename) >E FileNotFoundError: [Errno 2] No such file or directory: 'py.test-benchmark' > args = ['py.test-benchmark', > '--storage', > '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage', > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'normal', > '--histogram', > 'foobar', > '--csv', > 'foobar'] > c2pread = -1 > c2pwrite = 11 > child_exception_type = <class 'OSError'> > child_exec_never_called = False > close_fds = True > creationflags = 0 > cwd = None > err_filename = 'py.test-benchmark' > err_msg = 'No such file or directory' > errno_num = 2 > errpipe_data = bytearray(b'OSError:2:') > errpipe_read = 15 > errpipe_write = 16 > errread = -1 > errwrite = 12 > exception_name = bytearray(b'OSError') > executable = b'py.test-benchmark' > executable_list = (b'/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/python3.8/bin/py.tes' > b't-benchmark', > b'/usr/lib/portage/python3.10/ebuild-helpers/xattr/py.test-benchmark', > b'/usr/lib/portage/python3.10/ebuild-helpers/py.test-benchmark', > b'/usr/local/sbin/py.test-benchmark', > b'/usr/local/bin/py.test-benchmark', > b'/usr/sbin/py.test-benchmark', > b'/usr/bin/py.test-benchmark', > b'/sbin/py.test-benchmark', > b'/bin/py.test-benchmark', > b'/opt/bin/py.test-benchmark') > fds_to_keep = {16} > hex_errno = bytearray(b'2') > k = b'PY_COLORS' > low_fds_to_close = [] > orig_executable = 'py.test-benchmark' > p2cread = 13 > p2cwrite = 14 > part = b'' > pass_fds = () > pid = 183 > preexec_fn = None > restore_signals = True > self = <subprocess.Popen object at 0x7f930384e100> > shell = False > start_new_session = False > startupinfo = None > sts = 65280 > v = '0' >----------------------------- Captured stdout call ----------------------------- >running: py.test-benchmark --storage /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage compare 0001 0002 0003 --sort min --columns min,max --name normal --histogram foobar --csv foobar > in: 
/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare3 >_________________________ test_compare[trial-<lambda>] _________________________ >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_cli.py:181: in test_compare > result = testdir.run('py.test-benchmark', '--storage', STORAGE, 'compare', '0001', '0002', '0003', > name = 'trial' > name_pattern_generator = <function <lambda> at 0x7f9304602430> > testdir = testdir(tmpdir=local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare4'), run=<function testdir.<locals>.<lambda> at 0x7f9303b5faf0>) >/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_cli.py:18: in <lambda> > lambda bin, *args: testdir.run(bin+".exe" if sys.platform == "win32" else bin, *args)) > args = ('--storage', > local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage'), > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'trial', > '--histogram', > 'foobar', > '--csv', > 'foobar') > bin = 'py.test-benchmark' > testdir = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare4')> >/usr/lib/python3.8/site-packages/_pytest/legacypath.py:226: in run > return self._pytester.run(*cmdargs, timeout=timeout, stdin=stdin) > cmdargs = ('py.test-benchmark', > '--storage', > local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage'), > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'trial', > '--histogram', > 'foobar', > '--csv', > 'foobar') > self = <Testdir local('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare4')> > stdin = <NotSetType.token: 0> > timeout = None >/usr/lib/python3.8/site-packages/_pytest/pytester.py:1338: in popen > popen = subprocess.Popen(cmdargs, stdout=stdout, stderr=stderr, **kw) > cmdargs = ('py.test-benchmark', > '--storage', > '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage', > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'trial', > '--histogram', > 'foobar', > '--csv', > 'foobar') > 'stdin': -1} > self = <Pytester PosixPath('/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare4')> > stderr = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare4/stderr' mode='w' encoding='utf8'> > stdin = <NotSetType.token: 0> > stdout = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare4/stdout' mode='w' encoding='utf8'> >/usr/lib/python3.8/subprocess.py:858: in __init__ > self._execute_child(args, executable, preexec_fn, close_fds, > args = ('py.test-benchmark', > '--storage', > '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage', > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'trial', > '--histogram', > 'foobar', > '--csv', > 'foobar') > bufsize = -1 > c2pread = -1 > c2pwrite = 11 > close_fds = True > creationflags = 0 > cwd = None > encoding = None > errors = None > errread = -1 > errwrite = 12 > executable = None 
> f = <_io.BufferedWriter name=14> > p2cread = 13 > p2cwrite = 14 > pass_fds = () > preexec_fn = None > restore_signals = True > self = <subprocess.Popen object at 0x7f9303a4f490> > shell = False > start_new_session = False > startupinfo = None > stderr = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare4/stderr' mode='w' encoding='utf8'> > stdin = -1 > stdout = <_io.TextIOWrapper name='/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare4/stdout' mode='w' encoding='utf8'> > text = None > universal_newlines = None >/usr/lib/python3.8/subprocess.py:1704: in _execute_child > raise child_exception_type(errno_num, err_msg, err_filename) >E FileNotFoundError: [Errno 2] No such file or directory: 'py.test-benchmark' > args = ['py.test-benchmark', > '--storage', > '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage', > 'compare', > '0001', > '0002', > '0003', > '--sort', > 'min', > '--columns', > 'min,max', > '--name', > 'trial', > '--histogram', > 'foobar', > '--csv', > 'foobar'] > c2pread = -1 > c2pwrite = 11 > child_exception_type = <class 'OSError'> > child_exec_never_called = False > close_fds = True > creationflags = 0 > cwd = None > err_filename = 'py.test-benchmark' > err_msg = 'No such file or directory' > errno_num = 2 > errpipe_data = bytearray(b'OSError:2:') > errpipe_read = 15 > errpipe_write = 16 > errread = -1 > errwrite = 12 > exception_name = bytearray(b'OSError') > executable = b'py.test-benchmark' > executable_list = (b'/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/python3.8/bin/py.tes' > b't-benchmark', > b'/usr/lib/portage/python3.10/ebuild-helpers/xattr/py.test-benchmark', > b'/usr/lib/portage/python3.10/ebuild-helpers/py.test-benchmark', > b'/usr/local/sbin/py.test-benchmark', > b'/usr/local/bin/py.test-benchmark', > b'/usr/sbin/py.test-benchmark', > b'/usr/bin/py.test-benchmark', > b'/sbin/py.test-benchmark', > b'/bin/py.test-benchmark', > b'/opt/bin/py.test-benchmark') > fds_to_keep = {16} > hex_errno = bytearray(b'2') > k = b'PY_COLORS' > low_fds_to_close = [] > orig_executable = 'py.test-benchmark' > p2cread = 13 > p2cwrite = 14 > part = b'' > pass_fds = () > pid = 184 > preexec_fn = None > restore_signals = True > self = <subprocess.Popen object at 0x7f9303a4f490> > shell = False > start_new_session = False > startupinfo = None > sts = 65280 > v = '0' >----------------------------- Captured stdout call ----------------------------- >running: py.test-benchmark --storage /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_storage compare 0001 0002 0003 --sort min --columns min,max --name trial --histogram foobar --csv foobar > in: /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/pytest-of-portage/pytest-0/test_compare4 >=============================== warnings summary =============================== >../../../../../../../usr/lib/python3.8/site-packages/_pytest/config/__init__.py:1198 > /usr/lib/python3.8/site-packages/_pytest/config/__init__.py:1198: PytestRemovedIn8Warning: The --strict option is deprecated, use --strict-markers instead. > self.issue_config_time_warning( > >tests/test_utils.py:35 > /var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1/tests/test_utils.py:35: PytestDeprecationWarning: @pytest.yield_fixture is deprecated. > Use @pytest.fixture instead; they are the same. 
> @pytest.yield_fixture(params=(True, False)) > >-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html >=========================== short test summary info ============================ >SKIPPED [1] tests/test_benchmark.py:825: could not import 'xdist': No module named 'xdist' >SKIPPED [1] tests/test_benchmark.py:835: could not import 'xdist': No module named 'xdist' >SKIPPED [5] tests/test_utils.py:47: %r not availabe on $PATH >SKIPPED [4] tests/test_utils.py:160: %r not availabe on $PATH >ERROR tests/test_calibration.py::test_calibrate >ERROR tests/test_calibration.py::test_calibrate_fast >ERROR tests/test_calibration.py::test_calibrate_xfast >ERROR tests/test_calibration.py::test_calibrate_slow >ERROR tests/test_calibration.py::test_calibrate_stuck[True-0-1] >ERROR tests/test_calibration.py::test_calibrate_stuck[True-0-0.01] >ERROR tests/test_calibration.py::test_calibrate_stuck[True-0-1e-09] >ERROR tests/test_calibration.py::test_calibrate_stuck[True-0-1e-10] >ERROR tests/test_calibration.py::test_calibrate_stuck[True-0-1.000000000000001] >ERROR tests/test_calibration.py::test_calibrate_stuck[True-1-1] >ERROR tests/test_calibration.py::test_calibrate_stuck[True-1-0.01] >ERROR tests/test_calibration.py::test_calibrate_stuck[True-1-1e-09] >ERROR tests/test_calibration.py::test_calibrate_stuck[True-1-1e-10] >ERROR tests/test_calibration.py::test_calibrate_stuck[True-1-1.000000000000001] >ERROR tests/test_calibration.py::test_calibrate_stuck[True--1-1] >ERROR tests/test_calibration.py::test_calibrate_stuck[True--1-0.01] >ERROR tests/test_calibration.py::test_calibrate_stuck[True--1-1e-09] >ERROR tests/test_calibration.py::test_calibrate_stuck[True--1-1e-10] >ERROR tests/test_calibration.py::test_calibrate_stuck[True--1-1.000000000000001] >ERROR tests/test_calibration.py::test_calibrate_stuck[False-0-1] >ERROR tests/test_calibration.py::test_calibrate_stuck[False-0-0.01] >ERROR tests/test_calibration.py::test_calibrate_stuck[False-0-1e-09] >ERROR tests/test_calibration.py::test_calibrate_stuck[False-0-1e-10] >ERROR tests/test_calibration.py::test_calibrate_stuck[False-0-1.000000000000001] >ERROR tests/test_calibration.py::test_calibrate_stuck[False-1-1] >ERROR tests/test_calibration.py::test_calibrate_stuck[False-1-0.01] >ERROR tests/test_calibration.py::test_calibrate_stuck[False-1-1e-09] >ERROR tests/test_calibration.py::test_calibrate_stuck[False-1-1e-10] >ERROR tests/test_calibration.py::test_calibrate_stuck[False-1-1.000000000000001] >ERROR tests/test_calibration.py::test_calibrate_stuck[False--1-1] >ERROR tests/test_calibration.py::test_calibrate_stuck[False--1-0.01] >ERROR tests/test_calibration.py::test_calibrate_stuck[False--1-1e-09] >ERROR tests/test_calibration.py::test_calibrate_stuck[False--1-1e-10] >ERROR tests/test_calibration.py::test_calibrate_stuck[False--1-1.000000000000001] >ERROR tests/test_normal.py::test_fast >ERROR tests/test_normal.py::test_slow >ERROR tests/test_normal.py::test_slower >ERROR tests/test_normal.py::test_xfast >ERROR tests/test_normal.py::test_parametrized[0] >ERROR tests/test_normal.py::test_parametrized[1] >ERROR tests/test_normal.py::test_parametrized[2] >ERROR tests/test_normal.py::test_parametrized[3] >ERROR tests/test_normal.py::test_parametrized[4] >ERROR tests/test_pedantic.py::test_single >ERROR tests/test_pedantic.py::test_setup >ERROR tests/test_pedantic.py::test_setup_cprofile >ERROR tests/test_pedantic.py::test_args_kwargs >ERROR tests/test_pedantic.py::test_iterations >ERROR tests/test_pedantic.py::test_rounds_iterations 
>ERROR tests/test_pedantic.py::test_rounds >ERROR tests/test_pedantic.py::test_warmup_rounds >ERROR tests/test_pedantic.py::test_rounds_must_be_int[0] >ERROR tests/test_pedantic.py::test_rounds_must_be_int[x] >ERROR tests/test_pedantic.py::test_warmup_rounds_must_be_int[-15] >ERROR tests/test_pedantic.py::test_warmup_rounds_must_be_int[x] >ERROR tests/test_pedantic.py::test_setup_many_rounds >ERROR tests/test_pedantic.py::test_cant_use_both_args_and_setup_with_return >ERROR tests/test_pedantic.py::test_can_use_both_args_and_setup_without_return >ERROR tests/test_pedantic.py::test_cant_use_setup_with_many_iterations >ERROR tests/test_pedantic.py::test_iterations_must_be_positive_int[0] >ERROR tests/test_pedantic.py::test_iterations_must_be_positive_int[-1] >ERROR tests/test_pedantic.py::test_iterations_must_be_positive_int[asdf] >ERROR tests/test_sample.py::test_proto[SimpleProxy] >ERROR tests/test_sample.py::test_proto[CachedPropertyProxy] >ERROR tests/test_sample.py::test_proto[LocalsSimpleProxy] >ERROR tests/test_sample.py::test_proto[LocalsCachedPropertyProxy] >ERROR tests/test_skip.py::test_skip >ERROR tests/test_with_testcase.py::TerribleTerribleWayToWriteTests::test_foo >ERROR tests/test_with_testcase.py::TerribleTerribleWayToWritePatchTests::test_foo2 >ERROR tests/test_with_weaver.py::test_weave_fixture >ERROR tests/test_with_weaver.py::test_weave_method >FAILED tests/test_benchmark.py::test_help - Failed: fnmatch: '*' >FAILED tests/test_benchmark.py::test_groups - Failed: nomatch: '*collected 5 ... >FAILED tests/test_benchmark.py::test_group_by_name - Failed: remains unmatche... >FAILED tests/test_benchmark.py::test_group_by_func - Failed: remains unmatche... >FAILED tests/test_benchmark.py::test_group_by_fullfunc - Failed: remains unma... >FAILED tests/test_benchmark.py::test_group_by_param_all - Failed: remains unm... >FAILED tests/test_benchmark.py::test_group_by_param_select - Failed: remains ... >FAILED tests/test_benchmark.py::test_group_by_param_select_multiple - Failed:... >FAILED tests/test_benchmark.py::test_group_by_fullname - Failed: line "* benc... >FAILED tests/test_benchmark.py::test_double_use - Failed: nomatch: '*FixtureA... >FAILED tests/test_benchmark.py::test_only_override_skip - Failed: remains unm... >FAILED tests/test_benchmark.py::test_fixtures_also_skipped - Failed: remains ... >FAILED tests/test_benchmark.py::test_conflict_between_only_and_disable - Fail... >FAILED tests/test_benchmark.py::test_max_time_min_rounds - Failed: remains un... >FAILED tests/test_benchmark.py::test_max_time - Failed: remains unmatched: '*... >FAILED tests/test_benchmark.py::test_bogus_max_time - Failed: fnmatch: '*usag... >FAILED tests/test_benchmark.py::test_pep418_timer - Failed: remains unmatched... >FAILED tests/test_benchmark.py::test_bad_save - Failed: fnmatch: '*usage: *' >FAILED tests/test_benchmark.py::test_bad_save_2 - Failed: fnmatch: '*usage: *' >FAILED tests/test_benchmark.py::test_bad_compare_fail - Failed: fnmatch: '*us... >FAILED tests/test_benchmark.py::test_bad_rounds - Failed: fnmatch: '*usage: *' >FAILED tests/test_benchmark.py::test_bad_rounds_2 - Failed: fnmatch: '*usage: *' >FAILED tests/test_benchmark.py::test_compare - Failed: nomatch: 'Comparing ag... >FAILED tests/test_benchmark.py::test_compare_last - Failed: nomatch: 'Compari... >FAILED tests/test_benchmark.py::test_compare_non_existing - Failed: nomatch: ... >FAILED tests/test_benchmark.py::test_compare_non_existing_verbose - Failed: n... 
>FAILED tests/test_benchmark.py::test_compare_no_files - Failed: nomatch: "* P... >FAILED tests/test_benchmark.py::test_compare_no_files_verbose - Failed: nomat... >FAILED tests/test_benchmark.py::test_compare_no_files_match - Failed: nomatch... >FAILED tests/test_benchmark.py::test_compare_no_files_match_verbose - Failed:... >FAILED tests/test_benchmark.py::test_verbose - Failed: nomatch: ' Calibratin... >FAILED tests/test_benchmark.py::test_save - Failed: nomatch: 'Saved benchmark... >FAILED tests/test_benchmark.py::test_save_extra_info - Failed: nomatch: 'Save... >FAILED tests/test_benchmark.py::test_update_machine_info_hook_detection - py.... >FAILED tests/test_benchmark.py::test_histogram - Failed: nomatch: 'Generated ... >FAILED tests/test_benchmark.py::test_autosave - Failed: nomatch: 'Saved bench... >FAILED tests/test_benchmark.py::test_bogus_min_time - Failed: fnmatch: '*usag... >FAILED tests/test_benchmark.py::test_disable_gc - Failed: remains unmatched: ... >FAILED tests/test_benchmark.py::test_custom_timer - Failed: remains unmatched... >FAILED tests/test_benchmark.py::test_bogus_timer - Failed: fnmatch: '*usage: *' >FAILED tests/test_benchmark.py::test_sort_by_mean - Failed: remains unmatched... >FAILED tests/test_benchmark.py::test_bogus_sort - Failed: fnmatch: '*usage: *' >FAILED tests/test_benchmark.py::test_cprofile - Failed: remains unmatched: '-... >FAILED tests/test_benchmark.py::test_disabled_and_cprofile - Failed: remains ... >FAILED tests/test_benchmark.py::test_abort_broken - Failed: nomatch: '*collec... >FAILED tests/test_benchmark.py::test_basic - Failed: nomatch: '*collected 5 i... >FAILED tests/test_benchmark.py::test_skip - Failed: remains unmatched: '*coll... >FAILED tests/test_benchmark.py::test_disable - Failed: remains unmatched: '*c... >FAILED tests/test_benchmark.py::test_mark_selection - Failed: nomatch: '*coll... >FAILED tests/test_benchmark.py::test_only_benchmarks - Failed: remains unmatc... >FAILED tests/test_benchmark.py::test_columns - Failed: remains unmatched: '*c... >FAILED tests/test_cli.py::test_list - FileNotFoundError: [Errno 2] No such fi... >FAILED tests/test_cli.py::test_compare[short-<lambda>] - FileNotFoundError: [... >FAILED tests/test_cli.py::test_compare[long-<lambda>] - FileNotFoundError: [E... >FAILED tests/test_cli.py::test_compare[normal-<lambda>] - FileNotFoundError: ... >FAILED tests/test_cli.py::test_compare[trial-<lambda>] - FileNotFoundError: [... 
>= 56 failed, 88 passed, 11 skipped, 6 deselected, 2 warnings, 71 errors in 36.63s = > * ERROR: dev-python/pytest-benchmark-3.4.1::guru failed (test phase): > * pytest failed with python3.8 > * > * Call stack: > * ebuild.sh, line 127: Called src_test > * environment, line 3386: Called distutils-r1_src_test > * environment, line 1629: Called _distutils-r1_run_foreach_impl 'python_test' > * environment, line 694: Called python_foreach_impl 'distutils-r1_run_phase' 'python_test' > * environment, line 3045: Called multibuild_foreach_variant '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test' > * environment, line 2576: Called _multibuild_run '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test' > * environment, line 2574: Called _python_multibuild_wrapper 'distutils-r1_run_phase' 'python_test' > * environment, line 1030: Called distutils-r1_run_phase 'python_test' > * environment, line 1555: Called python_test > * environment, line 3335: Called epytest '-o' 'markers=benchmark' > * environment, line 2087: Called die > * The specific snippet of code: > * "${@}" || die -n "pytest failed with ${EPYTHON}"; > * > * If you need support, post the output of `emerge --info '=dev-python/pytest-benchmark-3.4.1::guru'`, > * the complete build log and the output of `emerge -pqv '=dev-python/pytest-benchmark-3.4.1::guru'`. > * The complete build log is located at '/var/log/emerge-log/build/dev-python/pytest-benchmark-3.4.1:20220602-072725.log'. > * For convenience, a symlink to the build log is located at '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/build.log'. > * The ebuild environment file is located at '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/temp/environment'. > * Working directory: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1' > * S: '/var/tmp/portage/dev-python/pytest-benchmark-3.4.1/work/pytest-benchmark-3.4.1' >