Gentoo's Bugzilla – Attachment 782180 Details for Bug 849233
[guru] dev-python/confluent-kafka-1.8.2-r1 fails tests
build.log
Description: build.log
Filename: build.log
MIME Type: text/plain
Creator: Agostino Sarubbo
Created: 2022-06-02 05:58:02 UTC
Size: 457.20 KB
 * Package: dev-python/confluent-kafka-1.8.2-r1
 * Repository: guru
 * Maintainer: lssndrbarbieri@gmail.com
 * USE: abi_x86_64 amd64 elibc_glibc kernel_linux python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 test userland_GNU
 * FEATURES: network-sandbox preserve-libs sandbox test userpriv usersandbox

@@@@@ PLEASE PAY ATTENTION HERE!!! @@@@@
This information may help you to understand if this is a duplicate or if this bug exists after you pushed a fix;
This ebuild was merged at the following commit:
https://github.com/gentoo/gentoo/commit/0a8a45b7e39c2946f7fbf2645e9697ac94f28d2c (Wed Jun 1 12:04:11 UTC 2022)
@@@@@ END @@@@@

@@@@@ PLEASE PAY ATTENTION HERE!!! @@@@@
This information may help you to understand if this is a duplicate or if this bug exists after you pushed a fix;
This ebuild was merged at the following commit:
https://gitweb.gentoo.org/repo/proj/guru.git/commit/?id=3995b1b59177a12964d0f922eda2aca41a8c2947 (Wed Jun 1 06:39:01 UTC 2022)
@@@@@ END @@@@@

@@@@@ PLEASE PAY ATTENTION HERE!!! @@@@@
This ebuild was merged (directly or as a dependency) because of the following commit:
https://gitweb.gentoo.org/repo/proj/guru.git/commit/?id=da53774afd5a894da52e9587562d07c7b07c0400
@@@@@ END @@@@@

##################
# emerge --info: #
##################
Portage 3.0.30 (python 3.10.4-final-0, default/linux/amd64/17.1, gcc-11.3.0, glibc-2.35-r5, 4.19.174-gentoo x86_64)
=================================================================
System uname: Linux-4.19.174-gentoo-x86_64-Intel-R-_Xeon-R-_CPU_E5-2650_v4_@_2.20GHz-with-glibc2.35
KiB Mem: 264046488 total, 36789000 free
KiB Swap: 0 total, 0 free
sh bash 5.1_p16
ld GNU ld (Gentoo 2.38 p4) 2.38
app-misc/pax-utils: 1.3.4::gentoo
app-shells/bash: 5.1_p16::gentoo
dev-lang/perl: 5.34.1-r3::gentoo
dev-lang/python: 2.7.18_p15::gentoo, 3.8.13_p2::gentoo, 3.9.13::gentoo, 3.10.4_p1::gentoo, 3.11.0_beta2_p1::gentoo
dev-lang/rust: 1.61.0-r1::gentoo
dev-util/cmake: 3.23.2::gentoo
dev-util/meson: 0.62.1::gentoo
sys-apps/baselayout: 2.8::gentoo
sys-apps/openrc: 0.44.10::gentoo
sys-apps/sandbox: 2.29::gentoo
sys-devel/autoconf: 2.71-r1::gentoo
sys-devel/automake: 1.16.5::gentoo
sys-devel/binutils: 2.38-r2::gentoo
sys-devel/binutils-config: 5.4.1::gentoo
sys-devel/gcc: 11.3.0::gentoo
sys-devel/gcc-config: 2.5-r1::gentoo
sys-devel/libtool: 2.4.7::gentoo
sys-devel/make: 4.3::gentoo
sys-kernel/linux-headers: 5.18::gentoo (virtual/os-headers)
sys-libs/glibc: 2.35-r5::gentoo
Repositories:

gentoo
    location: /usr/portage
    sync-type: rsync
    sync-uri: rsync://rsync.gentoo.org/gentoo-portage
    priority: -1000
    sync-rsync-verify-jobs: 1
    sync-rsync-verify-metamanifest: yes
    sync-rsync-extra-opts:
    sync-rsync-verify-max-age: 24

guru
    location: /opt/guru
    masters: gentoo
    priority: 0

ACCEPT_KEYWORDS="amd64 ~amd64"
ACCEPT_LICENSE="* Apache-2.0"
CBUILD="x86_64-pc-linux-gnu"
CFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0"
CHOST="x86_64-pc-linux-gnu"
CONFIG_PROTECT="/etc /usr/share/gnupg/qualified.txt"
CONFIG_PROTECT_MASK="/etc/ca-certificates.conf /etc/env.d /etc/fonts/fonts.conf /etc/gconf /etc/gentoo-release /etc/revdep-rebuild /etc/sandbox.d /etc/terminfo"
CXXFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0"
DISTDIR="/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/distdir"
EMERGE_DEFAULT_OPTS="--with-bdeps=y -1 -k -b"
ENV_UNSET="CARGO_HOME DBUS_SESSION_BUS_ADDRESS DISPLAY GOBIN GOPATH PERL5LIB PERL5OPT PERLPREFIX PERL_CORE PERL_MB_OPT PERL_MM_OPT XAUTHORITY XDG_CACHE_HOME XDG_CONFIG_HOME XDG_DATA_HOME XDG_RUNTIME_DIR"
FCFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0"
FEATURES="assume-digests binpkg-docompress binpkg-dostrip binpkg-logs binpkg-multi-instance buildpkg buildpkg-live config-protect-if-modified distlocks ebuild-locks fixlafiles ipc-sandbox merge-sync multilib-strict network-sandbox news parallel-fetch pid-sandbox preserve-libs protect-owned qa-unresolved-soname-deps sandbox sfperms sign split-log strict test unknown-features-warn unmerge-logs unmerge-orphans userfetch userpriv usersandbox usersync xattr"
FFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0"
GENTOO_MIRRORS="http://mirror.leaseweb.com/gentoo/ http://ftp.snt.utwente.nl/pub/os/linux/gentoo/ http://ftp.belnet.be/pub/rsync.gentoo.org/gentoo/ http://distfiles.gentoo.org"
LANG="C.UTF8"
LDFLAGS="-Wl,-O1 -Wl,--as-needed -Wl,--defsym=__gentoo_check_ldflags__=0"
MAKEOPTS="-j17"
PKGDIR="/root/tbci/binpkg"
PORTAGE_CONFIGROOT="/"
PORTAGE_RSYNC_OPTS="--recursive --links --safe-links --perms --times --omit-dir-times --compress --force --whole-file --delete --stats --human-readable --timeout=180 --exclude=/distfiles --exclude=/local --exclude=/packages --exclude=/.git"
PORTAGE_TMPDIR="/var/tmp"
SHELL="/bin/bash"
USE="acl amd64 bzip2 cli crypt dri elogind fortran gdbm iconv ipv6 jumbo-build libglvnd libtirpc multilib native-symlinks ncurses nls nptl openmp pam pcre readline seccomp split-usr ssl test unicode xattr zlib" ABI_X86="64" ELIBC="glibc" KERNEL="linux" PYTHON_TARGETS="python3_8 python3_9 python3_10" USERLAND="GNU"
Unset: ADDR2LINE, AR, ARFLAGS, AS, ASFLAGS, CC, CCLD, CONFIG_SHELL, CPP, CPPFLAGS, CTARGET, CXX, CXXFILT, ELFEDIT, EXTRA_ECONF, F77FLAGS, FC, GCOV, GPROF, INSTALL_MASK, LC_ALL, LD, LEX, LFLAGS, LIBTOOL, LINGUAS, MAKE, MAKEFLAGS, NM, OBJCOPY, OBJDUMP, PORTAGE_BINHOST, PORTAGE_BUNZIP2_COMMAND, PORTAGE_COMPRESS, PORTAGE_COMPRESS_FLAGS, PORTAGE_RSYNC_EXTRA_OPTS, RANLIB, READELF, RUSTFLAGS, SIZE, STRINGS, STRIP, YACC, YFLAGS

##############################
# emerge history (qlop -mv): #
##############################
2022-06-02T00:51:28 >>> dev-libs/protobuf-3.19.3
2022-06-02T00:51:31 >>> app-eselect/eselect-rust-20210703
2022-06-02T00:51:36 >>> sys-apps/lsb-release-1.4-r4
2022-06-02T00:51:41 >>> dev-libs/librdkafka-1.8.2
2022-06-02T00:51:47 >>> dev-lang/rust-1.61.0-r1
2022-06-02T00:51:52 >>> dev-python/typing-extensions-4.2.0
2022-06-02T00:52:01 >>> virtual/rust-1.61.0
2022-06-02T00:52:06 >>> dev-python/six-1.16.0-r1
2022-06-02T00:52:11 >>> dev-python/pyrsistent-0.18.1-r1
2022-06-02T00:52:17 >>> dev-python/fastavro-1.4.12
2022-06-02T00:52:23 >>> dev-python/avro-1.11.0
2022-06-02T00:52:29 >>> dev-python/pytest-timeout-2.1.0
2022-06-02T00:52:35 >>> dev-python/protobuf-python-3.19.3-r3
2022-06-02T00:52:40 >>> dev-python/iniconfig-1.1.1-r1
2022-06-02T00:52:46 >>> dev-python/pluggy-1.0.0-r2
2022-06-02T00:52:51 >>> dev-python/py-1.11.0-r1
2022-06-02T00:52:57 >>> dev-python/zope-interface-5.4.0-r2
2022-06-02T00:53:04 >>> dev-python/fqdn-1.5.1-r2
2022-06-02T00:53:10 >>> dev-python/attrs-21.4.0-r1
2022-06-02T00:53:15 >>> dev-python/jsonpointer-2.3
2022-06-02T00:53:21 >>> dev-python/pytest-7.1.2
2022-06-02T00:53:27 >>> dev-python/uri_template-1.2.0
2022-06-02T00:53:33 >>> dev-python/webcolors-1.12
2022-06-02T00:53:39 >>> dev-python/regex-2022.4.24
2022-06-02T00:53:44 >>> dev-python/strict-rfc3339-0.7-r2
2022-06-02T00:53:50 >>> dev-python/rfc3987-1.3.8-r2
2022-06-02T00:53:56 >>> dev-python/rfc3339-validator-0.1.4-r1
2022-06-02T00:54:02 >>> dev-python/rfc3986-validator-0.1.1-r1
2022-06-02T00:54:07 >>> dev-python/python-dateutil-2.8.2-r1
2022-06-02T00:54:12 >>> dev-python/cryptography-37.0.2
2022-06-02T00:54:18 >>> dev-python/arrow-1.2.2
2022-06-02T00:54:25 >>> dev-python/wrapt-1.14.1
2022-06-02T00:54:30 >>> dev-python/isoduration-20.11.0-r1
2022-06-02T00:54:36 >>> dev-python/deprecated-1.2.13-r1
2022-06-02T00:54:41 >>> dev-python/jsonschema-4.5.1
2022-06-02T00:54:47 >>> dev-python/jwcrypto-1.3.1
2022-06-02T00:54:53 >>> dev-python/semantic_version-2.10.0
2022-06-02T00:54:58 >>> dev-python/ply-3.11-r2
2022-06-02T00:55:04 >>> dev-python/pyflakes-2.4.0-r1
2022-06-02T00:55:57 >>> dev-python/python_jwt-3.3.2
2022-06-02T00:56:03 >>> dev-python/pycparser-2.21-r1
2022-06-02T00:56:08 >>> dev-python/pbr-5.9.0
2022-06-02T00:56:14 >>> dev-python/cffi-1.15.0-r1
2022-06-02T00:56:20 >>> dev-python/cython-0.29.30
2022-06-02T00:56:26 >>> dev-python/setuptools-rust-1.3.0
2022-06-02T00:56:33 >>> dev-python/requests-mock-1.9.3-r1
2022-06-02T00:56:39 >>> dev-python/trivup-0.10.0

#######################################
# installed packages (qlist -ICvUSS): #
#######################################
acct-group/audio-0-r1:0
acct-group/cdrom-0-r1:0
acct-group/dialout-0-r1:0
acct-group/disk-0-r1:0
acct-group/input-0-r1:0
acct-group/kmem-0-r1:0
acct-group/kvm-0-r1:0
acct-group/lp-0-r1:0
acct-group/man-0-r1:0
acct-group/messagebus-0-r1:0
acct-group/polkitd-0-r1:0
acct-group/portage-0:0
acct-group/render-0-r1:0
acct-group/sgx-0:0
acct-group/sshd-0-r1:0
acct-group/tape-0-r1:0
acct-group/tty-0-r1:0
acct-group/video-0-r1:0
acct-user/man-1-r1:0
acct-user/messagebus-0-r1:0
acct-user/polkitd-0-r1:0
acct-user/portage-0:0
acct-user/sshd-0-r1:0
app-admin/eselect-1.4.20:0 -doc -emacs -vim-syntax
app-admin/perl-cleaner-2.30:0
app-arch/bzip2-1.0.8-r1:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 split-usr -static -static-libs -verify-sig
app-arch/gzip-1.12:0 -pic -static -verify-sig
app-arch/libarchive-3.6.1:0/13 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 acl -blake2 bzip2 e2fsprogs -expat iconv -lz4 lzma -lzo -nettle -static-libs -verify-sig xattr -zstd
app-arch/tar-1.34:0 acl -minimal nls -selinux -verify-sig xattr
app-arch/unzip-6.0_p26:0 bzip2 -natspec unicode
app-arch/xz-utils-5.2.5-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 extra-filters nls split-usr -static-libs -verify-sig
app-arch/zstd-1.5.2:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -lz4 -static-libs threads
app-crypt/gnupg-2.3.6:0 bzip2 -doc -ldap nls readline -selinux smartcard ssl -test tofu -tools -tpm -usb -user-socket -verify-sig -wks-server
app-crypt/gpgme-1.17.1-r1:1/11.6.15 -common-lisp cxx -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -qt5 -static-libs -test -verify-sig
app-crypt/libb2-0.98.1-r3:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -native-cflags openmp -static-libs
app-crypt/libmd-1.0.4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32
app-crypt/pinentry-1.2.0:0 -caps -efl -emacs -gnome-keyring -gtk ncurses -qt5
app-crypt/rhash-1.4.2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug nls ssl -static-libs
app-editors/nano-6.3:0 -debug -justify -magic -minimal ncurses nls spell split-usr -static unicode
app-eselect/eselect-fontconfig-20220403:0
app-eselect/eselect-iptables-20220320:0
app-eselect/eselect-lib-bin-symlink-0.1.1-r1:0
app-eselect/eselect-pinentry-0.7.2:0
app-eselect/eselect-rust-20210703:0
app-i18n/man-pages-ja-20180315-r1:0
app-i18n/man-pages-l10n-4.12.1-r2:0 l10n_cs l10n_da l10n_de l10n_el l10n_es l10n_fi l10n_fr l10n_hu l10n_id l10n_it l10n_mk l10n_nb l10n_nl l10n_pl l10n_pt-BR l10n_ro l10n_sr l10n_sv
app-i18n/man-pages-ru-5.03.2390.2390.20191017-r1:0
app-i18n/man-pages-zh_CN-1.6.3.6:0
app-misc/c_rehash-1.7-r1:0
app-misc/ca-certificates-20211016.3.77:0 -cacert
app-misc/editor-wrapper-4-r1:0
app-misc/mime-types-2.1.53:0 -nginx
app-misc/pax-utils-1.3.4:0 -caps -debug -python -python_single_target_python3_10 -python_single_target_python3_8 python_single_target_python3_9 seccomp
app-misc/tmux-3.2a:0 -debug -selinux -utempter -vim-syntax
app-portage/eix-0.36.2:0 -debug -doc nls -sqlite
app-portage/elt-patches-20211104:0
app-portage/gemato-16.2:0 gpg python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test -tools
app-portage/gentoolkit-0.5.1-r1:0 python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test
app-portage/portage-utils-0.93.3:0 nls openmp qmanifest qtegrity -static
app-shells/bash-5.1_p16:0 -afs -bashlogger -examples -mem-scramble net nls -plugins readline -verify-sig
app-shells/bash-completion-2.11:0 eselect -test
app-shells/gentoo-bashcomp-20190211:0
app-shells/push-3.4:0
app-shells/quoter-4.2:0
app-text/ansifilter-2.18:0 -qt5
app-text/build-docbook-catalog-2.3:0
app-text/docbook-xml-dtd-4.5-r2:4.5
app-text/docbook-xml-dtd-4.4-r3:4.4
app-text/docbook-xml-dtd-4.2-r3:4.2
app-text/docbook-xml-dtd-4.1.2-r7:4.1.2
app-text/docbook-xsl-stylesheets-1.79.1-r2:0 -ruby
app-text/manpager-1:0
app-text/opensp-1.5.2-r7:0 -doc nls -static-libs -test
app-text/po4a-0.66:0 -test -test
app-text/sgml-common-0.6.3-r7:0
app-text/xmlto-0.0.28-r8:0 -latex text
dev-db/sqlite-3.38.5:3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -doc -icu readline -secure-delete -static-libs -tcl -test -tools
dev-lang/duktape-2.7.0-r1:0/2.7.0
dev-lang/perl-5.34.1-r3:0/5.34 -berkdb -debug -doc gdbm ithreads -minimal -quadmath
dev-lang/python-3.11.0_beta2_p1:3.11 -bluetooth -build -examples gdbm -hardened -libedit -lto ncurses -pgo readline sqlite ssl -test -tk -verify-sig -wininst
dev-lang/python-3.10.4_p1:3.10 -bluetooth -build -examples gdbm -hardened -libedit -lto ncurses -pgo readline sqlite ssl -test -tk -verify-sig -wininst xml
dev-lang/python-3.9.13:3.9 -bluetooth -build -examples gdbm -hardened -lto ncurses -pgo readline sqlite ssl -test -tk -verify-sig -wininst xml
dev-lang/python-3.8.13_p2:3.8 -bluetooth -build -examples gdbm -hardened ncurses readline sqlite ssl -test -tk -verify-sig -wininst xml
dev-lang/python-2.7.18_p15:2.7 -berkdb -bluetooth -build -examples gdbm -hardened ncurses readline sqlite ssl -tk -verify-sig -wininst xml
dev-lang/python-exec-2.4.9:2 native-symlinks python_targets_pypy3 python_targets_python3_10 python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-lang/python-exec-conf-2.4.6:2 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9
dev-lang/rust-1.61.0-r1:stable/1.61 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -clippy cpu_flags_x86_sse2 -debug -dist -doc -llvm_targets_AArch64 -llvm_targets_AMDGPU -llvm_targets_ARM -llvm_targets_AVR -llvm_targets_BPF -llvm_targets_Hexagon -llvm_targets_Lanai -llvm_targets_MSP430 -llvm_targets_Mips -llvm_targets_NVPTX -llvm_targets_PowerPC -llvm_targets_RISCV -llvm_targets_Sparc -llvm_targets_SystemZ -llvm_targets_WebAssembly llvm_targets_X86 -llvm_targets_XCore -miri -nightly -parallel-compiler -profiler -rls -rust-src rustfmt -system-bootstrap -system-llvm -test -verify-sig -wasm
dev-lang/tcl-8.6.12:0/8.6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug threads
dev-libs/boehm-gc-8.0.6:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cxx large -static-libs threads
dev-libs/elfutils-0.187:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 -lzma nls -static-libs -test -threads utils -valgrind -verify-sig -zstd
dev-libs/expat-2.4.8:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -examples -static-libs unicode
dev-libs/glib-2.72.2:2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -dbus -debug elf -fam -gtk-doc mime -selinux -static-libs -sysprof -systemtap -test -utils xattr
dev-libs/gmp-6.2.1-r2:0/10.4 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm cxx -doc -pic -static-libs
dev-libs/gobject-introspection-1.72.0:0 -doctool -gtk-doc -python_single_target_python3_10 -python_single_target_python3_8 python_single_target_python3_9 -test
dev-libs/gobject-introspection-common-1.72.0:0
dev-libs/isl-0.24-r2:0/23 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs
dev-libs/jsoncpp-1.9.5:0/25 -doc -test
dev-libs/libassuan-2.5.5:0
dev-libs/libatomic_ops-7.6.12:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32
dev-libs/libbsd-0.11.6:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs -verify-sig
dev-libs/libevent-2.1.12:0/2.1-7 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 clock-gettime -debug -malloc-replacement ssl -static-libs -test threads -verbose-debug
dev-libs/libffi-3.4.2-r1:0/8 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -exec-static-trampoline -pax-kernel -static-libs -test
dev-libs/libgcrypt-1.10.1:0/20 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm -cpu_flags_arm_aes -cpu_flags_arm_neon -cpu_flags_arm_sha1 -cpu_flags_arm_sha2 -cpu_flags_ppc_altivec -cpu_flags_ppc_vsx2 -cpu_flags_ppc_vsx3 cpu_flags_x86_aes cpu_flags_x86_avx cpu_flags_x86_avx2 -cpu_flags_x86_padlock -cpu_flags_x86_sha cpu_flags_x86_sse4_1 -doc -static-libs -verify-sig
dev-libs/libgpg-error-1.45:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -common-lisp nls -static-libs -test
dev-libs/libksba-1.6.0:0 -static-libs
dev-libs/libltdl-2.4.7:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs
dev-libs/libpcre-8.45-r1:3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 cxx jit -libedit pcre16 pcre32 readline split-usr -static-libs unicode zlib
dev-libs/libpcre2-10.40:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 jit -libedit pcre16 pcre32 readline split-usr -static-libs unicode -verify-sig zlib
dev-libs/libpipeline-1.5.6:0 -test
dev-libs/librdkafka-1.8.2:0/1 -lz4 -sasl ssl -static-libs -zstd
dev-libs/libtasn1-4.18.0:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs -test -valgrind
dev-libs/libunistring-1.0:0/2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -static-libs
dev-libs/libuv-1.44.1:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32
dev-libs/libxml2-2.9.14-r1:2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -examples -icu -lzma python python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 readline -static-libs -test
dev-libs/libxslt-1.1.35:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 crypt -debug -examples -static-libs
dev-libs/lzo-2.10:2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -examples split-usr -static-libs
dev-libs/mpc-1.2.1:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs
dev-libs/mpfr-4.1.0_p13-r1:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs
dev-libs/nettle-3.7.3:0/8-6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm -cpu_flags_arm_neon cpu_flags_x86_aes -cpu_flags_x86_sha -doc gmp -static-libs -test
dev-libs/npth-1.6-r1:0 -test
dev-libs/openssl-1.1.1o:0/1.1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm cpu_flags_x86_sse2 -rfc3779 -sctp -sslv3 -static-libs -test -tls-compression -tls-heartbeat -vanilla -verify-sig -verify-sig -weak-ssl-ciphers
dev-libs/popt-1.18:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls -static-libs
dev-libs/protobuf-3.19.3:0/30 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -emacs -examples -static-libs -test zlib
dev-perl/Devel-CheckLib-1.140.0:0 -test
dev-perl/Encode-EUCJPASCII-0.30.0-r1:0 -test
dev-perl/Encode-HanExtra-0.230.0-r3:0
dev-perl/Encode-Locale-1.50.0-r1:0 -test
dev-perl/File-BaseDir-0.90.0:0 -test
dev-perl/File-DesktopEntry-0.220.0-r1:0 -test
dev-perl/File-Listing-6.140.0:0 -test
dev-perl/File-MimeInfo-0.300.0:0 -test
dev-perl/HTML-Parser-3.760.0:0 -test
dev-perl/HTML-Tagset-3.200.0-r2:0
dev-perl/HTTP-Cookies-6.100.0:0 -test
dev-perl/HTTP-Date-6.50.0:0
dev-perl/HTTP-Message-6.330.0:0 -test -test
dev-perl/HTTP-Negotiate-6.10.0-r2:0 -test
dev-perl/IO-HTML-1.4.0:0 -test
dev-perl/IO-Socket-INET6-2.720.0-r2:0 -test
dev-perl/IO-Socket-SSL-2.74.0:0 -examples -idn -test
dev-perl/IPC-System-Simple-1.300.0:0 -test
dev-perl/libwww-perl-6.600.0-r1:0 ssl -test
dev-perl/Locale-gettext-1.70.0-r1:0 -test
dev-perl/LWP-MediaTypes-6.40.0:0 -test
dev-perl/LWP-Protocol-https-6.100.0:0 -test
dev-perl/MIME-Charset-1.12.2-r1:0 l10n_ja l10n_zh -test
dev-perl/Module-Build-0.423.100:0 -test
dev-perl/Mozilla-CA-20999999-r1:0 -test
dev-perl/Net-HTTP-6.210.0:0 -minimal -test
dev-perl/Net-SSLeay-1.920.0:0 -examples -examples -minimal -test
dev-perl/Pod-Parser-1.630.0-r1:0 -test
dev-perl/SGMLSpm-1.1-r2:0 -test
dev-perl/Socket6-0.290.0:0 -test
dev-perl/TermReadKey-2.380.0:0 -examples -test
dev-perl/Text-CharWidth-0.40.0-r2:0 -test
dev-perl/Text-WrapI18N-0.60.0-r2:0 -test
dev-perl/TimeDate-2.330.0-r1:0 -test
dev-perl/Try-Tiny-0.310.0:0 -minimal -test
dev-perl/Unicode-LineBreak-2019.1.0:0
dev-perl/URI-5.100.0:0 -test
dev-perl/WWW-RobotRules-6.20.0-r2:0 -test
dev-perl/XML-Parser-2.460.0-r2:0
dev-perl/YAML-Tiny-1.730.0-r1:0 -minimal -test
dev-python/appdirs-1.4.4-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9
dev-python/arrow-1.2.2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/attrs-21.4.0-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/avro-1.11.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/certifi-3021.3.16-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/cffi-1.15.0-r1:0/1.15.0 -doc python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/charset_normalizer-2.0.12:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/cryptography-37.0.2:0 -debug python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/cython-0.29.30:0 -doc -emacs python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/deprecated-1.2.13-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/fastavro-1.4.12:0 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test
dev-python/flit_core-3.7.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/fqdn-1.5.1-r2:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/gpep517-6:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/idna-3.3-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/importlib_metadata-4.11.4:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/importlib_resources-5.7.1:0 -doc python_targets_pypy3 python_targets_python3_8 -test
dev-python/iniconfig-1.1.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/installer-0.5.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/isoduration-20.11.0-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/jaraco-context-4.1.1-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/jaraco-functools-3.5.0-r2:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/jaraco-text-3.7.0-r2:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/jinja-3.1.2:0 -doc -examples python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/jsonpointer-2.3:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9
dev-python/jsonschema-4.5.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/jwcrypto-1.3.1:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/markupsafe-2.1.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/more-itertools-8.13.0:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/nspektr-0.3.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/ordered-set-4.1.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/packaging-21.3-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/pbr-5.9.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/pluggy-1.0.0-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/ply-3.11-r2:0/3.11 -examples python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9
dev-python/protobuf-python-3.19.3-r3:0/30 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test
dev-python/py-1.11.0-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/pycparser-2.21-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/pyflakes-2.4.0-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/pyparsing-3.0.9:0 -examples python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/pypy3-7.3.9_p1:0/pypy39-pp73 bzip2 gdbm jit ncurses -sqlite -test -tk
dev-python/pypy3-exe-7.3.9:3.9-7.3.9 bzip2 -cpu_flags_x86_sse2 jit -low-memory ncurses
dev-python/pyrsistent-0.18.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/PySocks-1.7.1-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9
dev-python/pytest-7.1.2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/pytest-timeout-2.1.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/python-dateutil-2.8.2-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/python_jwt-3.3.2:0 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test
dev-python/regex-2022.4.24:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/requests-2.27.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -socks5 -test
dev-python/requests-mock-1.9.3-r1:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/rfc3339-validator-0.1.4-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/rfc3986-validator-0.1.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/rfc3987-1.3.8-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9
dev-python/semantic_version-2.10.0:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/setuptools-62.3.2-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/setuptools-rust-1.3.0:0 -debug python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/setuptools_scm-6.4.2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/six-1.16.0-r1:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/strict-rfc3339-0.7-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/tomli-2.0.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/trivup-0.10.0:0 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test
dev-python/typing-extensions-4.2.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9
dev-python/uri_template-1.2.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9
dev-python/urllib3-1.26.9-r1:0 -brotli python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/webcolors-1.12:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/wheel-0.37.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/wrapt-1.14.1:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/zipp-3.8.0:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-python/zope-interface-5.4.0-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test
dev-util/checkbashisms-2.22.1:0
dev-util/cmake-3.23.2:0 -doc -emacs ncurses -qt5 -test -test -verify-sig
dev-util/desktop-file-utils-0.26-r2:0 -emacs
dev-util/glib-utils-2.72.2:0 -python_single_target_python3_10 -python_single_target_python3_11 -python_single_target_python3_8 python_single_target_python3_9
dev-util/gperf-3.1:0
dev-util/gtk-doc-am-1.33.2:0
dev-util/intltool-0.51.0-r2:0
dev-util/meson-0.62.1:0 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test
dev-util/meson-format-array-0:0 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9
dev-util/ninja-1.11.0:0 -doc -emacs -test -vim-syntax
dev-util/pkgconf-1.8.0-r1:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -test
dev-util/re2c-2.2:0 -debug -test
dev-vcs/git-2.35.1:0 blksha1 -cgi curl -cvs -doc -emacs -gnome-keyring gpg -highlight iconv -mediawiki -mediawiki-experimental nls pcre -perforce -perl -ppcsha1 -python_single_target_python3_10 -python_single_target_python3_8 python_single_target_python3_9 -subversion -test threads -tk webdav -xinetd
media-fonts/liberation-fonts-2.1.3:0 -X -X -fontforge
media-gfx/graphite2-1.3.14_p20210810-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -perl -test
media-libs/fontconfig-2.14.0-r1:1.0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -static-libs -test
media-libs/freetype-2.12.1:2 -X -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 adobe-cff -brotli bzip2 cleartype-hinting -debug -doc -fontforge harfbuzz -infinality png -static-libs -svg -utils
media-libs/harfbuzz-4.3.0:0/4.0.0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 cairo -debug -doc -experimental glib graphite -icu introspection -test truetype
media-libs/libpng-1.6.37-r2:0/16 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -apng -cpu_flags_arm_neon cpu_flags_x86_sse -static-libs
net-dns/libidn2-2.3.2:0/2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs -verify-sig
net-firewall/iptables-1.8.8-r2:0/1.8.3 -conntrack -netlink -nftables -pcap split-usr -static-libs
net-libs/gnutls-3.7.6:0/30.30 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -brotli cxx -dane -doc -examples -guile idn nls openssl -pkcs11 seccomp -sslv2 -sslv3 -static-libs -test -test-full tls-heartbeat -tools -valgrind -verify-sig zlib -zstd
net-libs/libmnl-1.0.5:0/0.2.0 -examples -verify-sig
net-libs/libnsl-2.0.0-r1:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs
net-libs/libtirpc-1.3.2:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 ipv6 -kerberos split-usr -static-libs
net-libs/nghttp2-1.47.0:0/1.14 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cxx -debug -hpack-tools -jemalloc -static-libs -test threads -utils -xml
net-misc/curl-7.83.1:0
-abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -adns -alt-svc -brotli -curl_ssl_gnutls -curl_ssl_mbedtls -curl_ssl_nss curl_ssl_openssl ftp -gnutls -gopher -hsts http2 -idn imap ipv6 -kerberos -ldap -mbedtls -nghttp3 -nss openssl pop3 progress-meter -quiche -rtmp -samba smtp -ssh ssl -sslv3 -static-libs -telnet -test tftp -threads -verify-sig -zstd >net-misc/dhcpcd-9.4.1:0 -debug embedded ipv6 -privsep udev >net-misc/iputils-20211215:0 arping -caps -clockdiff -doc filecaps -idn nls -rarpd -rdisc -static -test -tracepath >net-misc/netifrc-0.7.3-r1:0 dhcp >net-misc/openssh-9.0_p1-r1:0 -X -X509 -abi_mips_n32 -audit -debug -hpn -kerberos -ldns -libedit -livecd pam pie -sctp -security-key -selinux ssl -static -test -verify-sig -xmss >net-misc/rsync-3.2.4-r1:0 acl -examples iconv ipv6 -lz4 -python_single_target_python3_10 -python_single_target_python3_8 python_single_target_python3_9 ssl -stunnel -system-zlib -verify-sig xattr -xxhash -zstd >net-misc/wget-1.21.3:0 -cookie-check -debug -gnutls -idn ipv6 -metalink nls -ntlm pcre ssl -static -test -uuid -verify-sig zlib >perl-core/CPAN-2.290.0-r1:0 >perl-core/Encode-3.120.0:0 >perl-core/File-Temp-0.231.100:0 >perl-core/Scalar-List-Utils-1.560.0:0 >sec-keys/openpgp-keys-gentoo-release-20220101:0 -test >sys-apps/acl-2.3.1-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls split-usr -static-libs >sys-apps/attr-2.5.1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug nls split-usr -static-libs >sys-apps/baselayout-2.8:0 -build split-usr >sys-apps/coreutils-9.1-r1:0 acl -caps -gmp -hostname -kill -multicall nls -selinux split-usr -static -test -vanilla -verify-sig xattr >sys-apps/dbus-1.14.0-r1:0 -X -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -doc elogind -selinux -static-libs -systemd 
-test -test >sys-apps/debianutils-5.7:0 installkernel -static >sys-apps/diffutils-3.8:0 nls -static -verify-sig >sys-apps/file-5.41-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 -lzma -python python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -seccomp -static-libs zlib >sys-apps/findutils-4.9.0:0 nls -selinux -static -test -verify-sig >sys-apps/gawk-5.1.1-r2:0 -mpfr nls readline -verify-sig >sys-apps/gentoo-functions-0.15:0 >sys-apps/grep-3.7:0 nls pcre -static -verify-sig >sys-apps/groff-1.22.4:0 -X -examples -uchardet >sys-apps/help2man-1.48.5:0 nls >sys-apps/install-xattr-0.8:0 >sys-apps/iproute2-5.18.0-r1:0 -atm -berkdb -bpf -caps -elf iptables -libbsd -minimal -nfs -selinux split-usr >sys-apps/kbd-2.4.0:0 nls pam -test >sys-apps/kmod-29:0 -debug -doc lzma -pkcs7 -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -static-libs tools zlib -zstd >sys-apps/less-590:0 pcre unicode >sys-apps/lsb-release-1.4-r4:0 >sys-apps/man-db-2.10.2-r1:0 manpager nls seccomp -selinux -static-libs zlib >sys-apps/man-pages-5.13:0 l10n_de l10n_es l10n_fr l10n_it l10n_ja l10n_nl l10n_pl l10n_pt-BR l10n_ro l10n_ru l10n_zh-CN >sys-apps/man-pages-posix-2017a:0 >sys-apps/miscfiles-1.5-r4:0 -minimal >sys-apps/net-tools-2.10:0 arp hostname ipv6 -nis nls -plipconfig -selinux -slattach -static >sys-apps/openrc-0.44.10:0 -audit -bash -debug ncurses netifrc -newnet pam -selinux -sysv-utils unicode >sys-apps/portage-3.0.30-r5:0 -apidoc -build -doc -gentoo-dev ipc native-extensions python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 rsync-verify -selinux -test xattr >sys-apps/sandbox-2.29:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 abi_x86_32 abi_x86_64 -abi_x86_x32 nnp >sys-apps/sed-4.8:0 acl nls -selinux -static -verify-sig >sys-apps/shadow-4.11.1:0/4 acl -audit -bcrypt 
-cracklib nls pam -selinux -skey split-usr -su xattr >sys-apps/systemd-utils-250.6:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 acl -boot kmod -selinux split-usr -sysusers -test tmpfiles udev >sys-apps/sysvinit-3.04:0 -ibm nls -selinux -static -verify-sig >sys-apps/texinfo-6.8:0 nls standalone -static >sys-apps/util-linux-2.38:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -audit -build -caps cramfs -cryptsetup -fdformat hardlink -kill logger -magic ncurses nls pam -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 readline -rtas -selinux -slang split-usr -static-libs su suid -systemd -test -tty-helpers -udev unicode -verify-sig >sys-apps/which-2.21:0 >sys-auth/elogind-246.10-r2:0 acl -audit cgroup-hybrid -debug -doc pam policykit -selinux -test >sys-auth/pambase-20220214:0 -caps -debug elogind -gnome-keyring -homed -minimal -mktemp nullok -pam_krb5 -pam_ssh passwdqc -pwhistory -pwquality -securetty -selinux sha512 -systemd -yescrypt >sys-auth/passwdqc-2.0.2-r1:0 >sys-auth/polkit-0.120_p20220509:0 duktape -examples -gtk introspection -kde pam -selinux -systemd -test >sys-devel/autoconf-2.71-r1:2.71 -emacs >sys-devel/autoconf-archive-2022.02.11:0 >sys-devel/autoconf-wrapper-20220130:0 >sys-devel/automake-1.16.5:1.16 -test >sys-devel/automake-wrapper-11:0 >sys-devel/binutils-2.38-r2:2.38 -cet -default-gold -doc gold -multitarget nls -pgo plugins -static-libs -test -vanilla >sys-devel/binutils-config-5.4.1:0 native-symlinks >sys-devel/bison-3.8.2:0 -examples nls -static -test -verify-sig >sys-devel/flex-2.6.4-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls -static -test >sys-devel/gcc-11.3.0:11 -ada -cet -custom-cflags cxx -d -debug -doc -fixed-point fortran -go graphite -hardened -jit -libssp lto multilib nls nptl -objc -objc++ -objc-gc openmp -pch -pgo 
pie sanitize ssp -systemtap -test -valgrind -vanilla -vtv -zstd >sys-devel/gcc-config-2.5-r1:0 cc-wrappers native-symlinks >sys-devel/gettext-0.21-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 acl -cvs cxx -doc -emacs -git -java -java ncurses nls openmp -static-libs -verify-sig >sys-devel/gnuconfig-20220508:0 >sys-devel/libtool-2.4.7:2 -vanilla >sys-devel/m4-1.4.19:0 -examples nls -verify-sig >sys-devel/make-4.3:0 -guile nls -static -verify-sig >sys-devel/patch-2.7.6-r4:0 -static -test -verify-sig xattr >sys-fs/e2fsprogs-1.46.5-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cron -fuse -lto nls split-usr -static-libs -test threads tools >sys-fs/udev-init-scripts-35:0 >sys-kernel/installkernel-gentoo-5:0 -grub >sys-kernel/linux-headers-5.18:0 -experimental-loong -headers-only >sys-libs/binutils-libs-2.38-r2:0/2.38 -64-bit-bfd -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cet -multitarget nls -static-libs >sys-libs/gdbm-1.23:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 berkdb nls readline -static-libs -verify-sig >sys-libs/glibc-2.35-r5:2.2 -audit -caps -cet clone3 -compile-locales -crypt -custom-cflags -doc -experimental-loong -gd -headers-only multiarch multilib -multilib-bootstrap -nscd -profile -selinux ssp static-libs -suid -systemd -systemtap -test -vanilla >sys-libs/libcap-2.64:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 pam split-usr -static-libs -tools >sys-libs/libseccomp-2.5.4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -static-libs -test >sys-libs/libxcrypt-4.4.28-r1:0/1 -abi_mips_n32 -abi_mips_n64 
-abi_mips_o32 -abi_s390_32 -abi_s390_64 abi_x86_32 abi_x86_64 -abi_x86_x32 compat split-usr -static-libs system -test >sys-libs/ncurses-6.3_p20220423:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -ada cxx -debug -doc -gpm -minimal -profile split-usr -static-libs -test tinfo -trace -verify-sig >sys-libs/pam-1.5.2-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -audit -berkdb -debug filecaps -nis -selinux >sys-libs/readline-8.1_p2:0/8 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 split-usr -static-libs unicode -utils -verify-sig >sys-libs/timezone-data-2022a:0 -leaps-timezone nls -zic-slim >sys-libs/zlib-1.2.12-r2:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 minizip split-usr -static-libs -verify-sig >sys-process/procps-3.3.17-r1:0/8 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 elogind kill -modern-top ncurses nls -selinux split-usr -static-libs -systemd -test unicode >sys-process/psmisc-23.4-r1:0 -X ipv6 nls -selinux >virtual/acl-0-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs >virtual/awk-1:0 >virtual/dev-manager-0-r2:0 >virtual/editor-0-r3:0 >virtual/libc-1-r1:0 >virtual/libcrypt-2:0/2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs >virtual/libelf-3:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 >virtual/libiconv-0-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 >virtual/libintl-0-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 
>virtual/libudev-232-r7:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -systemd >virtual/man-0-r4:0 >virtual/os-headers-0-r2:0 >virtual/package-manager-1:0 >virtual/pager-0:0 >virtual/perl-Carp-1.520.0-r2:0 >virtual/perl-Compress-Raw-Bzip2-2.103.0-r2:0 >virtual/perl-Compress-Raw-Zlib-2.103.0-r1:0 >virtual/perl-CPAN-2.290.0:0 >virtual/perl-CPAN-Meta-2.150.10-r6:0 >virtual/perl-CPAN-Meta-Requirements-2.140.0-r8:0 >virtual/perl-CPAN-Meta-YAML-0.18.0-r8:0 >virtual/perl-Data-Dumper-2.179.0:0 >virtual/perl-Digest-MD5-2.580.0-r1:0 >virtual/perl-Encode-3.120.0:0 >virtual/perl-Exporter-5.760.0:0 >virtual/perl-ExtUtils-CBuilder-0.280.236-r1:0 >virtual/perl-ExtUtils-Install-2.200.0-r1:0 >virtual/perl-ExtUtils-MakeMaker-7.620.0:0 >virtual/perl-ExtUtils-Manifest-1.730.0-r1:0 >virtual/perl-ExtUtils-ParseXS-3.430.0:0 >virtual/perl-File-Path-2.180.0-r1:0 >virtual/perl-File-Spec-3.800.0:0 >virtual/perl-File-Temp-0.231.100:0 >virtual/perl-Getopt-Long-2.520.0-r1:0 >virtual/perl-IO-1.460.0:0 >virtual/perl-IO-Compress-2.103.0-r1:0 >virtual/perl-IO-Socket-IP-0.410.0-r1:0 >virtual/perl-JSON-PP-4.60.0:0 >virtual/perl-libnet-3.130.0:0 ssl >virtual/perl-MIME-Base64-3.160.0-r1:0 >virtual/perl-Module-Metadata-1.0.37-r2:0 >virtual/perl-parent-0.238.0-r2:0 >virtual/perl-Parse-CPAN-Meta-2.150.10-r6:0 >virtual/perl-Perl-OSType-1.10.0-r6:0 >virtual/perl-podlators-4.140.0-r3:0 >virtual/perl-Scalar-List-Utils-1.560.0:0 >virtual/perl-Test-Harness-3.430.0:0 >virtual/perl-Text-ParseWords-3.300.0-r8:0 >virtual/perl-Time-Local-1.300.0-r1:0 >virtual/perl-version-0.992.800:0 >virtual/perl-XSLoader-0.300.0-r4:0 >virtual/pkgconfig-2-r1:0 >virtual/rust-1.61.0:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -rustfmt >virtual/service-manager-1:0 >virtual/ssh-0:0 -minimal >virtual/tmpfiles-0-r3:0 >virtual/ttf-fonts-1-r1:0 >virtual/udev-217-r5:0 >virtual/w3m-1:0 >virtual/yacc-0:0 
>www-client/pybugz-0.13-r2:0 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 >www-client/w3m-0.5.3_p20220429:0 -X -fbcon -gdk-pixbuf -gpm -imlib l10n_ja -lynxkeymap nls -nntp ssl unicode -xface >x11-apps/xprop-1.2.5:0 >x11-apps/xset-1.2.4-r1:0 >x11-base/xcb-proto-1.15:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 >x11-base/xorg-proto-2022.1:0 -test >x11-libs/cairo-1.16.0-r5:0 X -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -aqua -debug -gles2-only glib -opengl -static-libs svg -utils -valgrind >x11-libs/libICE-1.0.10-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 ipv6 >x11-libs/libSM-1.2.3-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc ipv6 uuid >x11-libs/libX11-1.7.5:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc ipv6 -test >x11-libs/libXau-1.0.9-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc >x11-libs/libxcb-1.15:0/1.12 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -doc -selinux -test xkb >x11-libs/libXdmcp-1.1.3-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc >x11-libs/libXext-1.3.4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc >x11-libs/libXmu-1.1.3:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc ipv6 >x11-libs/libXrender-0.9.10-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 
>x11-libs/libXt-1.2.1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -test >x11-libs/pixman-0.40.0:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cpu_flags_arm_iwmmxt -cpu_flags_arm_iwmmxt2 -cpu_flags_arm_neon -cpu_flags_ppc_altivec cpu_flags_x86_mmxext cpu_flags_x86_sse2 cpu_flags_x86_ssse3 -loongson2f -static-libs -test >x11-libs/xtrans-1.4.0:0 -doc >x11-misc/compose-tables-1.8:0 >x11-misc/shared-mime-info-2.2:0 -test >x11-misc/xdg-utils-1.1.3_p20210805:0 -dbus -doc -gnome > > >####################### ># build.log # >####################### >>>> Unpacking source... >>>> Unpacking confluent-kafka-1.8.2.gh.tar.gz to /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work >>>> Source unpacked in /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work >>>> Preparing source in /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2 ... > * Build system packages: > * dev-python/setuptools : 62.3.2-r1 >>>> Source prepared. >>>> Configuring source in /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2 ... >>>> Source configured. >>>> Compiling source in /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2 ... 
> * python3_8: running distutils-r1_run_phase distutils-r1_python_compile >python3.8 setup.py build -j 17 >running build >running build_py >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka >copying src/confluent_kafka/serializing_producer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka >copying src/confluent_kafka/error.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka >copying src/confluent_kafka/deserializing_consumer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka >copying src/confluent_kafka/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/serialization >copying src/confluent_kafka/serialization/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/serialization >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/schema_registry_client.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/protobuf.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/json_schema.py -> 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/error.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/avro.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/schema_registry >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/kafkatest >copying src/confluent_kafka/kafkatest/verifiable_producer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/kafkatest >copying src/confluent_kafka/kafkatest/verifiable_consumer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/kafkatest >copying src/confluent_kafka/kafkatest/verifiable_client.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/kafkatest >copying src/confluent_kafka/kafkatest/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/kafkatest >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/avro >copying src/confluent_kafka/avro/load.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/avro >copying src/confluent_kafka/avro/error.py -> 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/avro >copying src/confluent_kafka/avro/cached_schema_registry_client.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/avro >copying src/confluent_kafka/avro/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/avro >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/admin >copying src/confluent_kafka/admin/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/admin >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/avro/serializer >copying src/confluent_kafka/avro/serializer/message_serializer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/avro/serializer >copying src/confluent_kafka/avro/serializer/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/avro/serializer >warning: build_py: byte-compiling is disabled, skipping. 
> >running build_ext >building 'confluent_kafka.cimpl' extension >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8 >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1 >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2 >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka >creating 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.8 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.8 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Producer.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Producer.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.8 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Consumer.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Consumer.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.8 -c 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Metadata.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Metadata.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.8 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/AdminTypes.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/AdminTypes.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.8 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Admin.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Admin.o >x86_64-pc-linux-gnu-gcc -shared -Wl,-O1 -Wl,--as-needed -Wl,--defsym=__gentoo_check_ldflags__=0 -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.o 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Producer.o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Consumer.o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Metadata.o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/AdminTypes.o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/temp.linux-x86_64-3.8/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Admin.o -L/usr/lib64 -lrdkafka -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_8/lib/confluent_kafka/cimpl.cpython-38-x86_64-linux-gnu.so > * python3_9: running distutils-r1_run_phase distutils-r1_python_compile >python3.9 setup.py build -j 17 >running build >running build_py >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka >copying src/confluent_kafka/serializing_producer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka >copying src/confluent_kafka/error.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka >copying 
src/confluent_kafka/deserializing_consumer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka >copying src/confluent_kafka/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/serialization >copying src/confluent_kafka/serialization/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/serialization >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/schema_registry_client.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/protobuf.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/json_schema.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/error.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/avro.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/schema_registry >creating 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/kafkatest >copying src/confluent_kafka/kafkatest/verifiable_producer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/kafkatest >copying src/confluent_kafka/kafkatest/verifiable_consumer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/kafkatest >copying src/confluent_kafka/kafkatest/verifiable_client.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/kafkatest >copying src/confluent_kafka/kafkatest/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/kafkatest >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/avro >copying src/confluent_kafka/avro/load.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/avro >copying src/confluent_kafka/avro/error.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/avro >copying src/confluent_kafka/avro/cached_schema_registry_client.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/avro >copying src/confluent_kafka/avro/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/avro >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/admin >copying src/confluent_kafka/admin/__init__.py -> 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/admin >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/avro/serializer >copying src/confluent_kafka/avro/serializer/message_serializer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/avro/serializer >copying src/confluent_kafka/avro/serializer/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/avro/serializer >warning: build_py: byte-compiling is disabled, skipping. > >running build_ext >building 'confluent_kafka.cimpl' extension >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9 >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1 >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work >creating 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2 >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.9 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Admin.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Admin.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.9 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/AdminTypes.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/AdminTypes.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches 
-fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.9 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Consumer.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Consumer.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.9 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Metadata.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Metadata.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.9 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Producer.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Producer.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.9 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.o 
>/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.c: In function ‘_init_cimpl’: >/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.c:2545:9: warning: ‘PyEval_InitThreads’ is deprecated [-Wdeprecated-declarations] > 2545 | PyEval_InitThreads(); > | ^~~~~~~~~~~~~~~~~~ >In file included from /usr/include/python3.9/Python.h:140, > from /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.h:18, > from /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.c:17: >/usr/include/python3.9/ceval.h:130:37: note: declared here > 130 | Py_DEPRECATED(3.9) PyAPI_FUNC(void) PyEval_InitThreads(void); > | ^~~~~~~~~~~~~~~~~~ >x86_64-pc-linux-gnu-gcc -shared -Wl,-O1 -Wl,--as-needed -Wl,--defsym=__gentoo_check_ldflags__=0 -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Admin.o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/AdminTypes.o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Consumer.o 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Metadata.o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Producer.o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/temp.linux-x86_64-3.9/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.o -L/usr/lib64 -lrdkafka -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_9/lib/confluent_kafka/cimpl.cpython-39-x86_64-linux-gnu.so > * python3_10: running distutils-r1_run_phase distutils-r1_python_compile >python3.10 setup.py build -j 17 >running build >running build_py >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka >copying src/confluent_kafka/serializing_producer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka >copying src/confluent_kafka/error.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka >copying src/confluent_kafka/deserializing_consumer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka >copying src/confluent_kafka/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/serialization >copying 
src/confluent_kafka/serialization/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/serialization >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/schema_registry_client.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/protobuf.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/json_schema.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/error.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/avro.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/schema_registry >copying src/confluent_kafka/schema_registry/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/schema_registry >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/kafkatest >copying src/confluent_kafka/kafkatest/verifiable_producer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/kafkatest >copying src/confluent_kafka/kafkatest/verifiable_consumer.py -> 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/kafkatest >copying src/confluent_kafka/kafkatest/verifiable_client.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/kafkatest >copying src/confluent_kafka/kafkatest/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/kafkatest >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/avro >copying src/confluent_kafka/avro/load.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/avro >copying src/confluent_kafka/avro/error.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/avro >copying src/confluent_kafka/avro/cached_schema_registry_client.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/avro >copying src/confluent_kafka/avro/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/avro >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/admin >copying src/confluent_kafka/admin/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/admin >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/avro/serializer >copying src/confluent_kafka/avro/serializer/message_serializer.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/avro/serializer 
>copying src/confluent_kafka/avro/serializer/__init__.py -> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/avro/serializer >warning: build_py: byte-compiling is disabled, skipping. > >running build_ext >building 'confluent_kafka.cimpl' extension >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10 >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1 >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2 >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src >creating 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka >creating /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.10 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Admin.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Admin.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.10 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/AdminTypes.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/AdminTypes.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.10 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Consumer.c -o 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Consumer.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.10 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Metadata.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Metadata.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.10 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Producer.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Producer.o >x86_64-pc-linux-gnu-gcc -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 -fPIC -I/usr/include/python3.10 -c /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.c -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.o >/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.c: In function ‘_init_cimpl’: 
>/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.c:2545:9: warning: ‘PyEval_InitThreads’ is deprecated [-Wdeprecated-declarations] > 2545 | PyEval_InitThreads(); > | ^~~~~~~~~~~~~~~~~~ >In file included from /usr/include/python3.10/Python.h:130, > from /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.h:18, > from /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.c:17: >/usr/include/python3.10/ceval.h:122:37: note: declared here > 122 | Py_DEPRECATED(3.9) PyAPI_FUNC(void) PyEval_InitThreads(void); > | ^~~~~~~~~~~~~~~~~~ >x86_64-pc-linux-gnu-gcc -shared -Wl,-O1 -Wl,--as-needed -Wl,--defsym=__gentoo_check_ldflags__=0 -O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0 /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Admin.o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/AdminTypes.o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Consumer.o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Metadata.o 
/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/Producer.o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/temp.linux-x86_64-3.10/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/src/confluent_kafka/src/confluent_kafka.o -L/usr/lib64 -lrdkafka -o /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2-python3_10/lib/confluent_kafka/cimpl.cpython-310-x86_64-linux-gnu.so >>>> Source compiled. >>>> Test phase: dev-python/confluent-kafka-1.8.2-r1 > * python3_8: running distutils-r1_run_phase python_test >python3.8 -m pytest -vv -ra -l -Wdefault --color=no -o console_output_style=count -p no:cov -p no:flake8 -p no:flakes -p no:pylint -p no:markdown >============================= test session starts ============================== >platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /usr/bin/python3.8 >cachedir: .pytest_cache >rootdir: /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2, configfile: tox.ini, testpaths: tests >plugins: timeout-2.1.0, requests-mock-1.9.3 >collecting ... 
collected 252 items > >tests/test_Admin.py::test_types PASSED [ 1/252] >tests/test_Admin.py::test_basic_api PASSED [ 2/252] >tests/test_Admin.py::test_create_topics_api PASSED [ 3/252] >tests/test_Admin.py::test_delete_topics_api PASSED [ 4/252] >tests/test_Admin.py::test_create_partitions_api PASSED [ 5/252] >tests/test_Admin.py::test_describe_configs_api PASSED [ 6/252] >tests/test_Admin.py::test_alter_configs_api PASSED [ 7/252] >tests/test_Consumer.py::test_basic_api PASSED [ 8/252] >tests/test_Consumer.py::test_store_offsets PASSED [ 9/252] >tests/test_Consumer.py::test_on_commit PASSED [ 10/252] >tests/test_Consumer.py::test_subclassing PASSED [ 11/252] >tests/test_Consumer.py::test_offsets_for_times PASSED [ 12/252] >tests/test_Consumer.py::test_multiple_close_does_not_throw_exception PASSED [ 13/252] >tests/test_Consumer.py::test_any_method_after_close_throws_exception PASSED [ 14/252] >tests/test_Consumer.py::test_calling_store_offsets_after_close_throws_erro PASSED [ 15/252] >tests/test_Consumer.py::test_consumer_without_groupid PASSED [ 16/252] >tests/test_KafkaError.py::test_error_cb PASSED [ 17/252] >tests/test_KafkaError.py::test_fatal PASSED [ 18/252] >tests/test_KafkaError.py::test_retriable PASSED [ 19/252] >tests/test_KafkaError.py::test_abortable PASSED [ 20/252] >tests/test_KafkaError.py::test_subclassing PASSED [ 21/252] >tests/test_KafkaError.py::test_kafkaError_custom_msg PASSED [ 22/252] >tests/test_KafkaError.py::test_kafkaError_unknonw_error PASSED [ 23/252] >tests/test_KafkaError.py::test_kafkaException_unknown_KafkaError_with_subclass PASSED [ 24/252] >tests/test_Producer.py::test_basic_api PASSED [ 25/252] >tests/test_Producer.py::test_produce_timestamp PASSED [ 26/252] >tests/test_Producer.py::test_produce_headers PASSED [ 27/252] >tests/test_Producer.py::test_produce_headers_should_fail SKIPPED (Old versions should fail when using headers) [ 28/252] >tests/test_Producer.py::test_subclassing PASSED [ 29/252] 
>tests/test_Producer.py::test_dr_msg_errstr PASSED [ 30/252] >tests/test_Producer.py::test_set_partitioner_murmur2 PASSED [ 31/252] >tests/test_Producer.py::test_set_partitioner_murmur2_random PASSED [ 32/252] >tests/test_Producer.py::test_set_invalid_partitioner_murmur PASSED [ 33/252] >tests/test_Producer.py::test_transaction_api PASSED [ 34/252] >tests/test_Producer.py::test_purge PASSED [ 35/252] >tests/test_SerializerError.py::SerializerErrorTest::test_message PASSED [ 36/252] >tests/test_TopicPartition.py::test_sort PASSED [ 37/252] >tests/test_TopicPartition.py::test_cmp PASSED [ 38/252] >tests/test_TopicPartition.py::test_hash PASSED [ 39/252] >tests/test_TopicPartition.py::test_subclassing PASSED [ 40/252] >tests/test_docs.py::test_verify_docs PASSED [ 41/252] >tests/test_enums.py::test_enums PASSED [ 42/252] >tests/test_enums.py::test_tstype_enums PASSED [ 43/252] >tests/test_enums.py::test_offset_consts PASSED [ 44/252] >tests/test_error.py::test_new_consume_error_constant PASSED [ 45/252] >tests/test_error.py::test_new_consume_error_caused_by PASSED [ 46/252] >tests/test_error.py::test_new_consume_error_custom_message PASSED [ 47/252] >tests/test_error.py::test_new_produce_error_constant PASSED [ 48/252] >tests/test_error.py::test_new_produce_error_caused_by PASSED [ 49/252] >tests/test_error.py::test_new_produce_error_custom_message PASSED [ 50/252] >tests/test_log.py::test_logging_consumer PASSED [ 51/252] >tests/test_log.py::test_logging_avro_consumer PASSED [ 52/252] >tests/test_log.py::test_logging_producer PASSED [ 53/252] >tests/test_log.py::test_logging_avro_producer PASSED [ 54/252] >tests/test_log.py::test_logging_constructor PASSED [ 55/252] >tests/test_misc.py::test_version PASSED [ 56/252] >tests/test_misc.py::test_error_cb PASSED [ 57/252] >tests/test_misc.py::test_stats_cb PASSED [ 58/252] >tests/test_misc.py::test_conf_none PASSED [ 59/252] >tests/test_misc.py::test_throttle_event_types PASSED [ 60/252] >tests/test_misc.py::test_oauth_cb 
PASSED [ 61/252] >tests/test_misc.py::test_unordered_dict[Consumer] SKIPPED (requires confluent-librdkafka-plugins be installed and copied to the current directory) [ 62/252] >tests/test_misc.py::test_unordered_dict[Producer] SKIPPED (requires confluent-librdkafka-plugins be installed and copied to the current directory) [ 63/252] >tests/test_misc.py::test_unordered_dict[AdminClient] SKIPPED (requires confluent-librdkafka-plugins be installed and copied to the current directory) [ 64/252] >tests/test_misc.py::test_topic_config_update PASSED [ 65/252] >tests/test_threads.py::test_thread_safety PASSED [ 66/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_instantiation PASSED [ 67/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_produce_no_key PASSED [ 68/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_produce_no_key_schema PASSED [ 69/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_produce_no_value PASSED [ 70/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_produce_no_value_schema PASSED [ 71/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_produce_primitive_key_and_value PASSED [ 72/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_produce_primitive_string_key PASSED [ 73/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_produce_value_and_key_schemas PASSED [ 74/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_produce_with_custom_registry PASSED [ 75/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_produce_with_custom_registry_and_registry_url PASSED [ 76/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_produce_with_empty_key_no_schema PASSED [ 77/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_produce_with_empty_key_value_with_schema PASSED [ 78/252] >tests/avro/test_avro_producer.py::TestAvroProducer::test_produce_with_empty_value_no_schema PASSED [ 79/252] 
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_basic_auth_invalid PASSED [ 80/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_basic_auth_sasl_inherit PASSED [ 81/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_basic_auth_sasl_inherit_invalid PASSED [ 82/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_basic_auth_url PASSED [ 83/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_basic_auth_userinfo PASSED [ 84/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_cert_no_key PASSED [ 85/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_cert_path PASSED [ 86/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_cert_with_key PASSED [ 87/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_check_registration PASSED [ 88/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_context PASSED [ 89/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_dupe_register PASSED [ 90/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_empty_url PASSED [ 91/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_getters PASSED [ 92/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_init_with_dict PASSED [ 93/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_invalid_conf PASSED [ 94/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_invalid_type_url PASSED [ 95/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_invalid_type_url_dict PASSED [ 96/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_invalid_url PASSED [ 97/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_multi_register PASSED [ 98/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_multi_subject_register PASSED [ 99/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_register PASSED [100/252]
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_trailing_slash_removal PASSED [101/252]
>tests/avro/test_message_serializer.py::TestMessageSerializer::test_decode_none PASSED [102/252]
>tests/avro/test_message_serializer.py::TestMessageSerializer::test_encode_record_with_schema PASSED [103/252]
>tests/avro/test_message_serializer.py::TestMessageSerializer::test_encode_record_with_schema_sets_writers_cache_once PASSED [104/252]
>tests/avro/test_message_serializer.py::TestMessageSerializer::test_encode_with_schema_id PASSED [105/252]
>tests/avro/test_mock_client.py::TestMockSchemaRegistryClient::test_dupe_register PASSED [106/252]
>tests/avro/test_mock_client.py::TestMockSchemaRegistryClient::test_getters PASSED [107/252]
>tests/avro/test_mock_client.py::TestMockSchemaRegistryClient::test_multi_register PASSED [108/252]
>tests/avro/test_mock_client.py::TestMockSchemaRegistryClient::test_multi_subject_register PASSED [109/252]
>tests/avro/test_mock_client.py::TestMockSchemaRegistryClient::test_register PASSED [110/252]
>tests/avro/test_util.py::TestUtil::test_schema_from_file PASSED [111/252]
>tests/avro/test_util.py::TestUtil::test_schema_from_string PASSED [112/252]
>tests/avro/test_util.py::TestUtil::test_schema_load_parse_error PASSED [113/252]
>tests/integration/consumer/test_consumer_error.py::test_consume_error ERROR [114/252]
>tests/integration/consumer/test_cooperative_rebalance_1.py::test_cooperative_rebalance_1 ERROR [115/252]
>tests/integration/consumer/test_cooperative_rebalance_2.py::test_cooperative_rebalance_2 ERROR [116/252]
>tests/integration/consumer/test_incremental_assign.py::test_incremental_assign ERROR [117/252]
>tests/integration/producer/test_transactions.py::test_commit_transaction ERROR [118/252]
>tests/integration/producer/test_transactions.py::test_abort_transaction ERROR [119/252]
>tests/integration/producer/test_transactions.py::test_abort_retry_commit_transaction ERROR [120/252]
>tests/integration/producer/test_transactions.py::test_send_offsets_committed_transaction ERROR [121/252]
>tests/integration/schema_registry/test_api_client.py::test_api_register_schema ERROR [122/252]
>tests/integration/schema_registry/test_api_client.py::test_api_register_schema_incompatible ERROR [123/252]
>tests/integration/schema_registry/test_api_client.py::test_api_register_schema_invalid ERROR [124/252]
>tests/integration/schema_registry/test_api_client.py::test_api_get_schema ERROR [125/252]
>tests/integration/schema_registry/test_api_client.py::test_api_get_schema_not_found ERROR [126/252]
>tests/integration/schema_registry/test_api_client.py::test_api_get_registration_subject_not_found ERROR [127/252]
>tests/integration/schema_registry/test_api_client.py::test_api_get_register_schema_invalid ERROR [128/252]
>tests/integration/schema_registry/test_api_client.py::test_api_get_subjects ERROR [129/252]
>tests/integration/schema_registry/test_api_client.py::test_api_get_subject_versions ERROR [130/252]
>tests/integration/schema_registry/test_api_client.py::test_api_delete_subject ERROR [131/252]
>tests/integration/schema_registry/test_api_client.py::test_api_delete_subject_not_found ERROR [132/252]
>tests/integration/schema_registry/test_api_client.py::test_api_get_subject_version ERROR [133/252]
>tests/integration/schema_registry/test_api_client.py::test_api_get_subject_version_no_version ERROR [134/252]
>tests/integration/schema_registry/test_api_client.py::test_api_get_subject_version_invalid ERROR [135/252]
>tests/integration/schema_registry/test_api_client.py::test_api_post_subject_registration ERROR [136/252]
>tests/integration/schema_registry/test_api_client.py::test_api_delete_subject_version ERROR [137/252]
>tests/integration/schema_registry/test_api_client.py::test_api_subject_config_update ERROR [138/252]
>tests/integration/schema_registry/test_api_client.py::test_api_config_invalid ERROR [139/252]
>tests/integration/schema_registry/test_api_client.py::test_api_config_update ERROR [140/252]
>tests/integration/schema_registry/test_api_client.py::test_api_register_logical_schema ERROR [141/252]
>tests/integration/schema_registry/test_avro_serializers.py::test_avro_record_serialization[basic_schema.avsc-data0-record] ERROR [142/252]
>tests/integration/schema_registry/test_avro_serializers.py::test_avro_record_serialization[primitive_string.avsc-J\xe4mtland-string] ERROR [143/252]
>tests/integration/schema_registry/test_avro_serializers.py::test_avro_record_serialization[primitive_bool.avsc-True-bool] ERROR [144/252]
>tests/integration/schema_registry/test_avro_serializers.py::test_avro_record_serialization[primitive_float.avsc-32768.2342-float] ERROR [145/252]
>tests/integration/schema_registry/test_avro_serializers.py::test_avro_record_serialization[primitive_double.avsc-68.032768-float] ERROR [146/252]
>tests/integration/schema_registry/test_avro_serializers.py::test_delivery_report_serialization[basic_schema.avsc-data0-record] ERROR [147/252]
>tests/integration/schema_registry/test_avro_serializers.py::test_delivery_report_serialization[primitive_string.avsc-J\xe4mtland-string] ERROR [148/252]
>tests/integration/schema_registry/test_avro_serializers.py::test_delivery_report_serialization[primitive_bool.avsc-True-bool] ERROR [149/252]
>tests/integration/schema_registry/test_avro_serializers.py::test_delivery_report_serialization[primitive_float.avsc-768.234-float] ERROR [150/252]
>tests/integration/schema_registry/test_avro_serializers.py::test_delivery_report_serialization[primitive_double.avsc-6.868-float] ERROR [151/252]
>tests/integration/schema_registry/test_avro_serializers.py::test_avro_record_serialization_custom ERROR [152/252]
>tests/integration/schema_registry/test_json_serializers.py::test_json_record_serialization ERROR [153/252]
>tests/integration/schema_registry/test_json_serializers.py::test_json_record_serialization_incompatible ERROR [154/252]
>tests/integration/schema_registry/test_json_serializers.py::test_json_record_serialization_no_title ERROR [155/252]
>tests/integration/schema_registry/test_json_serializers.py::test_json_record_serialization_custom ERROR [156/252]
>tests/integration/schema_registry/test_json_serializers.py::test_json_record_deserialization_mismatch ERROR [157/252]
>tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_message_serialization[TestMessage-data0] ERROR [158/252]
>tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_message_serialization[TestMessage-data1] ERROR [159/252]
>tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_message_serialization[NestedMessage-data2] ERROR [160/252]
>tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_reference_registration[TestMessage-expected_refs0] ERROR [161/252]
>tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_reference_registration[NestedMessage-expected_refs1] ERROR [162/252]
>tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_reference_registration[DependencyMessage-expected_refs2] ERROR [163/252]
>tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_reference_registration[ClickCas-expected_refs3] ERROR [164/252]
>tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_serializer_type_mismatch ERROR [165/252]
>tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_deserializer_type_mismatch ERROR [166/252]
>tests/integration/serialization/test_serializers.py::test_numeric_serialization[serializer0-deserializer0-6.21682154508147] ERROR [167/252]
>tests/integration/serialization/test_serializers.py::test_numeric_serialization[serializer1-deserializer1-4124] ERROR [168/252]
>tests/integration/serialization/test_serializers.py::test_numeric_serialization[serializer2-deserializer2-None] ERROR [169/252]
>tests/integration/serialization/test_serializers.py::test_numeric_serialization[serializer3-deserializer3-None] ERROR [170/252]
>tests/integration/serialization/test_serializers.py::test_string_serialization[J\xe4mtland-utf_8] ERROR [171/252]
>tests/integration/serialization/test_serializers.py::test_string_serialization[H\xe4rjedalen-utf_16] ERROR [172/252]
>tests/integration/serialization/test_serializers.py::test_string_serialization[None-utf_32] ERROR [173/252]
>tests/integration/serialization/test_serializers.py::test_mixed_serialization[key_serializer0-value_serializer0-key_deserializer0-value_deserializer0--31.216821545081448-J\xe4mtland] ERROR [174/252]
>tests/integration/serialization/test_serializers.py::test_mixed_serialization[key_serializer1-value_serializer1-key_deserializer1-value_deserializer1-H\xe4rjedalen-1.2168215450814477] ERROR [175/252]
>tests/schema_registry/test_api_client.py::test_basic_auth_unauthorized PASSED [176/252]
>tests/schema_registry/test_api_client.py::test_basic_auth_authorized PASSED [177/252]
>tests/schema_registry/test_api_client.py::test_register_schema PASSED [178/252]
>tests/schema_registry/test_api_client.py::test_register_schema_incompatible PASSED [179/252]
>tests/schema_registry/test_api_client.py::test_register_schema_invalid PASSED [180/252]
>tests/schema_registry/test_api_client.py::test_register_schema_cache PASSED [181/252]
>tests/schema_registry/test_api_client.py::test_get_schema PASSED [182/252]
>tests/schema_registry/test_api_client.py::test_get_schema_not_found PASSED [183/252]
>tests/schema_registry/test_api_client.py::test_get_schema_cache PASSED [184/252]
>tests/schema_registry/test_api_client.py::test_get_registration PASSED [185/252]
>tests/schema_registry/test_api_client.py::test_get_registration_subject_not_found PASSED [186/252]
>tests/schema_registry/test_api_client.py::test_get_registration_schema_not_found PASSED [187/252]
>tests/schema_registry/test_api_client.py::test_get_subjects PASSED [188/252]
>tests/schema_registry/test_api_client.py::test_delete PASSED [189/252]
>tests/schema_registry/test_api_client.py::test_delete_subject_not_found PASSED [190/252]
>tests/schema_registry/test_api_client.py::test_get_version PASSED [191/252]
>tests/schema_registry/test_api_client.py::test_get_version_no_version PASSED [192/252]
>tests/schema_registry/test_api_client.py::test_get_version_invalid PASSED [193/252]
>tests/schema_registry/test_api_client.py::test_get_version_subject_not_found PASSED [194/252]
>tests/schema_registry/test_api_client.py::test_delete_version PASSED [195/252]
>tests/schema_registry/test_api_client.py::test_delete_version_not_found PASSED [196/252]
>tests/schema_registry/test_api_client.py::test_delete_version_subject_not_found PASSED [197/252]
>tests/schema_registry/test_api_client.py::test_delete_version_invalid PASSED [198/252]
>tests/schema_registry/test_api_client.py::test_set_compatibility PASSED [199/252]
>tests/schema_registry/test_api_client.py::test_set_compatibility_invalid PASSED [200/252]
>tests/schema_registry/test_api_client.py::test_get_compatibility_subject_not_found PASSED [201/252]
>tests/schema_registry/test_api_client.py::test_schema_equivilence PASSED [202/252]
>tests/schema_registry/test_api_client.py::test_test_compatibility_no_error[conflict-latest-False] PASSED [203/252]
>tests/schema_registry/test_api_client.py::test_test_compatibility_no_error[conflict-1-False] PASSED [204/252]
>tests/schema_registry/test_api_client.py::test_test_compatibility_no_error[test-key-latest-True] PASSED [205/252]
>tests/schema_registry/test_api_client.py::test_test_compatibility_no_error[test-key-1-True] PASSED [206/252]
>tests/schema_registry/test_api_client.py::test_test_compatibility_with_error[notfound-latest-Subject not found-404-40401] PASSED [207/252]
>tests/schema_registry/test_api_client.py::test_test_compatibility_with_error[invalid-latest-Invalid Schema-422-42201] PASSED [208/252]
>tests/schema_registry/test_api_client.py::test_test_compatibility_with_error[invalid-422-Invalid version-422-42202] PASSED [209/252]
>tests/schema_registry/test_api_client.py::test_test_compatibility_with_error[notfound-404-Version not found-404-40402] PASSED [210/252]
>tests/schema_registry/test_avro_serializer.py::test_avro_serializer_config_auto_register_schemas PASSED [211/252]
>tests/schema_registry/test_avro_serializer.py::test_avro_serializer_config_auto_register_schemas_invalid PASSED [212/252]
>tests/schema_registry/test_avro_serializer.py::test_avro_serializer_config_auto_register_schemas_false PASSED [213/252]
>tests/schema_registry/test_avro_serializer.py::test_avro_serializer_config_use_latest_version PASSED [214/252]
>tests/schema_registry/test_avro_serializer.py::test_avro_serializer_config_subject_name_strategy PASSED [215/252]
>tests/schema_registry/test_avro_serializer.py::test_avro_serializer_config_subject_name_strategy_invalid PASSED [216/252]
>tests/schema_registry/test_avro_serializer.py::test_avro_serializer_record_subject_name_strategy PASSED [217/252]
>tests/schema_registry/test_avro_serializer.py::test_avro_serializer_record_subject_name_strategy_primitive PASSED [218/252]
>tests/schema_registry/test_avro_serializer.py::test_avro_serializer_topic_record_subject_name_strategy PASSED [219/252]
>tests/schema_registry/test_avro_serializer.py::test_avro_serializer_topic_record_subject_name_strategy_primitive PASSED [220/252]
>tests/schema_registry/test_avro_serializer.py::test_avro_serializer_subject_name_strategy_default PASSED [221/252]
>tests/schema_registry/test_config.py::test_config_url_invalid PASSED [222/252]
>tests/schema_registry/test_config.py::test_config_url_invalid_type PASSED [223/252]
>tests/schema_registry/test_config.py::test_config_url_None PASSED [224/252]
>tests/schema_registry/test_config.py::test_config_url_trailing_slash PASSED [225/252]
>tests/schema_registry/test_config.py::test_config_ssl_certificate PASSED [226/252]
>tests/schema_registry/test_config.py::test_config_ssl_certificate_no_key PASSED [227/252]
>tests/schema_registry/test_config.py::test_config_ssl_key_no_certificate PASSED [228/252]
>tests/schema_registry/test_config.py::test_config_auth_url PASSED [229/252]
>tests/schema_registry/test_config.py::test_config_auth_url_and_userinfo PASSED [230/252]
>tests/schema_registry/test_config.py::test_config_auth_userinfo PASSED [231/252]
>tests/schema_registry/test_config.py::test_config_auth_userinfo_invalid PASSED [232/252]
>tests/schema_registry/test_config.py::test_config_unknown_prop PASSED [233/252]
>tests/schema_registry/test_proto.py::test_create_index[DependencyMessage-coordinates0] PASSED [234/252]
>tests/schema_registry/test_proto.py::test_create_index[Watermark-coordinates1] PASSED [235/252]
>tests/schema_registry/test_proto.py::test_create_index[KacohaConfig-coordinates2] PASSED [236/252]
>tests/schema_registry/test_proto.py::test_index_serialization[True-DependencyMessage] PASSED [237/252]
>tests/schema_registry/test_proto.py::test_index_serialization[True-Watermark] PASSED [238/252]
>tests/schema_registry/test_proto.py::test_index_serialization[True-KacohaConfig] PASSED [239/252]
>tests/schema_registry/test_proto.py::test_index_serialization[False-DependencyMessage] PASSED [240/252]
>tests/schema_registry/test_proto.py::test_index_serialization[False-Watermark] PASSED [241/252]
>tests/schema_registry/test_proto.py::test_index_serialization[False-KacohaConfig] PASSED [242/252]
>tests/schema_registry/test_proto.py::test_index_encoder[msg_idx0-True-00] PASSED [243/252]
>tests/schema_registry/test_proto.py::test_index_encoder[msg_idx1-False-00] PASSED [244/252]
>tests/schema_registry/test_proto.py::test_index_encoder[msg_idx2-True-0202] PASSED [245/252]
>tests/schema_registry/test_proto.py::test_index_encoder[msg_idx3-False-0101] PASSED [246/252]
>tests/schema_registry/test_proto.py::test_index_encoder[msg_idx4-True-06fe011012] PASSED [247/252]
>tests/schema_registry/test_proto.py::test_index_encoder[msg_idx5-False-037f0809] PASSED [248/252]
>tests/schema_registry/test_proto.py::test_index_encoder[msg_idx6-True-028002] PASSED [249/252]
>tests/schema_registry/test_proto.py::test_index_encoder[msg_idx7-False-018001] PASSED [250/252]
>tests/schema_registry/test_proto.py::test_index_encoder[msg_idx8-True-02feffffffffffffffff01] PASSED [251/252]
>tests/schema_registry/test_proto.py::test_index_encoder[msg_idx9-False-01ffffffffffffffff7f] PASSED [252/252]
>
>==================================== ERRORS ====================================
>_____________________ ERROR at setup of test_consume_error _____________________
>
>    @pytest.fixture(scope="package")
>    def kafka_cluster():
>        """
>        If BROKERS environment variable is set to a CSV list of bootstrap servers
>        an existing cluster is used.
>        Additionally, if SR_URL environment variable is set the Schema-Registry
>        client will use the given URL.
>
>        If BROKERS is not set a TrivUp cluster is created and used.
>        """
>
>        bootstraps = os.environ.get("BROKERS", "")
>        if bootstraps != "":
>            conf = {"bootstrap.servers": bootstraps}
>            sr_url = os.environ.get("SR_URL", "")
>            if sr_url != "":
>                conf["schema.registry.url"] = sr_url
>            print("Using ByoFixture with config from env variables: ", conf)
>            cluster = create_byo_cluster(conf)
>        else:
>>          cluster = create_trivup_cluster()
>
>bootstraps = ''
>
>tests/integration/conftest.py:64:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>tests/integration/conftest.py:30: in create_trivup_cluster
>    return TrivupFixture({'with_sr': True,
>tests/integration/cluster_fixture.py:243: in __init__
>    self._cluster = KafkaCluster(**conf)
>        conf = {'broker_conf': ['transaction.state.log.replication.factor=1',
>                'transaction.state.log.min.isr=1'],
>                'cp_version': 'latest',
>                'debug': True,
>                'with_sr': True}
>        self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0>
>/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__
>    self.cluster.deploy()
>        __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'>
>        all_listeners = ['PLAINTEXT://localhost:35021',
>                'DOCKER://localhost:49935',
>                'PLAINTEXT://localhost:45741',
>                'DOCKER://localhost:37773',
>                'PLAINTEXT://localhost:53169',
>                'DOCKER://localhost:33513']
>        bconf = {'conf': ['broker.rack=RACK${appid}',
>                'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'],
>                'kafka_path': None,
>                'listener_host': 'localhost',
>                'num_partitions': 4,
>                'replication_factor': 3,
>                'sasl_mechanisms': '',
>                'sasl_users': 'testuser=testpass',
>                'version': '2.8.0'}
>        broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0>
>        broker_cnt = 3
>        conf = {'broker_conf': ['transaction.state.log.replication.factor=1',
>                'transaction.state.log.min.isr=1'],
>                'cp_version': 'latest',
>                'debug': True,
>                'with_sr': True}
>        kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1',
>                'transaction.state.log.min.isr=1'],
>                'cp_version': 'latest',
>                'debug': True,
>                'with_sr': True}
>        n = 2
>        self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640>
>/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy
>    app.deploy()
>        app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0>
>        self = <trivup.trivup.Cluster object at 0x7feb9250c7c0>
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0>
>
>    def deploy(self):
>        destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__),
>                               'kafka', self.get('version'))
>        self.dbg('Deploy %s version %s on %s to %s' %
>                 (self.name, self.get('version'), self.node.name, destdir))
>        deploy_exec = self.resource_path('deploy.sh')
>        if not os.path.exists(deploy_exec):
>            raise NotImplementedError('Kafka deploy.sh script missing in %s' %
>                                      deploy_exec)
>        t_start = time.time()
>        cmd = '%s %s "%s" "%s"' % \
>            (deploy_exec, self.get('version'),
>             self.get('kafka_path', destdir), destdir)
>        self.dbg('Deploy command: {}'.format(cmd))
>        r = os.system(cmd)
>        if r != 0:
>>          raise Exception('Deploy "%s" returned exit code %d' % (cmd, r))
>E          Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256
>
>cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 '
>       '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" '
>       '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"')
>deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh'
>destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0'
>r = 256
>self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0>
>t_start = 1654124267.8894262
>
>/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception
>---------------------------- Captured stdout setup -----------------------------
>[2022-06-02 00:57:47.884139] ZookeeperApp-1: Creating ZookeeperApp instance
>[2022-06-02 00:57:47.885162] KafkaBrokerApp-2: Creating KafkaBrokerApp instance
>[2022-06-02 00:57:47.885484] KafkaBrokerApp-2: Listeners: PLAINTEXT://0.0.0.0:35021,DOCKER://0.0.0.0:49935
>[2022-06-02 00:57:47.885513] KafkaBrokerApp-2: Advertised Listeners: PLAINTEXT://localhost:35021,DOCKER://localhost:49935
>[2022-06-02 00:57:47.886483] KafkaBrokerApp-3: Creating KafkaBrokerApp instance
>[2022-06-02 00:57:47.886793] KafkaBrokerApp-3: Listeners: PLAINTEXT://0.0.0.0:45741,DOCKER://0.0.0.0:37773
>[2022-06-02 00:57:47.886823] KafkaBrokerApp-3: Advertised Listeners: PLAINTEXT://localhost:45741,DOCKER://localhost:37773
>[2022-06-02 00:57:47.887754] KafkaBrokerApp-4: Creating KafkaBrokerApp instance
>[2022-06-02 00:57:47.888062] KafkaBrokerApp-4: Listeners: PLAINTEXT://0.0.0.0:53169,DOCKER://0.0.0.0:33513
>[2022-06-02 00:57:47.888093] KafkaBrokerApp-4: Advertised Listeners: PLAINTEXT://localhost:53169,DOCKER://localhost:33513
>[2022-06-02 00:57:47.889096] SchemaRegistryApp-5: Creating SchemaRegistryApp instance
>[2022-06-02 00:57:47.889268] KafkaBrokerApp-2: Deploy KafkaBrokerApp version 2.8.0 on localhost to /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0
>[2022-06-02 00:57:47.889435] KafkaBrokerApp-2: Deploy command: /usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"
>### /usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh: Downloading 2.8.0 from https://archive.apache.org/dist/kafka/2.8.0/kafka_2.12-2.8.0.tgz
>### /usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh: Downloading 2.8.0 from https://archive.apache.org/dist/kafka/2.8.0/kafka_2.11-2.8.0.tgz
># Download of 2.8.0 failed
>---------------------------- Captured stderr setup -----------------------------
>
>gzip: stdin: unexpected end of file
>tar: Child returned status 1
>tar: Error is not recoverable: exiting now
>
>gzip: stdin: unexpected end of file
>tar: Child returned status 1
>tar: Error is not recoverable: exiting now
>________________ ERROR at setup of test_cooperative_rebalance_1 ________________
>
>    @pytest.fixture(scope="package")
>    def kafka_cluster():
>        """
>        If BROKERS environment variable is set to a CSV list of bootstrap servers
>        an existing cluster is used.
>        Additionally, if SR_URL environment variable is set the Schema-Registry
>        client will use the given URL.
>
>        If BROKERS is not set a TrivUp cluster is created and used.
>        """
>
>        bootstraps = os.environ.get("BROKERS", "")
>        if bootstraps != "":
>            conf = {"bootstrap.servers": bootstraps}
>            sr_url = os.environ.get("SR_URL", "")
>            if sr_url != "":
>                conf["schema.registry.url"] = sr_url
>            print("Using ByoFixture with config from env variables: ", conf)
>            cluster = create_byo_cluster(conf)
>        else:
>>          cluster = create_trivup_cluster()
>
>bootstraps = ''
>
>tests/integration/conftest.py:64:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>tests/integration/conftest.py:30: in create_trivup_cluster
>    return TrivupFixture({'with_sr': True,
>tests/integration/cluster_fixture.py:243: in __init__
>    self._cluster = KafkaCluster(**conf)
>        conf = {'broker_conf': ['transaction.state.log.replication.factor=1',
>                'transaction.state.log.min.isr=1'],
>                'cp_version': 'latest',
>                'debug': True,
>                'with_sr': True}
>        self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0>
>/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__
>    self.cluster.deploy()
>        __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'>
>        all_listeners = ['PLAINTEXT://localhost:35021',
>                'DOCKER://localhost:49935',
>                'PLAINTEXT://localhost:45741',
>                'DOCKER://localhost:37773',
>                'PLAINTEXT://localhost:53169',
>                'DOCKER://localhost:33513']
>        bconf = {'conf': ['broker.rack=RACK${appid}',
>                'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'],
>                'kafka_path': None,
>                'listener_host': 'localhost',
>                'num_partitions': 4,
>                'replication_factor': 3,
>                'sasl_mechanisms': '',
>                'sasl_users': 'testuser=testpass',
>                'version': '2.8.0'}
>        broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0>
>        broker_cnt = 3
>        conf = {'broker_conf': ['transaction.state.log.replication.factor=1',
>                'transaction.state.log.min.isr=1'],
>                'cp_version': 'latest',
>                'debug': True,
>                'with_sr': True}
>        kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1',
>                'transaction.state.log.min.isr=1'],
>                'cp_version': 'latest',
>                'debug': True,
>                'with_sr': True}
>        n = 2
>        self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640>
>/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy
>    app.deploy()
>        app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0>
>        self = <trivup.trivup.Cluster object at 0x7feb9250c7c0>
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
>self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0>
>
>    def deploy(self):
>        destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__),
>                               'kafka', self.get('version'))
>        self.dbg('Deploy %s version %s on %s to %s' %
>                 (self.name, self.get('version'), self.node.name, destdir))
>        deploy_exec = self.resource_path('deploy.sh')
>        if not os.path.exists(deploy_exec):
>            raise NotImplementedError('Kafka deploy.sh script missing in %s' %
>                                      deploy_exec)
>        t_start = time.time()
>        cmd = '%s %s "%s" "%s"' % \
>            (deploy_exec, self.get('version'),
>             self.get('kafka_path', destdir), destdir)
>        self.dbg('Deploy command: {}'.format(cmd))
>        r = os.system(cmd)
>        if r != 0:
>>          raise Exception('Deploy "%s" returned exit code %d' % (cmd, r))
>E          Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256
>
>cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 '
>       '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" '
>       '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"')
>deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh'
>destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0'
>r = 256
>self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0>
>t_start = 1654124267.8894262
>
>/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception
>________________ ERROR at setup of test_cooperative_rebalance_2 ________________
>
>    @pytest.fixture(scope="package")
>    def kafka_cluster():
>        """
>        If BROKERS environment variable is set to a CSV list of bootstrap servers
>        an existing cluster is used.
>        Additionally, if SR_URL environment variable is set the Schema-Registry
>        client will use the given URL.
>
>        If BROKERS is not set a TrivUp cluster is created and used.
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
> '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"')
>deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh'
>destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0'
>r = 256
>self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0>
>t_start = 1654124267.8894262
>
>/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception
>__________________ ERROR at setup of test_incremental_assign ___________________
>__________________ ERROR at setup of test_commit_transaction ___________________
>___________________ ERROR at setup of test_abort_transaction ___________________
>____________ ERROR at setup of test_abort_retry_commit_transaction _____________
>__________ ERROR at setup of test_send_offsets_committed_transaction ___________
>__________________ ERROR at setup of test_api_register_schema __________________
>
> [each of the errors above failed in the same kafka_cluster fixture setup with a traceback identical to the first one: Deploy "deploy.sh 2.8.0 ..." returned exit code 256]
>
>___________ ERROR at setup of test_api_register_schema_incompatible ____________
>
> @pytest.fixture(scope="package")
> def kafka_cluster():
> """
> If BROKERS environment variable is set to a CSV list of bootstrap servers
> an existing cluster is used.
> Additionally, if SR_URL environment variable is set the Schema-Registry
> client will use the given URL.
>
> If BROKERS is not set a TrivUp cluster is created and used.
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"')
>deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh'
>destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0'
>r = 256
>self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0>
>t_start = 1654124267.8894262
>
>/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception
>______________ ERROR at setup of test_api_register_schema_invalid ______________
>____________________ ERROR at setup of test_api_get_schema _____________________
>_______________ ERROR at setup of test_api_get_schema_not_found ________________
>________ ERROR at setup of test_api_get_registration_subject_not_found _________
>____________ ERROR at setup of test_api_get_register_schema_invalid ____________
>___________________ ERROR at setup of test_api_get_subjects ____________________
>_______________ ERROR at setup of test_api_get_subject_versions ________________
>
>(Each of the setup errors above repeats the identical kafka_cluster fixture traceback shown for test_api_register_schema_incompatible: trivup's deploy.sh for Kafka 2.8.0 returned exit code 256.)
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >__________________ ERROR at setup of test_api_delete_subject ___________________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_____________ ERROR at setup of test_api_delete_subject_not_found ______________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >________________ ERROR at setup of test_api_get_subject_version ________________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >__________ ERROR at setup of test_api_get_subject_version_no_version ___________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >____________ ERROR at setup of test_api_get_subject_version_invalid ____________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_____________ ERROR at setup of test_api_post_subject_registration _____________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"')
>deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh'
>destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0'
>r = 256
>self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0>
>t_start = 1654124267.8894262
>
>/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception
>______________ ERROR at setup of test_api_delete_subject_version _______________
>_______________ ERROR at setup of test_api_subject_config_update _______________
>__________________ ERROR at setup of test_api_config_invalid ___________________
>___________________ ERROR at setup of test_api_config_update ___________________
>______________ ERROR at setup of test_api_register_logical_schema ______________
>_ ERROR at setup of test_avro_record_serialization[basic_schema.avsc-data0-record] _
>_ ERROR at setup of test_avro_record_serialization[primitive_string.avsc-J\xe4mtland-string] _
>
> @pytest.fixture(scope="package")
> def kafka_cluster():
> """
> If BROKERS environment variable is set to a CSV list of bootstrap servers
> an existing cluster is used.
> Additionally, if SR_URL environment variable is set the Schema-Registry
> client will use the given URL.
>
> If BROKERS is not set a TrivUp cluster is created and used.
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_ ERROR at setup of test_avro_record_serialization[primitive_bool.avsc-True-bool] _ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_ ERROR at setup of test_avro_record_serialization[primitive_float.avsc-32768.2342-float] _ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_ ERROR at setup of test_avro_record_serialization[primitive_double.avsc-68.032768-float] _ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_ ERROR at setup of test_delivery_report_serialization[basic_schema.avsc-data0-record] _ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_ ERROR at setup of test_delivery_report_serialization[primitive_string.avsc-J\xe4mtland-string] _ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_ ERROR at setup of test_delivery_report_serialization[primitive_bool.avsc-True-bool] _ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_ ERROR at setup of test_delivery_report_serialization[primitive_float.avsc-768.234-float] _ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_ ERROR at setup of test_delivery_report_serialization[primitive_double.avsc-6.868-float] _ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >___________ ERROR at setup of test_avro_record_serialization_custom ____________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_______________ ERROR at setup of test_json_record_serialization _______________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >________ ERROR at setup of test_json_record_serialization_incompatible _________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >__________ ERROR at setup of test_json_record_serialization_no_title ___________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >___________ ERROR at setup of test_json_record_serialization_custom ____________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"')
>deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh'
>destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0'
>r = 256
>self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0>
>t_start = 1654124267.8894262
>
>/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception
>_________ ERROR at setup of test_json_record_deserialization_mismatch __________
>(kafka_cluster fixture setup failed; traceback identical to the one above)
>___ ERROR at setup of test_protobuf_message_serialization[TestMessage-data0] ___
>(kafka_cluster fixture setup failed; traceback identical to the one above)
>___ ERROR at setup of test_protobuf_message_serialization[TestMessage-data1] ___
>(kafka_cluster fixture setup failed; traceback identical to the one above)
>__ ERROR at setup of test_protobuf_message_serialization[NestedMessage-data2] __
>(kafka_cluster fixture setup failed; traceback identical to the one above)
>_ ERROR at setup of test_protobuf_reference_registration[TestMessage-expected_refs0] _
>(kafka_cluster fixture setup failed; traceback identical to the one above)
>_ ERROR at setup of test_protobuf_reference_registration[NestedMessage-expected_refs1] _
>(kafka_cluster fixture setup failed; traceback identical to the one above)
>_ ERROR at setup of test_protobuf_reference_registration[DependencyMessage-expected_refs2] _
>
> @pytest.fixture(scope="package")
> def kafka_cluster():
> """
> If BROKERS environment variable is set to a CSV list of bootstrap servers
> an existing cluster is used.
> Additionally, if SR_URL environment variable is set the Schema-Registry
> client will use the given URL.
>
> If BROKERS is not set a TrivUp cluster is created and used.
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_ ERROR at setup of test_protobuf_reference_registration[ClickCas-expected_refs3] _ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >___________ ERROR at setup of test_protobuf_serializer_type_mismatch ___________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >__________ ERROR at setup of test_protobuf_deserializer_type_mismatch __________ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_ ERROR at setup of test_numeric_serialization[serializer0-deserializer0-6.21682154508147] _ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_ ERROR at setup of test_numeric_serialization[serializer1-deserializer1-4124] _ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"') >deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh' >destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0' >r = 256 >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> >t_start = 1654124267.8894262 > >/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception >_ ERROR at setup of test_numeric_serialization[serializer2-deserializer2-None] _ > > @pytest.fixture(scope="package") > def kafka_cluster(): > """ > If BROKERS environment variable is set to a CSV list of bootstrap servers > an existing cluster is used. > Additionally, if SR_URL environment variable is set the Schema-Registry > client will use the given URL. > > If BROKERS is not set a TrivUp cluster is created and used. 
> """ > > bootstraps = os.environ.get("BROKERS", "") > if bootstraps != "": > conf = {"bootstrap.servers": bootstraps} > sr_url = os.environ.get("SR_URL", "") > if sr_url != "": > conf["schema.registry.url"] = sr_url > print("Using ByoFixture with config from env variables: ", conf) > cluster = create_byo_cluster(conf) > else: >> cluster = create_trivup_cluster() > >bootstraps = '' > >tests/integration/conftest.py:64: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >tests/integration/conftest.py:30: in create_trivup_cluster > return TrivupFixture({'with_sr': True, >tests/integration/cluster_fixture.py:243: in __init__ > self._cluster = KafkaCluster(**conf) > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > self = <tests.integration.cluster_fixture.TrivupFixture object at 0x7feb93701db0> >/usr/lib/python3.8/site-packages/trivup/clusters/KafkaCluster.py:204: in __init__ > self.cluster.deploy() > __class__ = <class 'trivup.clusters.KafkaCluster.KafkaCluster'> > all_listeners = ['PLAINTEXT://localhost:35021', > 'DOCKER://localhost:49935', > 'PLAINTEXT://localhost:45741', > 'DOCKER://localhost:37773', > 'PLAINTEXT://localhost:53169', > 'DOCKER://localhost:33513'] > bconf = {'conf': ['broker.rack=RACK${appid}', > 'replica.selector.class=org.apache.kafka.common.replica.RackAwareReplicaSelector'], > 'kafka_path': None, > 'listener_host': 'localhost', > 'num_partitions': 4, > 'replication_factor': 3, > 'sasl_mechanisms': '', > 'sasl_users': 'testuser=testpass', > 'version': '2.8.0'} > broker = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c9a0> > broker_cnt = 3 > conf = {'broker_conf': ['transaction.state.log.replication.factor=1', > 'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > kwargs = {'broker_conf': ['transaction.state.log.replication.factor=1', > 
'transaction.state.log.min.isr=1'], > 'cp_version': 'latest', > 'debug': True, > 'with_sr': True} > n = 2 > self = <trivup.clusters.KafkaCluster.KafkaCluster object at 0x7feb9250c640> >/usr/lib/python3.8/site-packages/trivup/trivup.py:131: in deploy > app.deploy() > app = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > self = <trivup.trivup.Cluster object at 0x7feb9250c7c0> >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ > >self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0> > > def deploy(self): > destdir = os.path.join(self.cluster.mkpath(self.__class__.__name__), > 'kafka', self.get('version')) > self.dbg('Deploy %s version %s on %s to %s' % > (self.name, self.get('version'), self.node.name, destdir)) > deploy_exec = self.resource_path('deploy.sh') > if not os.path.exists(deploy_exec): > raise NotImplementedError('Kafka deploy.sh script missing in %s' % > deploy_exec) > t_start = time.time() > cmd = '%s %s "%s" "%s"' % \ > (deploy_exec, self.get('version'), > self.get('kafka_path', destdir), destdir) > self.dbg('Deploy command: {}'.format(cmd)) > r = os.system(cmd) > if r != 0: >> raise Exception('Deploy "%s" returned exit code %d' % (cmd, r)) >E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256 > >cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 ' > '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" ' > 
'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"')
>deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh'
>destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0'
>r = 256
>self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0>
>t_start = 1654124267.8894262
>
>/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception
>_ ERROR at setup of test_numeric_serialization[serializer3-deserializer3-None] _
>________ ERROR at setup of test_string_serialization[J\xe4mtland-utf_8] ________
>______ ERROR at setup of test_string_serialization[H\xe4rjedalen-utf_16] _______
>___________ ERROR at setup of test_string_serialization[None-utf_32] ___________
>_ ERROR at setup of test_mixed_serialization[key_serializer0-value_serializer0-key_deserializer0-value_deserializer0--31.216821545081448-J\xe4mtland] _
>_ ERROR at setup of test_mixed_serialization[key_serializer1-value_serializer1-key_deserializer1-value_deserializer1-H\xe4rjedalen-1.2168215450814477] _
>E Exception: Deploy "/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" "/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"" returned exit code 256
>
>cmd = ('/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh 2.8.0 '
> '"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0" '
> 
>'"/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0"')
>deploy_exec = '/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp/deploy.sh'
>destdir = '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tmp-KafkaCluster/KafkaCluster/KafkaBrokerApp/kafka/2.8.0'
>r = 256
>self = <trivup.apps.KafkaBrokerApp.KafkaBrokerApp object at 0x7feb9250c5b0>
>t_start = 1654124267.8894262
>
>/usr/lib/python3.8/site-packages/trivup/apps/KafkaBrokerApp.py:440: Exception
>=============================== warnings summary ===============================
>tests/avro/test_cached_client.py: 22 warnings
> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tests/avro/test_cached_client.py:35: DeprecationWarning: CachedSchemaRegistry constructor is being deprecated. Use CachedSchemaRegistryClient(dict: config) instead. Existing params ca_location, cert_location and key_location will be replaced with their librdkafka equivalents as keys in the conf dict: `ssl.ca.location`, `ssl.certificate.location` and `ssl.key.location` respectively
> self.client = CachedSchemaRegistryClient('http://127.0.0.1:' + str(self.server.server.server_port))
>
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_cert_no_key
> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tests/avro/test_cached_client.py:143: DeprecationWarning: CachedSchemaRegistry constructor is being deprecated. Use CachedSchemaRegistryClient(dict: config) instead. Existing params ca_location, cert_location and key_location will be replaced with their librdkafka equivalents as keys in the conf dict: `ssl.ca.location`, `ssl.certificate.location` and `ssl.key.location` respectively
> self.client = CachedSchemaRegistryClient(url='https://127.0.0.1:65534',
>
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_cert_path
> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tests/avro/test_cached_client.py:153: DeprecationWarning: CachedSchemaRegistry constructor is being deprecated. Use CachedSchemaRegistryClient(dict: config) instead. Existing params ca_location, cert_location and key_location will be replaced with their librdkafka equivalents as keys in the conf dict: `ssl.ca.location`, `ssl.certificate.location` and `ssl.key.location` respectively
> self.client = CachedSchemaRegistryClient(url='https://127.0.0.1:65534',
>
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_cert_with_key
> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tests/avro/test_cached_client.py:147: DeprecationWarning: CachedSchemaRegistry constructor is being deprecated. Use CachedSchemaRegistryClient(dict: config) instead. Existing params ca_location, cert_location and key_location will be replaced with their librdkafka equivalents as keys in the conf dict: `ssl.ca.location`, `ssl.certificate.location` and `ssl.key.location` respectively
> self.client = CachedSchemaRegistryClient(url='https://127.0.0.1:65534',
>
>tests/avro/test_cached_client.py::TestCacheSchemaRegistryClient::test_invalid_type_url
> /var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2/tests/avro/test_cached_client.py:180: DeprecationWarning: CachedSchemaRegistry constructor is being deprecated. Use CachedSchemaRegistryClient(dict: config) instead. Existing params ca_location, cert_location and key_location will be replaced with their librdkafka equivalents as keys in the conf dict: `ssl.ca.location`, `ssl.certificate.location` and `ssl.key.location` respectively
> self.client = CachedSchemaRegistryClient(
>
>-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
>=========================== short test summary info ============================
>SKIPPED [1] tests/test_Producer.py:114: Old versions should fail when using headers
>SKIPPED [3] tests/test_misc.py:174: requires confluent-librdkafka-plugins be installed and copied to the current directory
>ERROR tests/integration/consumer/test_consumer_error.py::test_consume_error
>ERROR tests/integration/consumer/test_cooperative_rebalance_1.py::test_cooperative_rebalance_1
>ERROR tests/integration/consumer/test_cooperative_rebalance_2.py::test_cooperative_rebalance_2
>ERROR tests/integration/consumer/test_incremental_assign.py::test_incremental_assign
>ERROR tests/integration/producer/test_transactions.py::test_commit_transaction
>ERROR tests/integration/producer/test_transactions.py::test_abort_transaction
>ERROR tests/integration/producer/test_transactions.py::test_abort_retry_commit_transaction
>ERROR tests/integration/producer/test_transactions.py::test_send_offsets_committed_transaction
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_register_schema
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_register_schema_incompatible
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_register_schema_invalid
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_get_schema
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_get_schema_not_found
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_get_registration_subject_not_found
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_get_register_schema_invalid
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_get_subjects
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_get_subject_versions
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_delete_subject
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_delete_subject_not_found
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_get_subject_version
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_get_subject_version_no_version
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_get_subject_version_invalid
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_post_subject_registration
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_delete_subject_version
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_subject_config_update
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_config_invalid
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_config_update
>ERROR tests/integration/schema_registry/test_api_client.py::test_api_register_logical_schema
>ERROR tests/integration/schema_registry/test_avro_serializers.py::test_avro_record_serialization[basic_schema.avsc-data0-record]
>ERROR tests/integration/schema_registry/test_avro_serializers.py::test_avro_record_serialization[primitive_string.avsc-J\xe4mtland-string]
>ERROR tests/integration/schema_registry/test_avro_serializers.py::test_avro_record_serialization[primitive_bool.avsc-True-bool]
>ERROR tests/integration/schema_registry/test_avro_serializers.py::test_avro_record_serialization[primitive_float.avsc-32768.2342-float]
>ERROR tests/integration/schema_registry/test_avro_serializers.py::test_avro_record_serialization[primitive_double.avsc-68.032768-float]
>ERROR tests/integration/schema_registry/test_avro_serializers.py::test_delivery_report_serialization[basic_schema.avsc-data0-record]
>ERROR tests/integration/schema_registry/test_avro_serializers.py::test_delivery_report_serialization[primitive_string.avsc-J\xe4mtland-string]
>ERROR tests/integration/schema_registry/test_avro_serializers.py::test_delivery_report_serialization[primitive_bool.avsc-True-bool]
>ERROR tests/integration/schema_registry/test_avro_serializers.py::test_delivery_report_serialization[primitive_float.avsc-768.234-float]
>ERROR tests/integration/schema_registry/test_avro_serializers.py::test_delivery_report_serialization[primitive_double.avsc-6.868-float]
>ERROR tests/integration/schema_registry/test_avro_serializers.py::test_avro_record_serialization_custom
>ERROR tests/integration/schema_registry/test_json_serializers.py::test_json_record_serialization
>ERROR tests/integration/schema_registry/test_json_serializers.py::test_json_record_serialization_incompatible
>ERROR tests/integration/schema_registry/test_json_serializers.py::test_json_record_serialization_no_title
>ERROR tests/integration/schema_registry/test_json_serializers.py::test_json_record_serialization_custom
>ERROR tests/integration/schema_registry/test_json_serializers.py::test_json_record_deserialization_mismatch
>ERROR tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_message_serialization[TestMessage-data0]
>ERROR tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_message_serialization[TestMessage-data1]
>ERROR tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_message_serialization[NestedMessage-data2]
>ERROR tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_reference_registration[TestMessage-expected_refs0]
>ERROR tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_reference_registration[NestedMessage-expected_refs1]
>ERROR tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_reference_registration[DependencyMessage-expected_refs2]
>ERROR tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_reference_registration[ClickCas-expected_refs3]
>ERROR tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_serializer_type_mismatch
>ERROR tests/integration/schema_registry/test_proto_serializers.py::test_protobuf_deserializer_type_mismatch
>ERROR tests/integration/serialization/test_serializers.py::test_numeric_serialization[serializer0-deserializer0-6.21682154508147]
>ERROR tests/integration/serialization/test_serializers.py::test_numeric_serialization[serializer1-deserializer1-4124]
>ERROR tests/integration/serialization/test_serializers.py::test_numeric_serialization[serializer2-deserializer2-None]
>ERROR tests/integration/serialization/test_serializers.py::test_numeric_serialization[serializer3-deserializer3-None]
>ERROR tests/integration/serialization/test_serializers.py::test_string_serialization[J\xe4mtland-utf_8]
>ERROR tests/integration/serialization/test_serializers.py::test_string_serialization[H\xe4rjedalen-utf_16]
>ERROR tests/integration/serialization/test_serializers.py::test_string_serialization[None-utf_32]
>ERROR tests/integration/serialization/test_serializers.py::test_mixed_serialization[key_serializer0-value_serializer0-key_deserializer0-value_deserializer0--31.216821545081448-J\xe4mtland]
>ERROR tests/integration/serialization/test_serializers.py::test_mixed_serialization[key_serializer1-value_serializer1-key_deserializer1-value_deserializer1-H\xe4rjedalen-1.2168215450814477]
>=========== 186 passed, 4 skipped, 26 warnings, 62 errors in 38.69s ============
> * ERROR: dev-python/confluent-kafka-1.8.2-r1::guru failed (test phase):
> * pytest failed with python3.8
> *
> * Call stack:
> * ebuild.sh, line 127: Called src_test
> * environment, line 3412: Called distutils-r1_src_test
> * environment, line 1627: Called _distutils-r1_run_foreach_impl 'python_test'
> * environment, line 692: Called python_foreach_impl 'distutils-r1_run_phase' 'python_test'
> * environment, line 3085: Called multibuild_foreach_variant '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test'
> * environment, line 2574: Called _multibuild_run '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test'
> * environment, line 2572: Called _python_multibuild_wrapper 'distutils-r1_run_phase' 'python_test'
> * environment, line 1028: Called distutils-r1_run_phase 'python_test'
> * environment, line 1553: Called python_test
> * environment, line 3379: Called distutils-r1_python_test
> * environment, line 1504: Called epytest
> * environment, line 2085: Called die
> * The specific snippet of code:
> * "${@}" || die -n "pytest failed with ${EPYTHON}";
> *
> * If you need support, post the output of `emerge --info '=dev-python/confluent-kafka-1.8.2-r1::guru'`,
> * the complete build log and the output of `emerge -pqv '=dev-python/confluent-kafka-1.8.2-r1::guru'`.
> * The complete build log is located at '/var/log/emerge-log/build/dev-python/confluent-kafka-1.8.2-r1:20220601-225650.log'.
> * For convenience, a symlink to the build log is located at '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/temp/build.log'.
> * The ebuild environment file is located at '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/temp/environment'.
> * Working directory: '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2'
> * S: '/var/tmp/portage/dev-python/confluent-kafka-1.8.2-r1/work/confluent-kafka-python-1.8.2'
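The DeprecationWarning repeated in the warnings summary above describes a migration path: the keyword arguments `ca_location`, `cert_location` and `key_location` are replaced by librdkafka-style keys in a single conf dict. A minimal sketch of that mapping, using only the key names quoted verbatim in the warning (`url`, `ssl.ca.location`, `ssl.certificate.location`, `ssl.key.location`); the helper name `legacy_to_conf` and the file paths are hypothetical, and the actual client call is left commented out since it requires a running Schema Registry:

```python
def legacy_to_conf(url, ca_location=None, cert_location=None, key_location=None):
    """Translate the deprecated keyword arguments into the conf dict
    that the warning says the new-style constructor expects."""
    conf = {'url': url}
    if ca_location is not None:
        conf['ssl.ca.location'] = ca_location
    if cert_location is not None:
        conf['ssl.certificate.location'] = cert_location
    if key_location is not None:
        conf['ssl.key.location'] = key_location
    return conf

# Same endpoint the tests use; the certificate paths are placeholders.
conf = legacy_to_conf('https://127.0.0.1:65534',
                      ca_location='/path/to/ca.pem',
                      cert_location='/path/to/cert.pem',
                      key_location='/path/to/key.pem')

# Hypothetical usage per the warning text ("Use CachedSchemaRegistryClient(dict: config)"):
# from confluent_kafka.avro import CachedSchemaRegistryClient
# client = CachedSchemaRegistryClient(conf)
```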